8 points

I use ChatGPT for any topic I’m curious about, and about half the time, when I double-check the answers, they turn out to be wrong.

For example, I asked for a list of phones with screens that don’t use PWM, and when I looked up the specs of the phones it recommended, it turned out they all had PWM, even though ChatGPT’s answer explicitly stated that each of these phones doesn’t use it. Why does it straight-up lie?!

8 points

It’s not lying. It has no concept of context or truth. It’s autocomplete on steroids.
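
To make that concrete, here’s a minimal toy sketch (a made-up bigram model over a three-sentence corpus, nowhere near a real LLM, which uses a neural network trained on billions of tokens, but the principle is the same): it continues a prompt by picking the statistically most likely next word, and nothing in the loop ever checks whether the result is true.

```python
from collections import Counter, defaultdict

# Made-up toy "training" corpus (assumption: three short sentences).
corpus = (
    "this phone does not use pwm . "
    "this phone does use pwm . "
    "this phone does not use pwm ."
).split()

# Count which word follows which (a bigram table).
nexts = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    nexts[a][b] += 1

def complete(prompt, n=5):
    """Greedy autocomplete: always append the most frequent next word."""
    words = prompt.split()
    for _ in range(n):
        followers = nexts.get(words[-1])
        if not followers:
            break
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

print(complete("this phone"))
# -> "this phone does not use pwm ."
# Fluent and confident, but chosen purely by word statistics;
# at no point did anything check a fact about any phone.
```

The output reads as a confident factual claim, but it was decided entirely by which words tend to follow which. A real model does the same thing at vastly larger scale, which is why it can state that a phone doesn’t use PWM and be flat wrong.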

