8 points

I use ChatGPT for any topic I’m curious about, and about half the time, when I double-check the answers, it turns out they’re wrong.

For example, I asked for a list of phones with screens that don’t use PWM. When I looked up the specs of the phones it recommended, it turned out they all had PWM, even though ChatGPT’s answer explicitly stated that each of these phones doesn’t use PWM. Why does it straight up lie?!

8 points

It’s not lying. It has no concept of context or truth. Autocomplete on steroids.
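To make “autocomplete on steroids” concrete, here’s a toy sketch (nowhere near a real model’s scale, and the word probabilities are made up for illustration, but the generation loop is the same idea): the model samples each next word from probabilities learned from text, and nothing in the loop ever checks the output against a phone’s actual spec sheet.

```python
import random

# Hypothetical next-word probabilities, standing in for what a language
# model learns from its training text. All numbers here are invented.
bigram = {
    "this":   {"phone": 0.6, "screen": 0.4},
    "phone":  {"does": 0.7, "has": 0.3},
    "screen": {"does": 1.0},
    "does":   {"not": 0.8, "use": 0.2},
    "not":    {"use": 1.0},
    "use":    {"PWM": 1.0},
    "has":    {"PWM": 1.0},
}

def generate(token, steps=5):
    out = [token]
    for _ in range(steps):
        options = bigram.get(token)
        if not options:
            break  # no learned continuation; stop
        # Pick the next word by probability -- fluency, never fact-checking.
        token = random.choices(list(options), weights=list(options.values()))[0]
        out.append(token)
    return " ".join(out)

print(generate("this"))  # e.g. "this phone does not use PWM" -- fluent, unverified
```

When “does not use PWM” is a statistically plausible continuation, the model will happily produce it; truth never enters the loop.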

