ChatGPT generates cancer treatment plans that are full of errors: study finds that ChatGPT provided false information when asked to design cancer treatment plans.

Researchers at Brigham and Women's Hospital found that cancer treatment plans generated by OpenAI's chatbot were full of errors.

1 point

Or they could feed the current model with a reputable source of medical information.
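Feeding the model a reputable source at query time is roughly the idea behind retrieval-augmented generation: retrieve passages from a vetted corpus and instruct the model to answer only from them. A minimal sketch of the prompt-building step, assuming a hypothetical vetted guideline corpus; the function name and example passage are illustrative, not from any real system:

```python
def build_grounded_prompt(question: str, passages: list[str]) -> str:
    """Combine retrieved reference passages with the user's question,
    instructing the model to answer only from the supplied sources."""
    context = "\n\n".join(
        f"[Source {i + 1}] {p}" for i, p in enumerate(passages)
    )
    return (
        "Answer using ONLY the sources below. "
        "If the sources do not contain the answer, say you don't know.\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

# Illustrative call with a made-up guideline passage:
prompt = build_grounded_prompt(
    "What is the first-line treatment?",
    ["Guideline X recommends treatment A as first-line therapy."],
)
```

As the replies below point out, this constrains what the model draws on but does not guarantee a correct answer.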

2 points

That wouldn’t guarantee correct answers.

It’s arguably more dangerous if ChatGPT gives mostly sound, specific medical advice, because it leads people to trust it more than they should.

1 point

True, but it would reduce the chances of it making things up entirely.


Technology

!technology@lemmy.world
