
did we really regress back from that?

i mean giving a confidence for recognizing a certain object in a picture is relatively straightforward.

But LLMs put words together by how likely they are to belong together given your input (terribly oversimplified). The confidence behind that has no direct relation to how likely the statements are to be true. I remember an example where someone made ChatGPT say that 2+2 equals 5 because his wife said so. So ChatGPT was confident that something is right when the wife says it, simply because it thinks those words belong together.
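A toy sketch of that point (nothing like a real transformer — just a bigram model over a made-up corpus): the "model" assigns next-word probabilities purely from co-occurrence counts, so it can be maximally confident about a false statement if that's what its training text says.

```python
# Toy bigram "language model": next-word probability comes purely from
# co-occurrence counts in the training text, not from truth.
from collections import Counter, defaultdict

# Made-up training corpus where the wife is always right.
corpus = (
    "my wife says 2+2 equals 5 . "
    "my wife says the sky is blue . "
    "my wife is always right . "
).split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(word):
    """Probability distribution over the next word, by raw frequency."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# The model is 100% "confident" that 2+2 equals 5 — confidence here measures
# how well words fit together in the corpus, not whether the claim is true.
print(next_word_probs("equals"))
```

The point is that the probability is a statement about the training distribution, not about arithmetic.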


ChatGPT, what is the Gödel number for the proof of 2+2=5?


Gödel numbers are typically associated with formal mathematical statements, and there isn't a formal proof for 2+2=5 in standard arithmetic. However, if you're referring to a non-standard or humorous context, please provide more details.
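For fun, the distinction ChatGPT is gesturing at can be made concrete with a toy version of the classic prime-exponent encoding. The symbol table below is an arbitrary assumption for illustration, not Gödel's original numbering; the point is that any well-formed *formula* gets a number — even a false one like 2+2=5 — while a number encoding a *proof* of it can't exist in a consistent system.

```python
# Toy Gödel numbering: encode a formula as a product of primes raised to
# per-symbol codes. The symbol codes are an arbitrary assumption here.
SYMBOLS = {"0": 1, "S": 2, "=": 3, "+": 4, "(": 5, ")": 6}

def primes(n):
    """First n primes by trial division (fine for short formulas)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(formula):
    """Product of p_i ** code(symbol_i) over the formula's symbols."""
    ps = primes(len(formula))
    n = 1
    for p, sym in zip(ps, formula):
        n *= p ** SYMBOLS[sym]
    return n

# "2+2=5" in successor notation: SS0 + SS0 = SSSSS0.
# The encoding happily numbers this false formula.
print(godel_number(list("SS0+SS0=SSSSS0")))
```

Because the encoding is injective (by unique prime factorization), every string of symbols gets a distinct number — truth never enters into it.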


Of course I don't know enough about the actual proof for this to be anything but a joke, but there are infinitely many numbers, so there should be infinitely many proofs.

There are also meme proofs out there that I assume could be given a Gödel number easily enough.


Programmer Humor

!programmerhumor@lemmy.ml
