Earlier this week I discussed an example of ChatGPT giving ‘placeholder’ answers in lieu of real answers. Below is an example of what that looks like. I could swear this didn’t use to happen, but it basically just ‘doesn’t’ answer your question. I’m interested in how often other people see this behavior.

2 points

I see this a lot too, I think ChatGPT anticipates the task being difficult?

I just try to be more specific, or have it expand on its placeholder.

1 point

Yeah, I’m kinda guessing it’s a cost-cutting measure to avoid more work on the part of the LLM.

1 point

Yeah, I’ve noticed lately it likes to be lazy.
