The same reason people use Google to look something up instead of going to the library.
Google returns sources that you can evaluate for accuracy.
ChatGPT just says things.
Every output of ChatGPT should end with “source: just trust me bro”.
ChatGPT said things I could evaluate, which I did by Googling them. And when I could not find the event in question, I went back to Lemmy and asked for more information. So tell me where I erred? Was it in not taking the poster’s word for it? Or in trying to get context in the first place?
You can retrieve sources from ChatGPT. And that is beside the point, since I didn’t simply rely on GPT. Even without prompting, I did my own digging on Google: found his wiki page, looked up articles about Paul and filming at a memorial, and only found the incident from 2002. That’s two more paths to sources that failed me.
ChatGPT is a tool that is useful if used right, but even I did not take its word for it.
ChatGPT often makes mistakes. They call them “hallucinations”. And at one point it completely made up court cases that got two lawyers sanctioned for citing them.
ChatGPT is not a search engine, no matter how much Bing tries to tell you it is.
Yep, no doubt. I have used ChatGPT extensively and have found it hallucinating on my own questions. That was not the case when it referred to the 2002 event, but I know it does that. It is a tool, like Google. And Google sometimes puts pseudoscience and conspiracy theories at the top of the list too when you are trying to fact-find. You have to know the limitations of what it is capable of.

Case in point: when I asked about this event, I didn’t assume GPT’s answer was correct. Google gave links exclusively to coverage of the 2002 event, completely ignoring the Vietnam portion of my query. And I still returned to ask the poster for more info to get context. I don’t know what more people could have wanted from me.