That’s what I did. I looked it up and did not find what I was looking for. The closest thing I found was the event in 2002, which was eerily close to the event you mentioned, but I could not find what you said. So I asked you for more information. Jeez! You snapped back at me like a Stack Overflow user.
I’m not the original commenter. And I didn’t want to offend you, I just wanted to give some friendly advice, because it irks me how many people seem to use ChatGPT for fact-finding. Chill.
Edit: If you search for “Paul Wellstone” and “Vietnam War Memorial” you’ll find what the OC was on about.
https://www.latimes.com/archives/la-xpm-2002-oct-26-na-wellstone26-story.html “Wellstone also staged a news conference in front of the Vietnam War Memorial on the National Mall, drawing the ire of many veteran groups. Wellstone later said the event was a mistake.”
Just an example of many
I do not understand why people ask ChatGPT for factual information.
The same reason people use Google to look something up instead of going to the library.
ChatGPT often makes mistakes. They call them “hallucinations”. At one point it completely made up court cases, which got two lawyers sanctioned for citing them.
https://www.forbes.com/sites/mattnovak/2023/05/27/lawyer-uses-chatgpt-in-federal-court-and-it-goes-horribly-wrong/
ChatGPT is not a search engine, no matter how much Bing tries to tell you it is.
Yep, no doubt. I have used ChatGPT extensively and have caught it hallucinating on my own questions. That was not the case when it referred to the 2002 event, but I know it does that. It is a tool, like Google. And Google sometimes puts pseudoscience and conspiracy theories at the top of the results when you’re trying to fact-find, too. You have to know the limitations of what it is capable of. Case in point: when I asked about this event, I didn’t assume the GPT answer was correct; Google gave links exclusively to coverage of the 2002 event, completely ignoring the Vietnam portion of my query. And I still returned to ask the poster for more info to get context. I don’t know what more people could have wanted from me.
“Wellstone also staged a news conference in front of the Vietnam War Memorial on the National Mall, drawing the ire of many veteran groups. Wellstone later said the event was a mistake.”
Fair enough, though editing the comment made it difficult to realize you did provide what I asked for. I never called anyone a liar. It’s just that when I went to ChatGPT for context on what the other poster said, it flat out told me it didn’t happen, which is uncharacteristic of ChatGPT’s sycophantic tendencies. Usually it takes a passive voice, or assumes I am making up a scenario and rolls with it. So I did Google it and found that he was not considered controversial, and if the wiki mentioned it, it was only as a footnote.
Everything I looked up pointed to the 2002 incident. Not sure why I am being dog-piled here.
Either way, it seems odd to reference this as a Paul Wellstone ‘event’.
His spelling suggests he’s more Reddit.
I mean, I was on Reddit before the API thing, so, ehh. Maybe I interpreted it more harshly than it was meant.