ChatGPT-Powered Bing Sued for Libel Over Its AI-Induced Hallucinations (reason.com)
Long-time Slashdot reader schwit1 shared this report from Reason.com:
When people search for Jeffery Battle in Bing, they get the following (at least sometimes; this is the output of a search that I ran Tuesday):
Jeffrey Battle, also known as The Aerospace Professor, is the President and CEO of Battle Enterprises, LLC, and its subsidiary The Aerospace Professor Company... Battle was sentenced to eighteen years in prison after pleading guilty to seditious conspiracy and levying war against the United States...
But it turns out that this combines facts about two separate people with similar names: (1) Jeffery Battle, who is indeed apparently a veteran, businessman, and adjunct professor, and (2) Jeffrey Leon Battle, who was convicted of trying to join the Taliban shortly after 9/11. The two have nothing in common other than their similar names. The Aerospace Professor did not plead guilty to seditious conspiracy....
[T]o my knowledge, this connection was entirely made up out of whole cloth by Bing's summarization feature (which is apparently based on ChatGPT); I know of no other site that actually makes any such connection (which I stress again is an entirely factually unfounded connection).
Battle is now suing Microsoft for libel over this...
No problem (Score:5, Funny)
Re: (Score:1)
Re: (Score:2, Informative)
You had to go digging for an example where a mistake was made and the wrong footage was aired. Even then, the mistake was acknowledged and corrected.
Fox "News", in contrast, tells countless intentional lies every day with corrections being few and far between.
They even argue in court that no one reasonable would believe their bullshit. But go ahead and pretend that they're both the same.
Re:AI is just following the trends (Score:5, Informative)
https://www.snopes.com/fact-ch... [snopes.com]
A nightly news program on ABC erroneously aired a video from a Kentucky gun range during a report about the outbreak of war in northern Syria between Turkey and U.S.-allied Kurds after U.S. President Donald Trump pulled American forces out.
The video depicting a fiery explosion was shown during "World News Tonight" on Oct. 13, 2019, and again the following day on "Good Morning America." ABC described the video as "appearing to show" the Turkish military bombarding Kurdish forces. In reality, it showed a night-time shooting event at a Kentucky gun range. ABC apologized for the mistake.
So your example of liberal news making things up is ABC accurately reporting on an event, but mistakenly using a video of a different event, and then apologizing as soon as the error was brought to their attention.
My example of Fox News making things up is Fox News paying out a $787 million settlement in a defamation lawsuit [wikipedia.org] but still refusing to apologize. Possibly because they don't want to antagonize their viewers by admitting the false reporting, and possibly because they don't want to admit wrongdoing before the second and potentially larger lawsuit that's still underway [wikipedia.org].
Truth is later than fiction (Score:5, Interesting)
Re: (Score:1)
ChatGPT works as advertised (Score:1)
Re:ChatGPT works as advertised (Score:5, Informative)
Unlike other lawsuits over generative AI, this one targets Microsoft for publishing the output as fact. This suit at least has merit.
Ambulance-chasing Lawyers At Work... (Score:3)
1. Microsoft is not advertising or promoting this information, so they're literally not telling anyone about it. People would have to proactively search for this kind of information and then associate it with the individual. (Microsoft's plausible deniability rests on the simple fact that every sentence is footnoted with source links, allowing any party to fact-check the composite search-results page or paragraph, as it were.)
2. There are literally NO DAMAGES incurred by this misattribution. They didn't lose a job, property, or court case on the basis of the composite information. Heck, they didn't even lose out on a bank loan, student scholarship, or citizenship application -- something that's entirely possible when a credit report accidentally combines information from two distinct persons with matching names!
3. They signed the complaint in June but only filed it in July, a month later. So, by their own admission, they delayed filing the complaint without re-verifying the accuracy of the details alleged a month earlier. This lack of due diligence could also sink their claim on a technicality: the allegation may have been accurate in June, but Microsoft (using a third party like the Wayback Machine) could demonstrate that by the time the complaint was filed, the offending output had already been corrected, rendering the argument moot.
The complaint also argues that Microsoft's BING did not comply with a cease-and-desist request by an arbitrary deadline. However, conducting the search myself did not produce the generative output in the manner their lawsuit alleges.
Re: (Score:2)
You're not wrong about the damages, but that's complicated somewhat by libel per se (assuming it qualifies as such in the relevant jurisdiction; I did not check). I agree with your other comments entirely, tho, and I don't even see how it could really succeed as a libel per se case given the larger issues.
Ridiculous... (Score:2)
Anyone who has used ChatGPT for any length of time understands well how inaccurate and useless it often is. I use it to distill documentation and code, and even with that ideal use, I believe it wouldn't be useful at all if I weren't already a subject matter expert!
Bing actually happy (Score:1)
"...See, we're big enough to be sued!"
How do we know they aren't the same person? (Score:2)
It doesn't take AI to screw this up (Score:2)
I immediately knew what happened: some data aggregator had dumped my dad's information into my credit file! His first
ChatGPT hallucinates history (Score:1)
I have learned not to trust ChatGPT. It did not know about the Prince of Wales who drowned in the English Channel. It told me a false tale of the life, children, and career of a child of Alma and Gustav Mahler who died in childhood. If I did not already know the history of these people, ChatGPT would have steered me wrong.