

Judges Are Fed Up With Lawyers Using AI That Hallucinates Court Cases (404media.co)
An anonymous reader quotes a report from 404 Media: After a group of attorneys were caught using AI to cite cases that didn't actually exist in court documents last month, another lawyer was told to pay $15,000 for his own AI hallucinations that showed up in several briefs. Attorney Rafael Ramirez, who represented a company called HoosierVac in an ongoing case where the Mid Central Operating Engineers Health and Welfare Fund claims the company is failing to allow the union a full audit of its books and records, filed a brief in October 2024 that cited a case the judge wasn't able to locate. Ramirez "acknowledge[d] that the referenced citation was in error," withdrew the citation, and "apologized to the court and opposing counsel for the confusion," according to Judge Mark Dinsmore, U.S. Magistrate Judge for the Southern District of Indiana. But that wasn't the end of it. An "exhaustive review" of Ramirez's other filings in the case showed that he'd included made-up cases in two other briefs, too. [...]
In January, as part of a separate case against a hoverboard manufacturer and Walmart seeking damages for an allegedly faulty lithium battery, attorneys filed court documents that cited a series of cases that don't exist. In February, U.S. District Judge Kelly demanded they explain why they shouldn't be sanctioned for referencing eight non-existent cases. The attorneys contritely admitted to using AI to generate the cases without catching the errors, and called it a "cautionary tale" for the rest of the legal world. Last week, Judge Rankin issued sanctions on those attorneys, according to new records, including revoking one of the attorneys' pro hac vice admission (a legal term meaning a lawyer can temporarily practice in a jurisdiction where they're not licensed) and removing him from the case; the three other attorneys on the case were fined between $1,000 and $3,000 each. The judge in the Ramirez case said that he "does not aim to suggest that AI is inherently bad or that its use by lawyers should be forbidden." In fact, he noted that he's a vocal advocate for the use of technology in the legal profession.
"Nevertheless, much like a chain saw or other useful [but] potentially dangerous tools, one must understand the tools they are using and use those tools with caution," he wrote. "It should go without saying that any use of artificial intelligence must be consistent with counsel's ethical and professional obligations. In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."
Q: How do you save a lawyer from drowning? (Score:5, Funny)
A: Shoot him before he hits the water.
This joke is not fabricated using AI.
Re: (Score:2)
As in anything else, a minority of lawyers are pretty good and know what they are doing. All those stupid lawyers would have to do is check whether the cases really exist and review them. Just use the silly AI output as hints that have to be verified. The same goes for any kind of information, really; it doesn't have to come from AI.
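The workflow that comment describes, treating AI output as unverified hints and checking every citation against an authoritative source before filing, can be sketched in a few lines. This is a toy illustration only: the case names and the "verified" set below are hypothetical stand-ins for a real citation database or search service.

```python
# Toy sketch of "verify before you file": AI-suggested citations are
# hints, and each one must be confirmed against an authoritative
# source. VERIFIED_REPORTER is a hypothetical stand-in for a real
# case-law database.

VERIFIED_REPORTER = {
    "Smith v. Jones, 123 F.3d 456",
    "Doe v. Acme Corp., 789 F.2d 1011",
}

def vet_citations(ai_suggested):
    """Split AI-suggested citations into confirmed and suspect lists."""
    confirmed = [c for c in ai_suggested if c in VERIFIED_REPORTER]
    suspect = [c for c in ai_suggested if c not in VERIFIED_REPORTER]
    return confirmed, suspect

ai_output = [
    "Smith v. Jones, 123 F.3d 456",          # exists in the toy database
    "Imaginary v. Hallucinated, 1 F.4th 1",  # made up: must not be filed
]
confirmed, suspect = vet_citations(ai_output)
print("safe to cite:", confirmed)
print("needs a human:", suspect)
```

The point is that the check is cheap; the only thing the sanctioned lawyers skipped was the lookup step.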
Re: (Score:1)
A: Shoot him before he hits the water.
This joke is not fabricated using AI.
Isn't that no different than what trump does every second he's alive?
It's called lying or fraud. You're using something that doesn't exist to try to represent something that does. Isn't it the lawyer's job to make sure he isn't committing fraud or outright lying? This goes with trump, also.
Why would you hire, trust or vote for someone you couldn't trust at all?
Re: Q: How do you save a lawyer from drowning? (Score:2)
It shouldn't be any more persuasive to some court somewhere than an AI hoovering up the internet and positing the same.
the application of actual intelligence (Score:4, Insightful)
There you have it, the courts have ruled that AI is not actually intelligent. But you already knew that.
A measly fine? (Score:1)
"Shepardizing" (Score:2)
This lawyer was failing in multiple ways. He clearly did not Shepardize his citations to ensure that they had not been overturned.
Headline is a lie as usual (Score:4, Insightful)
If you read the decision linked above, the headline is the opposite of the ruling. The judge clearly states that using AI is not just fine, but probably the future of the profession.
What the judge sanctions the lawyers for is not fact-checking the AI's output for hallucinations. That is, using an AI that occasionally hallucinates to draft documents is fine. Not checking its references for accuracy is not fine.
Re: (Score:2)
Bizarre edit considering that definitionally there's no need for such a sanction.
Judges and lawyers both are terrified of AI (Score:4, Interesting)
AI has the potential to bust that wide open. An AI could find case law in seconds that you would normally pay tens of thousands of dollars for a lawyer to know about and leverage to win your case. They could find all the little tricks that you would pay a lawyer big bucks for and do it instantaneously. And they could apply complex rules that we normally need a judge with a ton of training to do.
I don't think there is anything, even programming, that is more ripe to be eliminated by AI. And lawyers know it and they're damn well going to make sure they protect themselves.
Re: (Score:2)
An AI can clearly make up case law that doesn't exist in seconds.
Basing your legal defence on made up case law is pretty risky.
So right now all the effort (Score:2)
That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.
Re: (Score:2)
Maybe they'll improve but unless you suck as a developer you'll be better off with normal Google searches an
Re: (Score:2)
In the old days they could start businesses and compete and make new products but now with a complete lack of antitrust law enforcement and the banks having all the cap
Re: (Score:2)
That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.
How are they going to stop it from happening? They don't control the technology (that's what the programmers do, remember?), and anybody is allowed to represent themself in court without an attorney. So the lawyers have already lost that battle.
But back to reality: until somebody can create an AI that demonstrably, provably, does not hallucinate case citations, anybody relying on AI output is taking a huge risk. Even if AI is proven to never hallucinate citations, somebody (hopefully a lawyer!) will stil
Re: (Score:2)
An AI can clearly make up case law that doesn't exist in seconds.
Basing your legal defence on made up case law is pretty risky.
A paralegal could do the same. Ultimately, the lawyer in charge is responsible for everything presented to the court/judge ...
Re: (Score:2)
The parent comment was implying you don't need a lawyer if you have a fancy AI. Or a judge.
I wouldn't want an AI judge making up laws and precedents in cases.
Re: (Score:2)
I wouldn't want an AI judge making up laws and precedents in cases.
Indeed. Besides, we have SCOTUS for that. :-)
Never were, never will be. (Score:2)
case against a hoverboard manufacturer
Sideways skateboards are not "hoverboards".
"In other words:" (Score:2)
"In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."
Or, in layman's terms: "Stop being a dumbass and quit fucking up at your high-paying job, Mr. Attorney."
Er ... (Score:1)
... couldn't lawyers make up stuff manually before? I mean perish the thought, but I think it was in fact possible.
Re: (Score:2)
Lazy, greedy assholes (Score:2)
Re: Lazy, greedy assholes (Score:2)
Zealous representation is actually a higher ethical priority than candor in every US state I'm aware of.
Judges can't deal with current tech. Fine them!!! (Score:2)
Meaning they have no automated and effective way to know whether a lawyer is lying about a citation or not. Meaning they're not doing their own jobs effectively, and instead of fixing that, they're going to penalize someone for a mistake that they DID catch.
Unrealistic (Score:2)
In an ideal world the judge would know all the case law pertaining to every case and so be able to rule on them without difficulty. In practice it is unreasonable to expect a judge to know everything, so they are dependent on lawyers to quote case law accurately to establish their case. In theory the judge's clerks could be expected to confirm the validity of the quoted precedents, but in practice that isn't going to happen. So the attorney for the other side should be encouraged to do the checking, and the
Judges Are Fed Up? (Score:2)