Judges Are Fed Up With Lawyers Using AI That Hallucinate Court Cases (404media.co)

An anonymous reader quotes a report from 404 Media: After a group of attorneys were caught using AI to cite cases that didn't actually exist in court documents last month, another lawyer was told to pay $15,000 for his own AI hallucinations that showed up in several briefs. Attorney Rafael Ramirez, who represented a company called HoosierVac in an ongoing case where the Mid Central Operating Engineers Health and Welfare Fund claims the company is failing to allow the union a full audit of its books and records, filed a brief in October 2024 that cited a case the judge wasn't able to locate. Ramirez "acknowledge[d] that the referenced citation was in error," withdrew the citation, and "apologized to the court and opposing counsel for the confusion," according to Judge Mark Dinsmore, U.S. Magistrate Judge for the Southern District of Indiana. But that wasn't the end of it. An "exhaustive review" of Ramirez's other filings in the case showed that he'd included made-up cases in two other briefs, too. [...]

In January, as part of a separate case against a hoverboard manufacturer and Walmart seeking damages for an allegedly faulty lithium battery, attorneys filed court documents that cited a series of cases that don't exist. In February, U.S. District Judge Kelly demanded they explain why they shouldn't be sanctioned for referencing eight non-existent cases. The attorneys contritely admitted to using AI to generate the cases without catching the errors, and called it a "cautionary tale" for the rest of the legal world. Last week, Judge Rankin imposed sanctions on those attorneys, according to new records, including revoking one of the attorneys' pro hac vice admission (a legal term meaning a lawyer can temporarily practice in a jurisdiction where they're not licensed) and removing him from the case; the three other attorneys on the case were fined between $1,000 and $3,000 each.
The judge in the Ramirez case said that he "does not aim to suggest that AI is inherently bad or that its use by lawyers should be forbidden." In fact, he noted that he's a vocal advocate for the use of technology in the legal profession.

"Nevertheless, much like a chain saw or other useful [but] potentially dangerous tools, one must understand the tools they are using and use those tools with caution," he wrote. "It should go without saying that any use of artificial intelligence must be consistent with counsel's ethical and professional obligations. In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."

Comments:
  • by haxor.dk ( 463614 ) on Tuesday March 04, 2025 @04:47PM (#65211141)

    A: Shoot him before he hits the water.

    This joke is not fabricated using AI.

    • by ls671 ( 1122017 )

As in anything else, a minority of lawyers are pretty good and know what they are doing. All these lawyers would have had to do is check whether the cited cases really exist and review them. Just use the silly AI as a source of hints that have to be verified. The same goes for any type of information, really; it doesn't have to come from AI.

    • A: Shoot him before he hits the water.

      This joke is not fabricated using AI.

Isn't that the same as what Trump does every second he's alive?

It's called lying or fraud. You're using something that doesn't exist to try to represent something that does. Isn't it the lawyer's job to make sure he isn't committing fraud or outright lying? The same goes for Trump.

      Why would you hire, trust or vote for someone you couldn't trust at all?

      • Is it lying or fraud? There are only two purposes to case citation: 1) a superior, controlling court has decided this and therefore you must as well; and 2) when I make my logical argument, I'm going to add some italics, underlining, and parentheses as if you've never heard of a citation to authority fallacy.

It shouldn't be any more persuasive coming from some court somewhere than from an AI hoovering up the internet and positing the same.
  • by awwshit ( 6214476 ) on Tuesday March 04, 2025 @04:52PM (#65211157)

    There you have it, the courts have ruled that AI is not actually intelligent. But you already knew that.

A fine that they'll make back in 2 hours? No. Disbarred immediately. JFC, why haven't we implemented a rule for being caught doing shit like this? Oh yeah... the people who make the rules are giving the AI sector full rein.
This lawyer was failing in multiple ways. He clearly did not Shepardize his citations to ensure that they had not been overturned.

  • by Luckyo ( 1726890 ) on Tuesday March 04, 2025 @05:01PM (#65211193)

If you read the decision linked above, the headline is the opposite of the ruling. The judge clearly states that using AI is not just fine, but probably the future of the profession.

What the judge sanctioned the lawyers for is not fact-checking the AI's output for hallucinations. That is, using an AI that occasionally hallucinates to draft documents is fine; not checking the references it produces to see whether they're accurate is not.

  • by rsilvergun ( 571051 ) on Tuesday March 04, 2025 @05:02PM (#65211197)
Judges and lawyers exist to apply extremely complex rules that themselves exist in order to get around weaselly attempts to avoid complying with laws by using language tricks.

    AI has the potential to bust that wide open. An AI could find case law in seconds that you would normally pay tens of thousands of dollars for a lawyer to know about and leverage to win your case. They could find all the little tricks that you would pay a lawyer big bucks for and do it instantaneously. And they could apply complex rules that we normally need a judge with a ton of training to do.

    I don't think there is anything, even programming, that is more ripe to be eliminated by AI. And lawyers know it and they're damn well going to make sure they protect themselves.
    • An AI can clearly make up case law that doesn't exist in seconds.

      Basing your legal defence on made up case law is pretty risky.

Right now the focus is on replacing programmers, not lawyers. But that's going to change once they finish up with programmers. Better models will avoid inventing case law and instead just find and summarize existing case law.

        That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.
        • by Luthair ( 847766 )
I think I've tried LLMs about 10 times over the past 1.5 years for code generation and questions. All of the code it generated contained hallucinations, the tests it generated contained hallucinations and weren't meaningful, it failed to produce a working build, and every question it answered was wrong. To be frank, this doesn't surprise me if they used StackOverflow as a data source.

          Maybe they'll improve but unless you suck as a developer you'll be better off with normal Google searches an

Ars Technica has a detailed article from a professional programmer describing his use of LLMs to write code. Perhaps for your exact, very specific use case LLMs can't yet easily replicate your code. That doesn't matter, because even if you've found a niche for yourself, that niche is about to get flooded with millions of unemployed programmers.

            In the old days they could start businesses and compete and make new products but now with a complete lack of antitrust law enforcement and the banks having all the cap
        • by kmoser ( 1469707 )

          That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.

          How are they going to stop it from happening? They don't control the technology (that's what the programmers do, remember?), and anybody is allowed to represent themself in court without an attorney. So the lawyers have already lost that battle.

          But back to reality: until somebody can create an AI that demonstrably, provably, does not hallucinate case citations, anybody relying on AI output is taking a huge risk. Even if AI is proven to never hallucinate citations, somebody (hopefully a lawyer!) will stil

      • An AI can clearly make up case law that doesn't exist in seconds.
        Basing your legal defence on made up case law is pretty risky.

A paralegal could do the same. Ultimately, the lawyer in charge is responsible for everything presented to the court/judge ...

The parent comment was implying you don't need a lawyer if you have a fancy AI. Or a judge.
I wouldn't want an AI judge making up laws and precedents in cases.

I wouldn't want an AI judge making up laws and precedents in cases.

            Indeed. Besides, we have SCOTUS for that. :-)

  • case against a hoverboard manufacturer

    Sideways skateboards are not "hoverboards".

  • "In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."

Or, in layman's terms: "Stop being a dumbass and quit fucking up at your high-paying job, Mr. Attorney."

  • ... couldn't lawyers make up stuff manually before? I mean perish the thought, but I think it was in fact possible.

Of course, and if they got caught, which they would, the punishment was a lot worse than a small fine. After all, if the opposing attorney is doing his job, he will check any cases cited in an opposing brief to determine what they actually say. What you are suggesting is fraud, versus this, which is merely incompetence.
I've done IT contracts for lawyers' offices. They're all exactly as inept with technology, lazy, slow, and overpaid as you suspect they might be. I'm sure the top 10% of lawyers are efficient, smart, and great, but not the rest! Using AI to do your job and not taking 5 seconds to see whether a case actually exists, when you're paying them $200+/hr to do it? Someone should be disbarred.
  • Meaning they have no automated and effective way to know if a lawyer is lying about a citation or not. Meaning they're not doing their own jobs effectively, and to point that out... they're going to penalize someone for a mistake that they DID catch.

    • In an ideal world the judge would know all the case law pertaining to every case and so be able to rule on them without difficulty. In practice it is unreasonable to expect a judge to know everything, so they are dependent on lawyers to quote case law accurately to establish their case. In theory the judge's clerks could be expected to confirm the validity of the quoted precedents, but in practice that isn't going to happen. So the attorney for the other side should be encouraged to do the checking, and the

Judges don't seem too fed up. They didn't do anything meaningful about it!
