AI | The Courts

Judges Are Fed Up With Lawyers Using AI That Hallucinate Court Cases (404media.co)

An anonymous reader quotes a report from 404 Media: After a group of attorneys were caught using AI to cite cases that didn't actually exist in court documents last month, another lawyer was told to pay $15,000 for his own AI hallucinations that showed up in several briefs. Attorney Rafael Ramirez, who represented a company called HoosierVac in an ongoing case where the Mid Central Operating Engineers Health and Welfare Fund claims the company is failing to allow the union a full audit of its books and records, filed a brief in October 2024 that cited a case the judge wasn't able to locate. Ramirez "acknowledge[d] that the referenced citation was in error," withdrew the citation, and "apologized to the court and opposing counsel for the confusion," according to Judge Mark Dinsmore, U.S. Magistrate Judge for the Southern District of Indiana. But that wasn't the end of it. An "exhaustive review" of Ramirez's other filings in the case showed that he'd included made-up cases in two other briefs, too. [...]

In January, as part of a separate case against a hoverboard manufacturer and Walmart seeking damages for an allegedly faulty lithium battery, attorneys filed court documents that cited a series of cases that don't exist. In February, U.S. District Judge Kelly demanded they explain why they shouldn't be sanctioned for referencing eight non-existent cases. The attorneys contritely admitted to using AI to generate the cases without catching the errors, and called it a "cautionary tale" for the rest of the legal world. Last week, according to new records, Judge Rankin issued sanctions on those attorneys, revoking one attorney's pro hac vice admission (a legal term meaning a lawyer can temporarily practice in a jurisdiction where they're not licensed) and removing him from the case; the three other attorneys on the case were fined between $1,000 and $3,000 each.
The judge in the Ramirez case said that he "does not aim to suggest that AI is inherently bad or that its use by lawyers should be forbidden." In fact, he noted that he's a vocal advocate for the use of technology in the legal profession.

"Nevertheless, much like a chain saw or other useful [but] potentially dangerous tools, one must understand the tools they are using and use those tools with caution," he wrote. "It should go without saying that any use of artificial intelligence must be consistent with counsel's ethical and professional obligations. In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."

Comments Filter:
  • by haxor.dk ( 463614 ) on Tuesday March 04, 2025 @05:47PM (#65211141)

    Q: How do you stop a lawyer from drowning?

    A: Shoot him before he hits the water.

    This joke is not fabricated using AI.

    • by ls671 ( 1122017 ) on Tuesday March 04, 2025 @06:11PM (#65211223) Homepage

      As with anything else, a minority of lawyers are pretty good and know what they are doing. All those stupid lawyers would have to do is check whether the cases really exist and review them. Just use the silly AI as a source of hints that have to be verified. Same thing for any type of information, really; it doesn't have to come from AI.

  • by awwshit ( 6214476 ) on Tuesday March 04, 2025 @05:52PM (#65211157)

    There you have it, the courts have ruled that AI is not actually intelligent. But you already knew that.

    • I am not a lawyer. Can this actually be used as precedent to that effect?
    • by Ksevio ( 865461 )

      More that they've said lawyers using AI tools poorly is not intelligent

    • The courts have ruled that the lawyers are not intelligent. The LLM is a tool and they didn't know how to use it properly. If you keep bashing a screw with a hammer, you can usually get it into wood. But it still won't do the job it was supposed to.

  • That they'll make back in 2 hours? No. Disbarred immediately. JFC, why haven't we implemented a rule for being caught doing shit like this? Oh yeah... the people who make the rules are giving the AI sector free rein.
    • That was my reaction too. Use an AI to hallucinate fake cases that get you a win in court? Charge huge legal fees for your win and laugh all the way to the bank. Get caught doing it? Pay a piddling fine and get another chance to try it the next time round. Imagine if the criminal justice system worked like that: "OK, we caught you this time but have another go next week, you might get away with it then".
    • If we react too hastily, then suddenly we'll have no lawyers left. And then.....And then....OK, we can go ahead.

  • by whoever57 ( 658626 ) on Tuesday March 04, 2025 @06:00PM (#65211189) Journal

    This lawyer was failing in multiple ways. He clearly did not Shepardize his citations to ensure that they had not been overturned.

  • by Luckyo ( 1726890 ) on Tuesday March 04, 2025 @06:01PM (#65211193)

    If you read the decision linked above, the headline is the opposite of the ruling. The judge clearly states that using AI is not just fine, but probably the future of the profession.

    What the judge sanctioned the lawyers for is not fact-checking the AI's findings for hallucinations. I.e., using an AI that occasionally hallucinates to draft documents is fine. Not checking the references to see whether they're accurate is not fine.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      If you read the decision linked above, the headline is the opposite of the ruling. The judge clearly states that using AI is not just fine, but probably the future of the profession.

      What the judge sanctioned the lawyers for is not fact-checking the AI's findings for hallucinations. I.e., using an AI that occasionally hallucinates to draft documents is fine. Not checking the references to see whether they're accurate is not fine.

      - This particular set of lawyers got greedy. Rather than underpaying paralegals to do their work for them, they tried to substitute AI to do the same work. With disastrous results.
      - The fines seem minuscule overall. If I can replace 10 paralegals per year and at worst only pay fines which amount to the cost of 1 paralegal... what's the incentive not to do this?*
      - *In full fairness, this is asking the question only of morally abhorrent/reprehensible attorneys. Many good ones out there who would neve

      • by Luckyo ( 1726890 )

        No, they didn't get greedy. They got lazy. When it's your name on the filing, you are responsible for what is in it. So you should check what's in it. That's what every lawyer will tell you.

    • by necro81 ( 917438 )

      AI that occasionally hallucinates and using it to draft documents is fine. Not checking the references to see if they're accurate or not is not fine.

      I could see pitfalls with this, too. Sure, the AI may reference a genuine case that actually exists, rather than just a hallucination, but who's to say whether the AI's interpretation of the cited case law is actually correct? One would expect the AI summary to be correct based on the mass of training data available, but that's hardly a guarantee.

      As an exam

      • by Luckyo ( 1726890 )

        The expectation is that you read the documents you file, and you agree with things stated in the documents.

        Unironically, that is what legal advice given by lawyers is. It naturally follows that if a lawyer delegates the formulation of legal arguments to someone or something else, that lawyer should at minimum verify that the arguments are sound and the citations are correct.

        This is literally what lawyers are paid for. And notably what they were fined for in this case. Because the fine didn't come from not checking th

  • by rsilvergun ( 571051 ) on Tuesday March 04, 2025 @06:02PM (#65211197)
    Judges and lawyers exist to apply extremely complex rules that themselves exist in order to defeat weaselly attempts to avoid complying with laws by using language tricks.

    AI has the potential to bust that wide open. An AI could find case law in seconds that you would normally pay tens of thousands of dollars for a lawyer to know about and leverage to win your case. They could find all the little tricks that you would pay a lawyer big bucks for and do it instantaneously. And they could apply complex rules that we normally need a judge with a ton of training to do.

    I don't think there is anything, even programming, that is more ripe to be eliminated by AI. And lawyers know it and they're damn well going to make sure they protect themselves.
    • An AI can clearly make up case law that doesn't exist in seconds.

      Basing your legal defence on made up case law is pretty risky.

      • The focus right now is on replacing programmers, not lawyers. But that's going to change once they finish up with programmers. Better models will avoid creating case law and instead just find and summarize existing case law.

        That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.
        • by Luthair ( 847766 )
          I think I've tried LLMs about 10 times over the past 1.5 years for code generation and questions. All of the code generation contained hallucinations; the tests they generated both contained hallucinations and didn't provide meaningful coverage; they failed to generate a working build; and every question they answered, they answered wrong. To be frank, this doesn't surprise me if they used StackOverflow as a data source.

          Maybe they'll improve but unless you suck as a developer you'll be better off with normal Google searches an

          • Ars Technica has a detailed article from a professional programmer describing his use of LLMs to write code. Perhaps for your exact, very specific use case LLMs can't yet easily replicate your code. That doesn't matter, because even if you found a niche for yourself, that niche is about to get flooded with millions of unemployed programmers.

            In the old days they could start businesses and compete and make new products but now with a complete lack of antitrust law enforcement and the banks having all the cap
            • by Zurk ( 37028 )

              That's literally the dumbest take on anything I've ever read. Are you sure you're not an AI? AIs are pretty worthless except for garbage tasks. They hallucinate 30% of the time. You need to feed answers back to them hundreds of times to get anything accurate, and then the models sometimes blow up. Yeah, they are useful for drafting time sheets and mundane BS, but they aren't going to replace good coders anytime soon.

            • by wed128 ( 722152 )
              Most of the "Sheer force of effort" I see is copying and pasting from Stack Overflow. I'm fine.
        • by kmoser ( 1469707 )

          That said I think lawyers, unlike programmers, are smart enough to see when their jobs are at stake and aren't going to let it happen.

          How are they going to stop it from happening? They don't control the technology (that's what the programmers do, remember?), and anybody is allowed to represent themselves in court without an attorney. So the lawyers have already lost that battle.

          But back to reality: until somebody can create an AI that demonstrably, provably, does not hallucinate case citations, anybody relying on AI output is taking a huge risk. Even if AI is proven to never hallucinate citations, somebody (hopefully a lawyer!) will stil

      • An AI can clearly make up case law that doesn't exist in seconds.
        Basing your legal defence on made up case law is pretty risky.

        A paralegal could do the same. Ultimately, the lawyer in charge is responsible for everything presented to the court/judge ...

        • The parent comment was implying you don't need a lawyer if you have a fancy AI. Or a judge.
          I wouldn't want an AI judge making up laws and precedents in cases.

          • I wouldn't want an AI judge making up laws and precedents in cases.

            Indeed. Besides, we have SCOTUS for that. :-)

  • case against a hoverboard manufacturer

    Sideways skateboards are not "hoverboards".

    • Kind of a ridiculous misnomer, right? It's the bottom of a Segway. Damn thing doesn't hover a frikkin micrometer.
  • "In other words, the use of artificial intelligence must be accompanied by the application of actual intelligence in its execution."

    Or, in layman's terms: "Stop being a dumbass and quit fucking up at your high-paying job, Mr. Attorney."

  • ... couldn't lawyers make up stuff manually before? I mean perish the thought, but I think it was in fact possible.

    • Of course, and if they got caught (which they would), the punishment was a lot worse than a small fine. After all, if the opposing attorney is doing his job, he will check any cases cited in an opposing brief to determine what they actually say. What you are suggesting is fraud, versus this, which is merely incompetence.
  • I've done IT contracts for lawyers' offices. They're all exactly as inept with technology, lazy, slow, and overpaid as you suspect they might be. I'm sure the top 10% of lawyers are efficient, smart, and great, but not the rest! Using AI to do your job and not taking 5 seconds to see whether a case actually exists when someone is paying you $200+/hr to do it? Someone should be disbarred.
  • Meaning they have no automated and effective way to know if a lawyer is lying about a citation or not. Meaning they're not doing their own jobs effectively, and to point that out... they're going to penalize someone for a mistake that they DID catch.

    • In an ideal world the judge would know all the case law pertaining to every case and so be able to rule on them without difficulty. In practice it is unreasonable to expect a judge to know everything, so they are dependent on lawyers to quote case law accurately to establish their case. In theory the judge's clerks could be expected to confirm the validity of the quoted precedents, but in practice that isn't going to happen. So the attorney for the other side should be encouraged to do the checking, and the

  • by oldgraybeard ( 2939809 ) on Tuesday March 04, 2025 @08:39PM (#65211529)
    Judges don't seem too fed up. They didn't do anything meaningful about it!
  • by joe_frisch ( 1366229 ) on Tuesday March 04, 2025 @11:34PM (#65211737)

    I can understand the early mistakes, before people were aware of the ways in which AI can make mistakes, but it's been long enough that there is no longer any excuse. A lawyer who uses AI without verifying the results is endangering his clients and committing perjury.

    The same applies to any industry where the results can have a major impact on people's lives. I don't want to see engineers designing structures based on AI-supplied calculations.

    • Agree. What method a lawyer used to come up with a false document is irrelevant. AI should be considered to be a tool similar to a word processor. A lawyer is completely in control of what they submit. It's reckless and it's perjury.
    • by Mal-2 ( 675116 )

      It has been quite a while for those who are paying attention to the AI field. For everyone else, it's only been "a couple years," and most people are not prepared for tech that goes from ELIZA to Skynet in less than a decade. The faster change accelerates, the less humans can stay in the loop on all of it.

  • Couldn't you take the legal brief generated by an AI, and feed it into ChatGPT asking "please look up all of the cases cited in this brief and give a URI for each"? Personally I feel that getting AI to check for errors in work is much more useful than getting it to write the work itself.
    • by Mal-2 ( 675116 )

      The kind of confirmation you seek is a job for tools like Deep Research, not another LLM of the same type as the one that hallucinated. Such tools have only become available to the public in the last month or so.
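[Ed. note: a deterministic first pass along these lines doesn't need an LLM at all. Below is a minimal, hypothetical sketch (the function name is ours, and the reporter list is illustrative, not exhaustive) that pulls reporter-style citation strings out of a brief so each one can then be looked up in a real case database by a human or a search API:]

```python
import re

# Match citations of the form "<volume> <reporter> <page>",
# e.g. "410 U.S. 113". The reporter alternation below covers only
# a few common reporters for illustration.
CITATION_RE = re.compile(
    r"\b(\d{1,4})\s+"                                        # volume
    r"(U\.S\.|S\. Ct\.|F\.2d|F\.3d|F\. Supp\. \d[a-z]*)\s+"  # reporter
    r"(\d{1,5})\b"                                           # first page
)

def extract_citations(text: str) -> list[str]:
    """Return each citation string found in `text`, in order."""
    return [" ".join(m.groups()) for m in CITATION_RE.finditer(text)]

brief = (
    "See Roe v. Wade, 410 U.S. 113 (1973); "
    "compare Smith v. Jones, 999 F.3d 1234 (7th Cir. 2021)."
)
print(extract_citations(brief))  # → ['410 U.S. 113', '999 F.3d 1234']
```

[Every extracted string still has to be verified against an authoritative source; the point is only that finding the citations to check is mechanical, so "I didn't notice the cite" is not much of an excuse.]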

  • Forget about the lawyers and the AI. Why on earth would any company allow a union to do a full audit of its books and records? If the business allowed that to be negotiated in to a Union contract then they deserve to go out of business.
