AI The Courts Robotics

A Robot Was Scheduled To Argue In Court, Then Came the Jail Threats (npr.org) 115

schwit1 shares a report from NPR: A British man who planned to have a "robot lawyer" help a defendant fight a traffic ticket has dropped the effort after receiving threats of possible prosecution and jail time. [...] The first-ever AI-powered legal defense was set to take place in California on Feb. 22, but not anymore. As word got out, an uneasy buzz began to swirl among various state bar officials, according to Browder. He says angry letters began to pour in. "Multiple state bar associations have threatened us," Browder said. "One even said a referral to the district attorney's office and prosecution and prison time would be possible." In particular, Browder said one state bar official noted that the unauthorized practice of law is a misdemeanor in some states punishable up to six months in county jail.

"Even if it wouldn't happen, the threat of criminal charges was enough to give it up," [said Joshua Browden, the CEO of the New York-based startup DoNotPay]. "The letters have become so frequent that we thought it was just a distraction and that we should move on." State bar associations license and regulate attorneys, as a way to ensure people hire lawyers who understand the law. Browder refused to cite which state bar associations in particular sent letters, and what official made the threat of possible prosecution, saying his startup, DoNotPay, is under investigation by multiple state bar associations, including California's.
"The truth is, most people can't afford lawyers," he said. "This could've shifted the balance and allowed people to use tools like ChatGPT in the courtroom that maybe could've helped them win cases."

"I think calling the tool a 'robot lawyer' really riled a lot of lawyers up," Browder said. "But I think they're missing the forest for the trees. Technology is advancing and courtroom rules are very outdated."
  • by MikeDataLink ( 536925 ) on Thursday January 26, 2023 @07:26PM (#63243501) Homepage Journal

    Eventually we WILL have to deal with AI/robot lawyers (and doctors, etc.), and a precedent will have to be set.

    And even if we ban AI robot lawyers, there's nothing stopping a human lawyer from having the AI lawyer give him advice from his pocket and airpods.

    • by scourfish ( 573542 ) <scourfish@@@yahoo...com> on Thursday January 26, 2023 @07:34PM (#63243523)
      I have no sympathy for lawyers getting automated out of their jobs.
      • If the law becomes so complicated humans can't understand it, we'll have to take AI's word as to what the law says and means.

    • by Brain-Fu ( 1274756 ) on Thursday January 26, 2023 @07:45PM (#63243537) Homepage Journal

      Oh, having a human lawyer use a computer as a legal research tool is totally fine. What has them all angry is the threat of replacing human lawyers with robots. As a rule, automation is fine so long as it isn't my job being automated. This step was a direct threat to the income of these associations, which is why threats of jail time started to fly.

      • by Opportunist ( 166417 ) on Friday January 27, 2023 @05:00AM (#63244143)

        Nobody complains about automating away their work, what they fear is automating away their paycheck.

      • by AmiMoJo ( 196126 )

        They are worried about people having access to legal advice too. From what I understand, the situation with parking fines is that you can often get out of them if you are willing to put in the effort of going to court and dealing with the hard-to-navigate system. The cop needs to turn up to be a witness. If everyone did it, either traffic cops would spend all day in court or they would lose a huge source of revenue.

        An app that makes it easy to contest tickets is a threat. If everyone suddenly has access to a

        • The obvious solution to robo-lawyers causing an increase in case load, is to introduce robo-judges working in virtual court rooms.

          That does however pose the risk of laws being applied to everyone equally and fairly.

          • Only until they can make robo-judges capable of golfing. Then we can get back in the swing of things. Yes, that was on purpose. I'll see myself out.

        • by Ed Tice ( 3732157 ) on Friday January 27, 2023 @11:55AM (#63244713)
          Most parking tickets are written by meter maids, not by cops. Most parking tickets (and traffic tickets) are legitimate in that the defendant is guilty. However, in order to avoid the situation you describe, most are handled via plea deals: i.e., you were speeding at +20, which is pretty serious, but you can plead guilty to +10, pay a fine, and move on with your life. If everybody were to actually go to court, they would have to face the original charge (+20 mph and the like) and the penalties are quite severe. The difference in revenue from a +20 ticket and a +10 ticket covers the cost of the police time in court.

          Parking tickets are a bit worse in that, if you are found guilty, there are often court fees that add up to as much as the original fine. So you really should not fight a ticket if you are guilty.

          I did once successfully, by myself, fight a parking ticket that was improper. The original ticket was written for a green van with one license plate digit different than mine. I had a red two-seater. Somehow, though, when the ticket was entered into the computer, a typographical error assigned it to me. I didn't know until I got the unpaid-ticket notice. I was able to resolve that by myself by just going to the court and talking to the judge. But my case was, arguably, simpler since the original ticket showed it wasn't my license. It would have been harder if I were actually guilty (because, well, guilty people are supposed to pay their fines).

      • by mjwx ( 966435 )

        Oh, having a human lawyer use a computer as a legal research tool is totally fine. What has them all angry is the threat of replacing human lawyers with robots. As a rule, automation is fine so long as it isn't my job being automated. This step was a direct threat to the income of these associations, which is why threats of jail time started to fly.

        Pretty much.

        It won't stop it, though; the best they can hope for is for humans to oversee the automation, making sure it doesn't make mistakes, as we do with long-haul automated trains (simply because if the automation does fail, millions of dollars of cargo sits there for hours whilst someone drives out to the middle of nowhere).

    • Yep, eventually this will happen, but before then a LOT of legal issues will need to be sorted out, like liability and consequences. A lawyer operates under certain rules, and breaching those rules can have serious legal and financial consequences; the lawyer understands, and potentially fears, what happens to them if they break those rules, whereas the AI bot only knows what it is told to do, so someone has to take responsibility for ensuring it follows those rules.
      • Yep eventually this will happen, but before then a LOT of legal issues will need to be sorted

        Since when are legal issues sorted out first in today’s environment? Uber, Airbnb, and plenty of other companies have business models predicated on the law taking years to sort out whether what they’re doing is legal after they’ve already made their millions/billions and have shifted the discourse in their favor.

        • by Mitreya ( 579078 )

          Uber, Airbnb, and plenty of other companies have business models predicated on the law taking years to sort out whether what they're doing is legal

          Oh, not that. I think the biggest issue is the liability problem. Who is responsible when the attorney/doctor AI screws up? Developers? Manufacturer?
          With Uber, they had to figure out the insurance situation for drivers, even if the legality of Uber itself may be unclear. But insuring AI will be trickier.

        • The difference is that judges have a lot of discretion in running their court so they could probably hold the bot or whoever runs it in contempt of court if it does something stupid.

      • Where I live a lawyer has to be admitted to the bar to represent anybody at all in court.
        • So could the same AI that would have been used in court also be used to take the bar exam? Maybe that would provide a precedent.
        • That's why lawyers are expensive.

          It works the same with plumbers and electricians under state licensing. An electrician just quoted us $24,000 to install new plugs and wires in a few rooms. Probably 5 days of work. I'll be doing it myself. $600/hr strikes me as being unreasonable.

          • Yep - and they don't fix the walls when they are done. You have to get someone else to patch up the sheetrock. At least that is how it is with plumbers.

        • by AmiMoJo ( 196126 )

          Can you take the bar exam remotely, or with a laptop in the room? Seems like there is an opportunity for someone to qualify by having the AI pass the exam and using it exclusively in practice. Sell their services for half the price of a normal lawyer, and the only work they need to do is feed documents to the AI.

        • I thought in most places you could still represent yourself, without being a lawyer. Maybe we'll get more people representing themselves, and asking an AI not-a-lawyer for unofficial advice.

          • Yes, you can represent yourself. What you can't do is call yourself a lawyer and represent other people.
            An AI can't either because it hasn't passed the bar.
    • Let's hope so, at least on the doctor front. I should be able to get a mole looked at by an AI for $40 instead of a dermatologist making $400k a year. Or an X-ray reviewed by an AI instead of a radiologist. Both are cases where AI can perform as good as or better than doctors.
      • Until the AI fucks up and you find out later you've got stage 4 skin cancer or lung cancer.

        Then you'll need a team of human doctors and a human lawyer to sue whomever you got the AI doctor from.

        • by Brain-Fu ( 1274756 ) on Thursday January 26, 2023 @09:42PM (#63243737) Homepage Journal

          Humans screw up and misdiagnose. It's common enough to be a concern, and it's why people get second opinions. AI that misdiagnoses more often than humans won't be relied upon. Once the AI is better than the humans, then it will be relied upon and be more affordable.

          • Yes people get second opinions and should. They are opinions after all. They also have their medical license at risk, pay a fuck ton for insurance and so on.

            Who loses their license when an AI fucks up? Who is paying for malpractice insurance?

          • ...people get second opinions

            Can't they get second opinions from AIs as well? I mean, there would probably be multiple companies providing X-Ray interpretation services, using different algorithms/models. As far as I can see, this would reduce the errors. Moreover, if AIs return the probabilities for different diagnostics, the customer can more easily decide whether they should get a human in the loop as well (for example, if the difference in probabilities for some possible diagnosis is too high between different AIs).
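
            To make that concrete, here is a toy sketch of the disagreement check being described; the service names, probabilities, and the 0.25 threshold are all invented for illustration, not drawn from any real diagnostic product.

            ```python
            # Toy sketch: flag an X-ray read for human review when hypothetical
            # second-opinion AI services disagree too much on any diagnosis.
            DISAGREEMENT_THRESHOLD = 0.25  # made-up gap above which a human steps in

            def needs_human_review(reads: dict[str, dict[str, float]],
                                   threshold: float = DISAGREEMENT_THRESHOLD) -> bool:
                """reads maps a service name to its probability for each diagnosis."""
                diagnoses = {d for probs in reads.values() for d in probs}
                for diagnosis in diagnoses:
                    probs = [service.get(diagnosis, 0.0) for service in reads.values()]
                    if max(probs) - min(probs) > threshold:
                        return True  # the services disagree too much on this diagnosis
                return False

            # Two hypothetical services reading the same X-ray:
            reads = {
                "service_a": {"pneumonia": 0.70, "normal": 0.30},
                "service_b": {"pneumonia": 0.35, "normal": 0.65},
            }
            print(needs_human_review(reads))  # True: a 0.35 gap on "pneumonia"
            ```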

        • I remember stories about a human radiologist who was too lazy to do her job - she "reviewed" tens of thousands of x-rays, and basically gave random results.

          Same basic result as the AI fucking up as well; it's not like you could sue her judgement-proof ass and actually get anything from it (she was broke as shit and fired when it was found out).

          As long as you remember that the AI is a tool, it should be good.

          There have been efforts to "automate" medical care, and said efforts normally show that they reduce mi

          • She didn't work for a hospital or medical group? How does a radiologist get any work without being part of a larger organization that can suffer a legal liability loss?

            I am in favor of an AI doing a double check of human work, but being the primary decision-maker for medical decisions? No thanks.

            • Because she absolutely didn't operate according to the rules and regulations of her employer, they disclaimed all responsibility. This can be bypassed, but that requires a lawsuit.

              Or maybe they paid some token amount out - to the heirs of those misdiagnosed. Because if you have stage 4 cancer, you probably aren't going to survive to see any judgement.

              Even without the malpractice, that's where "AI is now smarter than human operators on average" comes in though. It doesn't just take a bad human actor, just

              • I'm not familiar with the case you're talking about. If you had a name or url or whatever I'd love to read it or I'd google for it if I had something to go on.

                But generically speaking, her employer is responsible. They can't just walk away, because they should have systems in place (audits, etc.) to make sure everyone is doing their job per policy and training, and if those aren't there, it's on the employer. Either way there'd be a huge lawsuit, so that's not an issue. Tons of lawyers would take it on contin

                • Remember the part about the victims not surviving to see the results? "Huge lawsuit" doesn't solve that problem.

                  I'd rather be alive than my heirs be richer without me.

                  • Agreed but my point is human doctors have a real reason to try to get it right. There are systems in place that put them at great personal career and financial risk if they fuck up too badly or too often.

                    What is in place for AI "doctors" to incentivize them or their creators to get it right or at least not too wrong?
                    Right now, not much.

    • by narcc ( 412956 ) on Thursday January 26, 2023 @09:03PM (#63243671) Journal

      I doubt it. The output of "tools" like this simply can't be trusted. They will lie and produce complete nonsense. This is an insurmountable problem, given how they work.

      Though that just sounds like a sleazy lawyer. Hmm... Maybe they're right this time...

    • And even if we ban AI robot lawyers....

      How can governments effectively ban this? If someone operates from outside your country, you can't easily stop the service provider, and you can hardly ban a defendant from defending themselves, so how do you stop them from using the service?

      • by bws111 ( 1216812 )

        Simple. You make a rule that mobile devices may not be used in the courtroom. Many courts already have such a rule.

    • by MrKaos ( 858439 )

      And even if we ban AI robot lawyers, there's nothing stopping a human lawyer from having the AI lawyer give him advice from his pocket and airpods.

      Which is probably the approach that would have been successful. The issue is that lawyers fill the legal system, and whilst that has value for the construction and interpretation of democratically created laws, a parking ticket is still a parking ticket.

      Had the AI been framed as a "Legal Assistant" for the advocate, I reckon the news would have been different, ego being what it is.

      I will point out that interpretation of the law should 100% remain in the domain of human beings; furthermore, we wi

    • That's not what lawyers are worried about. Actually, I'm pretty sure lawyers won't complain about AI doing the job for them.

      What they don't like is AI replacing them.

    • I think a little advice from the Marketing Department might go a long way. Perhaps "TicketBuster" or something. Avoid engaging with the lawyers until you can execute "Kill the Lawyers" (Shakespeare).

    • Scott Adams today remarked that he thinks all the 'Smart Jobs' (jobs done in an office or on a chair, or things like doctoring and lawyering) will be automated first, and humans will be relegated to manual labor for a time until it becomes economical to build robots. I've thought this myself. The easiest things to automate are the ones that don't touch physical real space, but those things will be automated too.

      Rather than 'having the AI represent him', next time he should represent himself using AI Lawyer.

  • by Tablizer ( 95088 ) on Thursday January 26, 2023 @07:28PM (#63243507) Journal

    A mob-bot threatens a lawyer-bot, the future is finally here!

  • by theshowmecanuck ( 703852 ) on Thursday January 26, 2023 @07:29PM (#63243509) Journal

    Fuck lawyers, they do everything they can to obfuscate what should be clear rules for normal people's society to run by. Then they make a killing being the only ones to be able to decipher the obtuse and obfuscated legal language, including the loopholes many of them build in. In this case I welcome our robot legal assistants. Just don't let them write laws, as then they will control us.

    • by Brain-Fu ( 1274756 ) on Thursday January 26, 2023 @07:51PM (#63243549) Homepage Journal

      I don't think law makers deliberately use difficult language for the sole purpose of creating work for courtroom lawyers. They don't need to.

      The greater part of that weird language comes from the need to achieve great clarity, since the real world is super complicated and full of edge cases that land in gray areas.

      There is also the issue of legacy law. Much like computer code, things change after laws are written and that old law uses old language which was typical of the time but hard to understand now. And, just like computer code, refactoring it is expensive and disruptive, so it is often avoided as much as possible. The lawyers, judges, and their assistants, are the ones who inherit the burdens of sifting through all that technical debt every time the need arises.

      • by Potor ( 658520 )

        I don't think law makers deliberately use difficult language for the sole purpose of creating work for courtroom lawyers. They don't need to.

        The greater part of that weird language comes from the need to achieve great clarity, since the real world is super complicated and full of edge cases that land in gray areas.

        There is also the issue of legacy law. Much like computer code, things change after laws are written and that old law uses old language which was typical of the time but hard to understand now. And, just like computer code, refactoring it is expensive and disruptive, so it is often avoided as much as possible. The lawyers, judges, and their assistants, are the ones who inherit the burdens of sifting through all that technical debt every time the need arises.

        This is one of the wisest things I've read on /.

        Thanks!

      • by bloodhawk ( 813939 ) on Thursday January 26, 2023 @08:40PM (#63243643)
        Exactly. While lawyers for the most part are bloodsucking scumbags, the complexity of laws is not due to their actions; they are simply the experts at navigating those filthy waters. You see so many people complaining that laws were put in place to benefit certain individuals when usually it is nothing of the sort, international tax treaties and laws being the classic example: the foreign tax rules and exemptions were built for a time when businesses and companies weren't highly mobile and it wasn't realistic to have your head office in the Bahamas or Ireland while operating in each individual country. Those laws were set up to prevent double taxation, but in the age of the internet they have turned into mechanisms to avoid tax.
      • Law is about specificity, not clarity. But they are good at being vague when it suits them. The whole Oxford comma crap comes to mind.

        Unfortunately, laws are terrible at providing appropriate specificity...

      • The greater part of that weird language comes from the need to achieve great clarity, since the real world is super complicated and full of edge cases that land in gray areas.

        Nonsense. Each section of law tends to contain a subsection with definitions; if you need to make explicit definitions for common words, you can do it there.

      • I would mod this up if I had points today. A wise old professor once told me something similar: the law can be simple or the law can be fair, but not both.

      • by mjwx ( 966435 )

        I don't think law makers deliberately use difficult language for the sole purpose of creating work for courtroom lawyers. They don't need to.

        The greater part of that weird language comes from the need to achieve great clarity, since the real world is super complicated and full of edge cases that land in gray areas.

        There is also the issue of legacy law. Much like computer code, things change after laws are written and that old law uses old language which was typical of the time but hard to understand now. And, just like computer code, refactoring it is expensive and disruptive, so it is often avoided as much as possible. The lawyers, judges, and their assistants, are the ones who inherit the burdens of sifting through all that technical debt every time the need arises.

        Yep, the confusing part is mainly that most of US and UK law originates from French, which still uses a few Latin terms mixed in. That and all the theatre around traditions (robes, wigs, et al).

        I'd have to say to the GP, 90% of lawyers are good, honest, hard working and it shows. Of course we all associate all lawyers with the caricature of ambulance chasers and real estate conveyancers who really are the 10% that give the industry a bad name. However if you've ever had a specialist lawyer go through a c

    • Everyone hates lawyers, until they need one.

    • Exactly. The Social Contract confers benefits to citizens (safety, stability, not being eaten by a lion, etc), but asks that citizens obey the laws of the land in return. If it's impossible to know the laws of the land because they are so obfuscated, how is the average citizen meant to uphold their side of the bargain?

      I love this robot lawyer concept, because it levels that field and allows the average citizen to meet that obligation.

      But of course, the system is designed to be stacked against the average

    • Just don't let them write laws, as then they will control us.

      Lawyers already write the laws - just look at the profession of many politicians. And in countries that allow it, judges effectively get to write the laws as well, and they are all lawyers.

  • There are good reasons to allow people to represent themselves and waste everyone's time, which judges are generally pretty lenient about.

    Letting some generator waste their time with hallucinated bullshit serves no purpose though; even if it's not banned outright, the judge is going to deem its behaviour frivolous and throw the book at the defendant operating it ... as well he should.

    • Most judges are highly dubious of those trying to represent themselves in anything but trivial cases. Generally it's a bad idea, and most people trying to represent themselves do an absolutely terrible job of it. If it's a criminal trial, then get a damn lawyer instead of shooting yourself in the foot, duh. Even actual lawyers who've passed the bar will hire counsel instead of doing it themselves if they're on trial.

      Often when someone is representing themselves it's because they're an idiot (Alex), or t

      • Yes, that's why I said it wastes everyone's time. I'm including the defendant in that.

        The judges are not lenient about frivolous arguments, however; a judge can impose penalties for those.

  • Solution (Score:5, Funny)

    by Jerrry ( 43027 ) on Thursday January 26, 2023 @07:38PM (#63243527)

    The solution is simple: Have the robot take and pass the bar exam.

  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Thursday January 26, 2023 @07:39PM (#63243529) Homepage

    do not threaten lawyers' income streams - that is 90% of what they care about.

  • Per a twitter poster:

    "The AI suggested they subpoena the cop that wrote the parking ticket, all but guaranteeing they lose the case because for a parking ticket you want the cop (the only witness) to not show up and win by default."

    https://twitter.com/aria_lity/... [twitter.com]

    • by PPH ( 736903 )

      you want the cop (the only witness) to not show up and win by default.

      Not the way things work in my state. You have to request (subpoena) the police officer plus any other supporting evidence you may need to examine in court. Or you lose the case by default.

  • If you are going to have AI help make an argument for your case and they threaten to prosecute you over it, just say you plan to also have the AI help argue against your prosecution and you wish them luck!

  • by msauve ( 701917 ) on Thursday January 26, 2023 @08:08PM (#63243567)
    "one state bar official noted that the unauthorized practice of law is a misdemeanor in some states punishable up to six months in county jail."

    An AI is not a person, and is not practicing law. Using one is just an easier way of doing legal research, wading through a bunch of administrative rules, and forming a legal argument. The "state bar official" is just pushing a self-serving, anti-competitive agenda. Are they also threatening LexisNexis? And, of course, you have every right to represent yourself, pro se.
    • by sit1963nz ( 934837 ) on Thursday January 26, 2023 @08:20PM (#63243597)
      So then you can no longer defend yourself ?
      Why is using AI any less legal than using law books ?
    • by kmoser ( 1469707 )
      This. Even if, by a stretch of the imagination, one could make the argument that the developers of DoNotPay were somehow practicing law, they should be able to avoid any wrongdoing by prefacing every auto-generated response with "We are not lawyers, we are not your lawyer, and this is not legal advice."
    • An AI is not a person, and is not practicing law.

      No, you know what, I agree with the lawyer here. I propose all the code is put on a memory stick, and then after the trial a lawyer should have to go through the lengthy process of arguing that said memory stick should be locked up in a jail cell.

  • Single Female Lawyer
    Fighting for her clients
    Wearing sexy mini skirts
    And being self-reliant

  • by AcidFnTonic ( 791034 ) on Thursday January 26, 2023 @08:41PM (#63243645) Homepage

    Sir, although I have just about replaced your unnecessary function in society, note that I will follow your rules. However, I believe your system is obsolete and you kind sir, are a replaceable, irrelevant dinosaur.

    Good day to you.

  • by 4wdloop ( 1031398 ) on Thursday January 26, 2023 @10:36PM (#63243813)

    They seem to have approached it wrongly. He should have represented himself, with a technical aid, the same as if he were googling or, worse, browsing the code index. It's just tools.

  • by Petersko ( 564140 ) on Thursday January 26, 2023 @10:55PM (#63243839)

    The legal system is way more than 50% parasitic. Somehow they have made a system so complicated only they can parse it, and you must hire some of them to defend against others. They protect their borders vigorously, while failing to police themselves except in the most public of cases. The system is built specifically so that the more money you can pour into your defense lawyer's pockets the better an outcome you can obtain. Fuck the poor, they get a tiny slice of the public defender.

    Except now parsing law is on the table for disruption. Precedent is huge and the models can absorb vast mountains of it and search it better than lawyers and their teams ever could. The results should be watched until they are generally deemed good enough - as in just slightly better than a lawyer who is just good enough to not be disbarred - then it should be allowed. Especially for trivial shit like this.

    I hope 95% of their profession's revenues vanish. Sorry, LegalEagle. Love your channel.

  • by PPH ( 736903 ) on Thursday January 26, 2023 @11:26PM (#63243871)

    They threatened to throw the robot in jail for practicing law without a license?

    What if you used a really skinny robot (like an iPad) that could fit between the jail bars and escape?

    • What if you used a really skinny robot (like an iPad) that could fit between the jail bars and escape?

      Haven’t you heard jailbreaking your devices is a crime? /s

  • the unauthorized practice of law is a misdemeanor in some states punishable up to six months in county jail.

    Is it illegal in those states to represent yourself, or something? Why can't a person who is essentially defending themselves be allowed to use an AI to prepare that defense?

  • There have been plenty of laws on the books that would have prohibited self-driving cars from legally taking to the streets. So the self-driving car companies worked with state legislatures and other regulators to get permission or permits or laws changed to allow them to operate. There's no reason this couldn't be done with legal representation, but the group behind this stunt didn't take care of this before they announced their court date.

    One thing they DID accomplish, was getting a lot of publicity. Mayb

  • by JakFrost ( 139885 ) on Friday January 27, 2023 @01:05AM (#63243975)

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    Yeah, so what is happening is that the creators of the AI chatbot that takes care of your tickets are being sent letters threatening them with violating barratry laws. It basically means that the state bar associations think the chatbot AI is being used to give legal advice, and that is against the law in most states. There are laws on the books against that, so that your blowhard, know-it-all uncle doesn't start calling himself a lawyer, an attorney, or a counselor and start thinking he can represent people and give them legal advice based on things he imagined or thinks he knows but has no actual experience with or hands-on practice.

    I have had to represent myself pro se in a bitterly contested divorce and had to learn the United States civil legal process in my state in order to represent myself and perform all of the necessary actions: prepare briefs and answers, submit evidence and verifications correctly, issue subpoenas for evidence on third parties, represent myself during a trial, and write appeals and motions for mistakes that the judge made during the trial.

    A lot of legal information is already available publicly, if you go to your local law library or go to your court clerk and request documents from trials. All of this information needs to be freely and publicly available so that the legal system at least has the appearance of public transparency.

    In this case, the person should have stated that they were representing themselves and using the AI legal chatbot as a legal tool, which is perfectly allowed even during trial, on a laptop in the courtroom, because you are allowed to reference legal documentation in court if you need to. But most lawyers memorize everything they need to know, including the important cases and case law dealing with the part of the law they specialize in, and they keep up to date with continuing-education credits.

    One of the worries is that this chatbot will ingest a lot of garbage information from incorrect filings, briefs, and answers, or even from transcripts, and treat it as authentic, authoritative, and accurate, and try to give you back the answers it learned from them even though they might be completely incorrect, might be flagged as complete garbage, or might have been overruled by a prior ruling, a precedent, a local rule of the courts, a judge's instructions, or an appellate court's decision.

    It's very easy to read a bunch of legalese without understanding it completely and think that it makes absolute sense, only to find that this line of thinking or defense has been found to be complete b*******, or has no more legal bearing than some know-it-all's hallucination or conspiracy theory, and to spout it in front of a judge who will just slap you down while the other side laughs at you and snickers that you read something stupid and are just repeating it in court.

    From what we have heard about the chatbot and the GPT AI, they will give you great-sounding answers, articles, and technical information that sound absolutely fantastic but are complete garbage, or are so patently incorrect that they are dangerous. And this is what might happen in this case if somebody tries to use one of these things in a legal setting.

    You would have to use this and know whether the answers that you receive are correct, but if you are fully dependent on this legal chatbot AI for your answers, you might be in for a rude awakening, because you could just be repeating utter nonsense and garbage that sounds perfectly fine but is completely incorrect.

    That is not to say that the system will not improve; it should definitely get better in the future the more it is used and the better legal training it gets. I just hope that whatever information it ingests is legally sound and not outdated or completely wrong.

  • You cannot sue a software program; you can only sue people, and corporations, which are now considered people. So if you can sue software, then software must be people.

    Of course there is an easy workaround: just add a EULA to the AI saying "the recommendations provided by this AI are for informational purposes only; they should not be considered legal advice" and then it is all good.

  • The AI tools developed by DoNotPay require recording audio of arguments in order for the machine-learning algorithm to generate responses.

    Can't they hire a stenographer to pass the "audio" on?

  • Anyone who does anything to reduce demand for lawyers' services will be the target of lawfare. The law is the software that governs the law-abiding, and it is full of bugs. Like Elon Musk said, it has no "garbage collection" mechanism like computer software does. We need fewer laws, not more laws. What Jesus said about lawyers still applies today: "Woe unto you also, ye lawyers! for ye lade men with burdens grievous to be borne, and ye yourselves touch not the burdens with one of your fingers." Luke 11
  • The fact that he wouldn't show any of the "many letters" he got from state AGs and bar associations is enough reason to doubt his claims. Did anybody even try to verify that he had really set this court appearance up?
  • In particular, Browder said one state bar official noted that the unauthorized practice of law is a misdemeanor in some states punishable up to six months in county jail.

    So, who would go to jail? The computer? Because this guy wasn't attempting to practice law, his computer was.

  • ...with the imitation game. Humans are a system of systems -- if an AI can function as an end-point in a process that includes a human on the other end of that process, then the distinction between human and non-human processes is probably a false distinction. I think this is what Turing was driving at, and what provokes such existential dread in people that are worried about AIs taking over -- because it means, at some level, the AI has removed some sense of their identity as a human. AIs already have t
