
Who's Responsible For Accidents Caused By Open Source Self-Driving Car Software? (ieee.org)

Here's the problem. "You could download Comma.ai's new open-source Python code from Github, grab the necessary hardware, and follow the company's instructions to add semi-autonomous capabilities to specific Acura and Honda model cars (with more vehicles to follow)," writes IEEE Spectrum. But then who's legally responsible if there's an accident? Long-time Slashdot reader Registered Coward v2 writes: While many legal experts agree OSS is "buyer beware" and that Comma.ai and its CEO George Hotz would not be liable, it's a gray area in the law. The software is released under the MIT OSS license, and the Read Me contains the disclaimer "This is alpha-quality software for research purposes only... You are responsible for complying with local laws and regulations." U.S. federal courts, in a series of cases in the 1990s, ruled that source code is free speech protected under the First Amendment of the U.S. Constitution.

The question is whether that releases the author(s) from liability. The EU has no EU-wide rules on liability in such cases. One open question is whether, even if the person who used the software could not sue, a third party injured by it might be able to, since they are not a party to the license agreement.

An EFF attorney told HotHardware "Prosecutors and plaintiffs often urge courts to disregard traditional First Amendment protections in the case of software." But not everyone agrees. "Most legal experts that spoke with IEEE Spectrum -- and Hotz himself -- believe that if you use the company's code and something goes wrong, then it isn't liable for damages. You are."

  • by chromaexcursion ( 2047080 ) on Saturday January 28, 2017 @06:44PM (#53756381)
    Software liability is not new. Its use in automobiles is.
    If you download, compile (or trust some third-party build), and use it, you are responsible.
    If your self driving car, using open source software, is in an accident, you are responsible. There is no responsible 3rd party.
    This is old law.

    This is the meaning of use at your own risk.
    Accept it, or don't use it.
    • by El Cubano ( 631386 ) on Saturday January 28, 2017 @07:06PM (#53756459)

      I have to agree here.

      If I buy a part for my car and the part's manufacturer claims that it complies with some ASE or similar standard, then if the part fails I might have a legal case (e.g., if I can prove negligence in the design or manufacture, or something that the established case law will respect). However, if I buy a part off a guy who makes them in his tool shed and he tells me "hey, I'm not sure that this thing won't explode when you apply the brake," then I am pretty sure I have no recourse whatsoever.

      How is software different? If the manufacturer warrants it, then it should work as represented, and if it fails then there is a discussion to be had. If the manufacturer disclaims warranty and it breaks (and the applicable laws don't override that; you know that in some jurisdictions there are laws that still hold the maker or seller responsible, to a degree, for things they make or sell?), then I don't have a legal case.

      Of course, even professionally produced commercial software normally has a EULA with a clause that reads something like "the manufacturer provides no warranty of merchantability or fitness for a particular purpose and shall not be liable for losses arising from blah, blah, blah..." If you are NASA and paying your contractors (an enormous amount of money) to mathematically prove their software correct then you might get a "yup, we certify that this software will work as designed," otherwise you have no such assurance.

      • While you make some good points, a counterargument could be that since it is alpha software, and they released it knowing that and even potentially admitted it may be flawed, they were negligent in releasing it. IIRC, you can't disclaim negligence, although many agreements try to do that; the question here is not whether the owner of the vehicle would be liable, but whether the software developer could also be liable.
      • if I buy a part off a guy who makes them in his tool shed and he tells me "hey, I'm not sure that this thing won't explode when you apply the brake," then I am pretty sure I have no recourse whatsoever.

        There are limits to this: if the seller in question made the brake pads out of plastic explosive, I'm pretty sure the police will soon be knocking on his door. Putting a disclaimer on things does not magically allow you to get away with anything, and some jurisdictions, like the EU, have mandatory minimum guarantees.

        However, in this case the software is not sold but given away, so there is no sale, which probably keeps the author protected unless they put something deliberately bad in the code.

    • by Anonymous Coward

      Parent is 100% wrong. The reason you are typically responsible for open source software is that the license comes with a disclaimer of warranty and of liability. For example, the GPL [gnu.org] license says:

      15. Disclaimer of Warranty.

      THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

      16. Limitation of Liability.

      IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

      This does not prevent a company from selling you a warranty for a fee; e.g., enterprise software customers get a warranty because they pay for it. If you're buying a self-driving car, insist on a warranty.

      tl;dr: Parent is wrong; do not buy a self-driving car that doesn't come with a warranty.

      • So what you are saying is, al-Qaeda and ISIS simply have to open source their bomb-making recipes and release them under the GPL, and then they will be free of liability. Right?
        • So what you are saying is, al-Qaeda and ISIS simply have to open source their bomb-making recipes and release them under the GPL, and then they will be free of liability. Right?

          Nah, they don't even need to do that. Publishing is not a problem in most of the world. Even building and possessing them isn't a problem if you follow the law and basic safety rules. People and companies use explosives that could be used as bombs all the time.

          If you're talking about publication only, The Anarchist Cookbook, first published back in 1971, describes how to make all kinds of bombs, explosives, and poisons, as well as assorted drugs like LSD. It is still in print, and pirated versions are available.

    • The letter of the law is sort of unimportant. If you thought Ford was going to be responsible, then it is not your understanding of the law I would call into question first, but your basic common sense. A car would practically have to cost twice as much if the manufacturer were responsible for all accidents and deaths it caused.

    • The license here is no different from pretty much every other open source license. It's open source: you have the freedom to do whatever you like with it, but you are the one who assumes responsibility for it too.

      Copyright (c) 2016, Comma.ai, Inc.

      Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: ...

  • Who is going to be responsible for it, open source or not?
    • Whoever installs the software is going to be on the hook.

    • by guruevi ( 827432 )

      The installer/maintainer (whoever you paid to 'make it work') is on the hook; the source code license doesn't really matter in these kinds of things. Just because I make the code to a pacemaker open source and it fails doesn't mean that I'm not liable. That would be a very easy way to avoid liability, though.

      • You do realise in this context installer and maintainer are two very, very, very different things. Right?
        This whole article is retarded.
        Of course it is the person who decided to let it control the car. Black and white. No gray area at all.
        In exactly the same way they would be responsible if they decided to fit cheese for brake pads, or tint their windshield with house paint.
        The disclaimer doesn't even matter. The fact is there is no CLAIM that it is suitable for normal road use.

        • by guruevi ( 827432 )

          Not necessarily the person that wanted it, but the person that ended up installing it in contravention of rules and safety. I meant the installer that mechanically connects the drive shaft and the brakes to this machine. The maintainer of the software has nothing to do with it in this instance; he's just putting his science project on the net, unless he sells it (and/or gives it away) with some sort of guarantee that it will work.

          Once you have a company and you sell it to do some sort of function and it ends up failing, then you're the one on the hook.

      • Really? Because I hear people saying things like "no software is perfect" and "this only needs to be better than an average human." So when it is not superhuman and someone dies, will people just make the excuse that they are saving lives overall? What if the open source software is proven to be safer than an average human? Then the person using the software should be able to make the same excuse that the commercial automakers are... that it is supposed to be safer on average, so a few deaths don't matter.
        • by guruevi ( 827432 )

          With the license attached to it, the software maker has given no guarantee that this will work for any purpose. Most open source licenses don't guarantee the product to work. This is where proper engineering comes into play (including mechanical, etc.); if there is someone (a company, perhaps) that wants to commercialize it, then they'll be on the hook for any faults, problems, testing, licensing.

          Obviously at some point failures will happen, just like current cars fail; who is on the hook depends on whose fault it is.

        • My dad is getting old, and his driving skills are slowly getting worse, to the point where he's more liable to cause an accident. Suppose he needs to go on a long drive, and I offer to take the wheel so he can relax. Now, imagine that I cause a deadly accident, despite being a better driver than my dad. Nobody's saying that this death doesn't matter, or that I would be making excuses for driving the car instead of my dad. Basically, we'd all agree it was a good decision, but shit sometimes happens even if you make the right call.
  • by Anonymous Coward
    Make the system integrator responsible, like when you contract a home improvement store to add a window onto your house and it leaks. While the home improvement store just hired an independent contractor (Joe in his truck home services and window installation), you as a homeowner have a contract with the store and not with Joe. It's the store's responsibility to make sure the job is done correctly.

    Sure, the store may go after Joe the installer, but that is their problem, not yours; the responsibility stays with the store.

  • If you drive a car over a carload of nuns, you're liable. What's new here?

  • If you package something for sale, you are responsible. Here's a box filled with stuff for $49.99; oops, the box exploded, once or ten times. The boxing vendor is responsible, regardless of the interaction that caused it. There is no specific law, but tort law generally plays out that way. A separate boxer? Not that guy's problem, but he always has to go to court to be found not at fault. Nothing here about software liability applies except between the packaging vendor and the development team. Will we see this play out in court?

  • Whoever the courts/governments decide is responsible, on a case by case basis, of course!

  • If someone publishes a bomb-making recipe, would that person be free of liability? You can't simply slap EULAs and license agreements on something to dodge liability. If the bomb recipe you provide, or the software you provide, enables a person to do something that they would not have been able to do otherwise, are you completely free of liability?
  • This begs the question (assumes it without examining it): that when a NON-open-source software program is involved in an accident, the responsibility is the manufacturer's.

    That is not true according to current cases dealing specifically with Tesla.

    A better question isn't "Hey, if an open-source independent vehicle software program causes a crash, who's responsible?" but rather "Who is responsible when software causes a crash?" or, better yet, "How can people be responsible for their own behavior even if relying on software?"

    • That is not true according to current cases dealing specifically with Tesla.

      Considering it triggered a federal investigation that cleared Tesla, I would say the exact opposite: the current case established that the non-open-source vendor was potentially liable, depending on the error.

      In the Tesla case, the only reason they were let off was because Tesla's legal team required numerous liability warnings that are enforced by the software, and the situation was sufficiently challenging that it would be unreasonable to expect the software to handle the edge case.

      All of this, though, is pointless speculation until an actual case is decided.

    • The more interesting and practical question is: in case of an accident, who's going to pay out? And the answer is simple: the owner's insurance company. As long as the software works better than the human driver, the insurance company will be happy to insure the car and the software.
  • by Anonymous Coward

    God is responsible, die suckers.

  • by Rick Schumann ( 4662797 ) on Saturday January 28, 2017 @10:51PM (#53757143) Journal
    This is one of my arguments about so-called 'self-driving cars': the manufacturer is technically driving it if it's in self-driving mode, but just like so many of them, they'll dodge responsibility one way or the other if someone gets killed, and in the end there'll be no justice at all. It'll get tied up in the civil court system for years, or in some sort of arbitration, and in the end you'll either get nothing or some paltry chunk of change, and meanwhile a person is dead, all because of some shitty technology that was rushed to market and doesn't even have the cognitive ability of a smart dog. These so-called 'autonomous/self-driving cars' are not going to be the panacea that some of you think they're going to be; we're just going to trade human error behind the wheel for human error in the development division of an auto manufacturer, or human error in the marketing department for rushing it to market before it was really safe to do so. All I can say is I'm not going to be the one strapped into a car seat with no controls whatsoever to stop the gods-be-damned thing when it goes haywire and kills me, and I have nothing but feelings of horror for whoever does, and deep sympathy for whoever the poor bugger leaves behind them.

    Oh, and by the way? The way I think this should be handled, legally speaking, since it's obvious we'll be subjected to these gods-be-damned things regardless, is the same way aircraft mechanics are treated legally: if a plane they worked on crashes and people die, and the cause of the crash is proven to be a mechanical failure that's the responsibility of the mechanic who worked on it, he is arrested and tried for murder. A so-called 'self-driving car' kills someone? The programmer(s) responsible for not doing their job correctly get thrown in jail, charged with murder. Oh, and before any of you give me shit for this? You all make a big point about how 'self-driving cars will save lives'; well, if they TAKE a life then some HUMAN has to be held criminally responsible for it, plain and simple. Otherwise you're just hypocrites.
    • As long as there are fewer people killed by autonomous cars than by human drivers, it's a win.
      • Looking at all the comments following this, you are apparently wrong. If fewer people are killed, then, for you to be wrong, it must be a loss. Clearly, then, according to the reasoning of Slashdot, we should strive for larger numbers of fatal traffic accidents.

  • In this particular case, since it is described as "alpha quality," the owner and/or person who installed the software is liable. The question should be, "If there were OSS of stable quality, properly installed on a compatible vehicle in excellent condition (so that neither the vehicle nor the installation procedure can be considered 'at fault'), who would be considered liable in an accident that can't be blamed on local conditions (weather, etc.) or the other party?"
  • by johannesg ( 664142 ) on Sunday January 29, 2017 @01:48AM (#53757667)

    The owner of the vehicle is responsible for ensuring his vehicle is safe to use. If he modifies it by installing some untested software, he is most certainly responsible for the consequences if he then injures someone. And if the disclaimers are clear enough, his chances of successfully suing the software developer are slim.

    The sad thing is that, as with the Tesla, you know some idiots will install this and go on to kill people - hopefully just themselves. After that, I wouldn't be surprised to see the software developer sued on the claim that the warnings on the software were just not clear enough. Or even the car manufacturer, for allowing the vehicle to be so modified in the first place...

  • Many software licenses include language like this:

    "The software is provided "As is", without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement. "

    This is from the MIT license; I'm sure other open-source licenses have a similar clause. It basically tells you that using this software is at your own risk.
  • The principle is that the state is responsible for ensuring the safety of the vehicle. So if a vehicle has been certified safe by the authorities, they take responsibility for its safety. On the other hand, if there are no regulations regarding software used in cars, or that software is proprietary, then the manufacturer needs to take responsibility.

    In the case of open source, there is definitely a possibility that the owner might be responsible. But American law is funny, so it could go both ways.

  • It is about who declares that they take responsibility.

    Someone (e.g., an auto company) can create open software, release a signed version, and take the responsibility (to a defined degree). If they don't, nobody actually needs the software, as you cannot use it without driving yourself, or your car is driving illegally.

    If you now use this software, the liability is clear. When you start modifying it, it becomes interesting, because either you have the full liability for the self-driving part yourself (which probably means nobody will take that risk).

  • "who's legally responsible if there's an accident?"

    Nobody is liable, going by the Microsoft Windows 10 EULA [microsoft.com]. Indeed, the license specifically bars you from suing them in a court of law, and even then you can only get back what you paid for the software, or $50.

    "One open question is even if the person who used the software could not sue, a third party injured by it might be able to since they are not a party to the license agreement."

    The third party can't sue the first party precisely because the first party made no claims or promises to them.
  • by PPH ( 736903 )

    The AI left plenty of white space. But other drivers kept pulling into it.
