Transportation Crime Software Technology

Arizona Prosecutor Says Uber Not Criminally Liable In Fatal Self-Driving Crash (reuters.com) 190

Uber is not criminally liable in a March 2018 crash in Tempe, Arizona, in which one of the company's self-driving cars struck and killed a pedestrian, prosecutors said on Tuesday. "The Yavapai County Attorney said in a letter made public that there was 'no basis for criminal liability' for Uber, but that the back-up driver, Rafaela Vasquez, should be referred to the Tempe police for additional investigation," reports Reuters. From the report: Vasquez, the Uber back-up driver, could face charges of vehicular manslaughter, according to a police report in June. Vasquez has not previously commented and could not immediately be reached on Tuesday. Based on a video taken inside the car, records collected from online entertainment streaming service Hulu and other evidence, police said last year that Vasquez was looking down and streaming an episode of the television show "The Voice" on a phone until about the time of the crash. The driver looked up a half-second before hitting Elaine Herzberg, 49, who died from her injuries. Police called the incident "entirely avoidable."

The Yavapai County Attorney's Office, which examined the case at the request of Maricopa County, where the accident occurred, did not explain the reasoning for not finding criminal liability against Uber. Yavapai sent the case back to Maricopa, calling for further expert analysis of the video to determine what the driver should have seen that night. The National Transportation Safety Board and National Highway Traffic Safety Administration are still investigating.

  • by nickmalthus ( 972450 ) on Tuesday March 05, 2019 @11:46PM (#58223186)
    Arizona wants Uber investment dollars, so they would gladly scapegoat an Uber employee while giving the company a mulligan. Reminds me of the Arab Bank Supreme Court decision last year, where the court dismissed a lawsuit against a foreign company that funded terrorism because the precedent would be bad for business. The legal partiality for the corporate person over the individual is becoming more and more apparent.
    • Scapegoat? (Score:5, Insightful)

      by virtig01 ( 414328 ) on Tuesday March 05, 2019 @11:56PM (#58223210)

      Human drivers are liable for breaking traffic laws when using company-owned vehicles. This isn't new. Just because this vehicle was testing autonomous driving doesn't mean the human sitting inside is exempt from liability. She had one job while sitting in that car, and it wasn't watching videos on her phone.

      Secondly, it was found that Uber isn't criminally liable; they could still be hit with a civil suit.

      • Re: (Score:2, Interesting)

        by Anonymous Coward
      • by Ly4 ( 2353328 )

        The family settled with Uber a couple of weeks after the crash - a civil suit is unlikely:
        https://www.azcentral.com/stor... [azcentral.com]

      • Human drivers are liable for breaking traffic laws when using company-owned vehicles.

        Sure, my dad had a company-owned vehicle and he was responsible for speeding tickets in it. But you know what? Not once did his company tell him how to drive or take the steering wheel or accelerator from his control. It changes the game drastically when that happens.

      • I agree - IMHO, a 'backup driver' should be able to see what the AI is 'thinking'. That is, it should be saying "road clear ahead" at the time of the crash. If it were, and the driver took no action (which she didn't), then she's liable. Knowing Uber's somewhat slap-dash approach to rules, I'll bet the car had no readout that was saying what the AI was up to at the time of the crash though.

        If the car's just trundling along, then there's no reason for the human to do much about it - after all, it might swerve ...

      • by dougmc ( 70836 )

        Secondly, it was found that Uber isn't criminally liable; they could still be hit with a civil suit.

        Honestly, minus some aggravating factor such as driving drunk, drag racing or being involved in a high-speed chase with the police, drivers are almost never charged criminally for motor vehicle crashes, even when somebody dies.

        At most they will get a traffic ticket -- and even that is kind of iffy, as the police will often refrain from giving a ticket until their investigation is over, and once the investigation is over they often don't follow through and give out those tickets.

        So ... this result is expected.

      Arizona wants Uber investment dollars, so they would gladly scapegoat an Uber employee while giving the company a mulligan.

      To be clear: are you saying that you, or anyone else in their right mind, would not scapegoat someone being paid for a task who instead decided to show up to work, kick back, and watch TV? All the while actually causing your company to operate outside its legally approved framework, for which the person was employed in the first place?

      That doesn't sound like a scapegoat to me; that sounds like blaming someone who utterly failed to do their job.

  • The back-up driver needs to get the source code and logs if this goes to CRIMINAL COURT. And if they try any of the NDA / EULA BS, then YOU MUST ACQUIT!

  • by Joe_Dragon ( 2206452 ) on Wednesday March 06, 2019 @12:20AM (#58223268)

    Airline pilots' errors do not have criminal proceedings most of the time. So sticking this on an under-trained backup driver, with poor systems in place plus a really bad video that looks like it was made to make Uber look good, is very bad.

    • by c6gunner ( 950153 ) on Wednesday March 06, 2019 @07:15AM (#58224156) Homepage

      When airline pilots make an error worth prosecuting, there is generally not enough left of them to identify, let alone prosecute.

      Also there is a fucking world of difference between "made an error" and "decided to watch a TV show instead of working".

    • by thegarbz ( 1787294 ) on Wednesday March 06, 2019 @07:32AM (#58224212)

      Airline pilots' errors do not have criminal proceedings most of the time.

      Only because airline pilots often die from their mistakes. And calling this "pilot error" is disingenuous. Sorry, wrong word. Err. No, it's absolutely fucking stupid. The person didn't make an error; they weren't even remotely attempting to do the job they were being paid to do, all while operating a motor vehicle illegally.

      Even if Uber was 100% liable the driver should still be charged with manslaughter for their actions.

    • by mjwx ( 966435 )

      Airline pilots' errors do not have criminal proceedings most of the time.

      That's because most of the time they don't kill or injure people.

      Same as most driver errors do not have criminal proceedings. For mistakes not including a fatality or serious injury, you're dealing with insurance, not the cops.

      When people die due to pilot error, the pilot is definitely charged... It's rare because if the pilot is making an error big enough to kill a passenger, the pilot is usually killed as well.

    • by Pyramid ( 57001 )

      Pilots bear SOLE responsibility for the operation of their aircraft, even under autopilot, etc. Up to and including failures that should have been caught by pre-flight inspection, or following ATC instructions that are dangerous. The Pilot In Command is responsible for the safety of the flight and is authorized, and expected, to verify the condition of their aircraft and deviate from instructions that are potentially dangerous.

      The PIC is essentially GOD of that aircraft and bears all the responsibility.

    • Not always true.

      A Singapore Airlines pilot took off on a runway undergoing maintenance. Bulldozers and cranes were on that runway, 2 or 3 km from the starting point. The plane collided with them. They prosecuted him for negligence.

      France, too, has prosecuted some pilots for disobeying safety rules.

      Surprisingly, in the USA, the land of liability litigation, the FAA and NTSB have somehow managed to give immunity to the pilots, as long as they don't cover up anything. As long as every mistake, every violation ...

  • When they become conscious, we can sue them :)
  • by DrTJ ( 4014489 ) on Wednesday March 06, 2019 @02:57AM (#58223602)

    Although the driver definitely deserves criticism for her actions, this ruling is very convenient for Uber.

    This means that they can relax significantly when it comes to the safety of the testing vehicles; if an accident happens, the courts say it's the driver's fault. Uber can sit back and blame the driver. Very handy.

    The court should take into account the hazards posed to the public; if a company tests out new driving technology on public roads, where the sole purpose is to disconnect the driver from the driving task, the hazards arising from the driver actually becoming disconnected should be taken into account. In particular, the automotive safety standard (ISO 26262) explicitly states that 'foreseeable misuse' MUST be taken into account in the hazard analysis.

    Is 'using the phone in the car while driving' a foreseeable misuse? You bet it is. We have all seen it (and done it). NHTSA has emphasized this in a recent report, saying that this is not only applicable for L3 and L4 AD systems, but even for L2 systems: https://www.nhtsa.gov/sites/nh... [nhtsa.gov]

    So, if the hazard analysis identifies this risk, but the company neglects to handle it and someone gets killed as a result, shouldn't it be held liable? I think so. If the company neglects to perform a proper hazard analysis (i.e. not using established, acknowledged standards and best practice), therefore misses hazards, and therefore gets someone killed, shouldn't it be held liable? I think so.

    Ergo; I think Uber needs to be held responsible for this.

    • by AmiMoJo ( 196126 ) on Wednesday March 06, 2019 @04:32AM (#58223790) Homepage Journal

      Uber's liability stems from not monitoring their test drivers. It's unlikely that this was the first time she wasn't paying attention, and they had a camera pointed at her, but apparently no-one was bothering to review the footage.

      They could even use a hands-on-wheel detection or gaze-detection system, like Level 2 cars do.

    • This means that they can relax significantly when it comes to the safety of the testing vehicles; if an accident happens, the courts says it's the driver's fault.

      Err, no, that doesn't follow from this example at all. Uber put a system in place to prevent the accident, and the one person in that system wasn't even remotely attempting to do the job they were paid for. This would be no different in any other industry. If I watched TV instead of doing what I got paid to do and someone died, I would fully expect to be held liable.

      Even if Uber is liable for not installing driver-facing "incompetent job slacking" detection cameras in their cars, the driver is still 100% at fault.

    • You can't put a person in a car for 8 hours with literally nothing to do and expect them to be alert and vigilant. That's just not how our brains work. There's plenty of blame to go around for this woman's death, but Uber cheaping out by putting only one person in the test vehicle gets a good chunk of it. And by doing this on public roads they're putting all of us at risk, and that should be criminal.
  • Liability (Score:4, Insightful)

    by stealth_finger ( 1809752 ) on Wednesday March 06, 2019 @03:42AM (#58223692)
    You programmed it, you built it, you set it up, you put it on the roads. Not your fault though.
    • You programmed it, you built it, you set it up, you put it on the roads. Not your fault though.

      It certainly could have done better but the human that was watching the movie inside the vehicle was supposed to be there to prevent this kind of catastrophe. Whether it is reasonable to expect a person to sit there and do nothing for hours at a time and then act in an emergency is another question that ought to be addressed. But I am sure that Uber would be held civilly liable for this, even if the criminal justice system has decided not to do anything.


      • by sphealey ( 2855 )

        The person who made the decision to disable the base vehicle's collision detection/automatic braking system was supposed to be paying attention to his job. Whether it would be reasonable to expect a person to sit and do nothing while a colossally stupid design decision was made that deliberately put the non-Uber-stockholder public at risk is another question that ought to be addressed.

    • You left out the bit where the person who you paid money to do something then proceeded to kill someone.

      • You left out the bit where the person who you paid money to do something then proceeded to kill someone.

        Who was driving the car? Who disabled the auto stop? Who approved this person to be the safety backup, and what was their training? Who let her in the car with her phone in the first place? It's not like this "driver" took over control and purposefully ran someone over. From the video she looked to be paying about as much attention as any passenger expecting the car to do what it's been doing for however long it's been going, which is less than ideal to be sure, but to say that makes her solely responsible is just ...

        • Who was driving the car?

          The driver. Legally as required by the government during this test.
          The driver. As employed by Uber to oversee the machine under test.

          Who approved this person to be the safety backup and what was their training?

          Irrelevant given the evidence that it wasn't training which was lacking. Or are you suggesting everyone in the world do mandatory "don't watch TV when you're on the clock" training? That's just stupid.

          Who let her in the car with her phone in the first place?

          God did. That's where you're heading with this, right? I mean, by going back through all the irrelevant things that led up to this incident, you'll eventually get to "First there was nothing and then God created the universe," right?

            God did. That's where you're heading with this, right? I mean, by going back through all the irrelevant things that led up to this incident, you'll eventually get to "First there was nothing and then God created the universe," right?

            No, I was heading in the direction of reasonable oversight and good management but you take it where you want.

    • that Uber disabled safety features to make the ride smoother for a Demo to their CEO and then forgot to turn them back on. It was on Ars [arstechnica.com].

      I'm not sure how that affects things, but it probably should have been investigated more.
    • 1) Program, build, set up, and let loose on the roads.

      2) Put a minimum-wage human in the driver's seat to take the fall.

      3) ...

      4) Profit.

  • Drive recorder (Score:3, Interesting)

    by spinitch ( 1033676 ) on Wednesday March 06, 2019 @06:45AM (#58224068)
    Did Uber consider monitoring their backup drivers? With all the snazzy tech for a self-driving car, they could not fathom that low-paid temporary staff might try to pass the time by doing something else besides paying attention? Basic AI could alert Uber when their driver is not paying attention. Seems like this would be a prerequisite before letting a car loose.
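    The monitoring idea above can be sketched as a toy attention-watchdog loop. This is a hypothetical illustration, not Uber's system: the per-frame gaze labels and the 2-second threshold are assumptions, and a real system would derive the labels from a camera-based gaze-estimation model rather than take them as booleans.

```python
# Toy driver-attention monitor: flags when the driver's gaze has been
# off the road for longer than a threshold. Frame-level gaze labels
# would come from a camera-based gaze-estimation model in a real
# system; here they are just booleans. Thresholds are illustrative.

def inattention_alerts(gaze_on_road, fps=10, max_off_seconds=2.0):
    """Return frame indices at which an alert should fire.

    gaze_on_road: iterable of booleans, one per camera frame.
    An alert fires at the first frame where the driver's gaze has
    been off the road for more than max_off_seconds.
    """
    max_off_frames = int(max_off_seconds * fps)
    off_count = 0
    alerts = []
    for i, on_road in enumerate(gaze_on_road):
        if on_road:
            off_count = 0  # gaze returned to the road; reset the timer
        else:
            off_count += 1
            if off_count == max_off_frames + 1:
                alerts.append(i)  # fire once per inattention episode
    return alerts

# Example: 10 fps, driver looks away for 3 seconds starting at frame 10.
frames = [True] * 10 + [False] * 30 + [True] * 10
print(inattention_alerts(frames))  # → [30]: alert 2s into the glance away
```

    Even something this crude, wired to an in-cab chime and a log uploaded for review, would address the "no-one was reviewing the camera" complaint raised above.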
  • A person would have at least lost their license to drive. Just saying. Again, a large corporation gets off scot-free. I look forward to a future where millions of self-driving car passengers with no control over the car get held responsible for them, just like getting held responsible for an accident in a taxi or a bus today. That's going to be awesome.
    • Let's have a case where someone loses their kid in a self-driving car (with no manual control) crash and is held criminally liable and goes to prison.
      In prison they learn how to be a criminal. Fast-forward to them getting out, and after a life of McJobs and no health care they say "I had it better in lock-up," so they hunt down the CEO of the self-driving car corp and KILL his son, SAYING NOW YOU KNOW WHAT IT'S LIKE TO LOSE YOUR KID!!! AND I DON'T GIVE A DAMN, LOCK ME UP, I LIKED IT.

  • I'm pretty sure (but happy to be corrected) that individual states are responsible for administering the rules that govern whether a self-driving vehicle can be permitted on public roads. I'm not sure where that leaves access to Federally-maintained interstates, however.

    But the real question here concerns "fitness to drive". Today, if you want to drive a vehicle on the road, unsupervised, you have to pass a driving test. You have to be able to demonstrate to an examiner that you can control the vehicle safely ...
  • The current laws for these tests require an operator. Ultimately it's their responsibility. This finding sets no precedent for a case that occurs once an attentive driver is no longer a requirement.

    • I compare this to the invention of the washing machine. Back when it was a manual roller, it was your fault if it mangled your clothes. But if your clothes get mangled in an automatic machine and maintenance isn't a factor, then the responsibility is the manufacturer's. The current laws are set up to blame the person who presses the button on the washing machine because the job used to be manual.
  • The old adage: if you think your application is foolproof, set a fool in front of it and they will prove you wrong.
    Same thing applies here.
    I have driven many times in that area at night, and people are idiots when it comes to crossing the street.
    Drunk, not paying attention, not using crosswalks. Arizona has one of the highest pedestrian death rates in the nation because of stupid people.
    So if people are getting run over by humans on a consistent basis, it makes sense that a computer would run someone over ...
    • The old adage: if you think your application is foolproof, set a fool in front of it and they will prove you wrong.

      If that's true, then automated cars shouldn't be at the public testing point yet. They will need to be as reliable as airplanes while doing things on their own that are more complex than what an airplane does. And look at all the rigorous testing and oversight that goes into that: a whole government department with regulations and many eyes watching over a single plane's movements. That's what we will need from self-driving cars.

      The pedestrian was not a fast moving "surprise". The pedestrian was crossing at a highly visible s

  • What an incredibly horrible precedent this sets. Corporations can *literally* get away with murder so long as machines do it.

  • Law enforcement is using a very special net: it will catch all the little fish while letting the big ones escape.
  • Does anyone remember watching the video of this accident?

    If not, you should go back and rewatch it. Here is what I remember...

    The idiot pedestrian walked out in front of an oncoming car in the dark, wearing dark clothing, without looking for oncoming traffic. The speed limit in the area was high enough that a normal person would not be able to stop in time, even if they were paying attention.

    The fault is the pedestrian's. They were stupid and walked in front of a moving vehicle. The driver of the vehicle ...
