Arizona Prosecutor Says Uber Not Criminally Liable In Fatal Self-Driving Crash (reuters.com) 190
Uber is not criminally liable in a March 2018 crash in Tempe, Arizona, in which one of the company's self-driving cars struck and killed a pedestrian, prosecutors said on Tuesday. "The Yavapai County Attorney said in a letter made public that there was 'no basis for criminal liability' for Uber, but that the back-up driver, Rafaela Vasquez, should be referred to the Tempe police for additional investigation," reports Reuters. From the report: Vasquez, the Uber back-up driver, could face charges of vehicular manslaughter, according to a police report in June. Vasquez has not previously commented and could not immediately be reached on Tuesday. Based on a video taken inside the car, records collected from online entertainment streaming service Hulu and other evidence, police said last year that Vasquez was looking down and streaming an episode of the television show "The Voice" on a phone until about the time of the crash. The driver looked up a half-second before hitting Elaine Herzberg, 49, who died from her injuries. Police called the incident "entirely avoidable."
Yavapai County Attorney's Office, which examined the case at the request of Maricopa County where the accident occurred, did not explain the reasoning for not finding criminal liability against Uber. Yavapai sent the case back to Maricopa, calling for further expert analysis of the video to determine what the driver should have seen that night. The National Transportation Safety Board and National Highway Traffic Safety Administration are still investigating.
Corporate Collectivism (Score:5, Insightful)
Scapegoat? (Score:5, Insightful)
Human drivers are liable for breaking traffic laws when using company-owned vehicles. This isn't new. Just because this vehicle was testing autonomous driving doesn't mean the human sitting inside is exempt from liability. She had one job while sitting in that car, and it wasn't watching videos on her phone.
Secondly, it was found that Uber isn't criminally liable; they could still be hit with a civil suit.
Re: (Score:2, Interesting)
Cool story bro, check this out, Uber in fatal crash detected pedestrian but had emergency braking disabled [techcrunch.com]
Re: (Score:2)
The family settled with Uber a couple of weeks after the crash - a civil suit is unlikely:
https://www.azcentral.com/stor... [azcentral.com]
Re: (Score:2)
Human drivers are liable for breaking traffic laws when using company-owned vehicles.
Sure, my dad had a company-owned vehicle and he was responsible for speeding tickets in it. But you know what? Not once did his company tell him how to drive or take the steering wheel or accelerator from his control. It changes the game drastically when that happens.
Re: (Score:2)
I agree - IMHO, a 'backup driver' should be able to see what the AI is 'thinking'. That is, it should have been saying "road clear ahead" at the time of the crash. If it was, and the driver took no action (which she didn't), then she's liable. Knowing Uber's somewhat slap-dash approach to rules, I'll bet the car had no readout showing what the AI was up to at the time of the crash, though.
If the car's just trundling along, then there's no reason for the human to do much about it - after all, it might swerv
Re: (Score:2)
Secondly, it was found that Uber isn't criminally liable; they could still be hit with a civil suit.
Honestly, minus some aggravating factor such as driving drunk, drag racing or being involved in a high-speed chase with the police, drivers are almost never charged criminally for motor vehicle crashes, even when somebody dies.
At most they will get a traffic ticket -- and even that is kind of iffy, as the police will often refrain from giving a ticket until their investigation is over, and once the investigation is over they often don't follow through and give out those tickets.
So ... this result is expecte
Re: (Score:2)
The settlement was fast because it was found that the woman was high on drugs and crossing illegally. The family knew that if they took it to court they would probably lose. Uber didn't want the bad publicity so they offered up enough to keep the family quiet.
I'm familiar with these kinds of pedestrians. They cross the street, in the dark, while wearing dark clothes, pushing (or even riding) a bicycle which has no reflectors or running lights, and then step right in front of the lights of the oncoming vehi
Re: (Score:2)
But that "safety monitor" had a job that is impossible to successfully execute if needed. Sorry, but "impossible" is the correct term unless there are at least 5 seconds available for task switch-over. (Probably more. The lab experiments were rather idealized.)
The family knew that the case would be long & (Score:2)
The family knew that the case would be long & costly so they took a quick settlement.
Re: (Score:2)
The 'not an accident' comment was likely directed at the 'video footage portrayed a scene with far less light than was actually available' attempt by Uber to claim that this highly avoidable incident was unavoidable.
Re: Scapegoat? (Score:4, Informative)
Did the person killed accept that risk?
Experiments like this should not be allowed on public roads.
It's the same thing as putting up an automated gun turret on a public street set to only shoot bad guys..... well, it's an experiment but we are pretty sure it won't shoot any good people.... We had a guy watching the monitor with a button to prevent it from shooting the wrong people....
Re: (Score:2)
Did the person killed accept that risk?
Personally, no. Of course she didn't get any direct say in the qualifications for getting a driver's license, what cars are road worthy, penalties for bad driving, medical requirements to keep your license or anything else either. Was there a legal process enacted by a democratically elected institution? Yes. Should there be? Good question, but it's not like they need individual approval.
Re: (Score:2)
Experiments like this should not be allowed on public roads.
Every new teenage driver is an experiment in driving.
And their insurance rates reflect this.
Re: (Score:2)
Yes, they did accept the risk, by crossing the road not at an intersection in the face of oncoming traffic. Did they think it was a likely outcome? No.
The pedestrian certainly shares some of the blame; it would be very hard to argue otherwise. But it simply isn't relevant. Based on the dash cam video, the collision was precisely 100% avoidable.
Re: (Score:3)
Arizona wants Uber investment dollars so they would gladly scapegoat an Uber employee while giving the company a mulligan.
To be clear, are you saying you or anyone else in their right mind would not scapegoat someone being paid for a task who instead decided to show up to work, kick back and watch TV? All the while actually causing your company to operate outside its legally approved framework for which the person was employed in the first place?
That doesn't sound like a scapegoat to me, that sounds like blaming someone who utterly failed to do their job.
back-up driver needs to get source code and logs (Score:2)
The back-up driver needs to get the source code and logs if they go to CRIMINAL COURT. And if they try any of the NDA / EULA BS then YOU MUST ACQUIT!
Re:back-up driver needs to get source code and log (Score:4, Insightful)
if they try any of the NDA / EULA BS then YOU MUST ACQUIT!
But what if Chewbacca lives on Endor?
Re: (Score:2)
Put the trial on hold till he gets here.
airline pilot's errors do not have Criminal procee (Score:5, Insightful)
Airline pilots' errors do not result in criminal proceedings most of the time. So sticking this on an under-trained backup driver, with poor systems in place plus a really bad video that looks like it was made to make Uber look good, is very bad.
Re: airline pilot's errors do not have Criminal pr (Score:5, Insightful)
When airline pilots make an error worth prosecuting, there is generally not enough left of them to identify, let alone prosecute.
Also there is a fucking world of difference between "made an error" and "decided to watch a TV show instead of working".
Re:airline pilot's errors do not have Criminal pro (Score:4, Interesting)
Airline pilots' errors do not result in criminal proceedings most of the time.
Only because airline pilots often die due to their mistakes. And calling this a "pilot error" is disingenuous. Sorry, wrong word, err... no, it's absolutely fucking stupid. The person didn't make an error; they outright were not attempting to remotely do the job they were being paid to do, all while operating a motor vehicle illegally.
Even if Uber was 100% liable the driver should still be charged with manslaughter for their actions.
Re: (Score:2)
Airline pilots' errors do not result in criminal proceedings most of the time.
That's because most of the time they don't kill or injure people.
Same as most driver errors do not have criminal proceedings. For mistakes not including a fatality or serious injury, you're dealing with insurance, not the cops.
When people die due to pilot error, the pilot is definitely charged... Its rare because if the pilot is making an error big enough to kill a passenger, the pilot is usually killed as well.
Re: (Score:3)
Pilots bear SOLE responsibility for the operation of their aircraft, even under autopilot, etc. Up to and including failures that should have been caught by pre-flight inspection, or following ATC instructions that are dangerous. The Pilot In Command is responsible for the safety of the flight and is authorized and expected to verify the condition of their aircraft and to deviate from instructions that are potentially dangerous.
The PIC is essentially GOD of that aircraft and bears all the responsibility.
Re: (Score:3)
A Singapore Airlines pilot took off on a runway undergoing maintenance. Bulldozers and cranes were on that runway, 2 or 3 km from the starting point. The plane collided with them. He was prosecuted for negligence.
France too had prosecuted some pilots for disobeying the safety rules.
Surprisingly, in the USA, the land of liability litigation, the FAA and NTSB have somehow managed to give immunity to the pilots, as long as they don't cover up anything. As long as every mistake, every violati
Blame it on AI (Score:2)
Very convenient for Uber (Score:3)
Although the driver definitely deserves criticism for her actions, this ruling is very convenient for Uber.
This means that they can relax significantly when it comes to the safety of the testing vehicles; if an accident happens, the courts say it's the driver's fault. Uber can sit back and blame the driver. Very handy.
The court should take into account the hazards posed to the public; if a company tests out new driving technology on public roads, where the sole purpose is to disconnect the driver from the driving task, the hazards arising from the driver actually becoming disconnected should be taken into account. In particular, the automotive safety standard (ISO 26262) explicitly states that 'foreseeable misuse' MUST be taken into account in the hazard analysis.
Is 'using the phone in the car while driving' a foreseeable misuse? You bet it is. We have all seen it (and done it). NHTSA has emphasized this in a recent report, saying that this is not only applicable for L3 and L4 AD systems, but even for L2 systems: https://www.nhtsa.gov/sites/nh... [nhtsa.gov]
So, if the hazard analysis identifies this risk but the company neglects to handle it, and someone gets killed as a result, shouldn't it be held liable? I think so. If the company neglects to perform a proper hazard analysis (i.e. doesn't use established, acknowledged standards and best practice), therefore misses hazards, and someone gets killed as a result, shouldn't it be held liable? I think so.
Ergo, I think Uber needs to be held responsible for this.
Re:Very convenient for Uber (Score:4, Insightful)
Uber's liability stems from not monitoring their test drivers. It's unlikely that this was the first time she wasn't paying attention, and they had a camera pointed at her, but apparently no one was bothering to review the footage.
They could even use a hands-on-wheel detection or gaze detection system like level 2 cars do.
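As a rough illustration of what such driver monitoring amounts to in software, here is a minimal sketch (hypothetical, not Uber's or any vendor's actual system; the threshold and data format are assumptions) of a watchdog that flags when a gaze signal says eyes have been off the road too long:

# Hypothetical sketch of a driver-attention watchdog; the hard part in practice
# is the gaze classifier itself -- the alert logic around it is simple.

from dataclasses import dataclass

EYES_OFF_LIMIT_S = 2.0  # illustrative threshold, not a regulatory figure

@dataclass
class GazeSample:
    t: float            # seconds since start of drive
    eyes_on_road: bool  # output of a (hypothetical) gaze classifier

def find_alerts(samples):
    """Return timestamps at which an inattention alert would fire."""
    alerts = []
    off_since = None
    alerted = False
    for s in samples:
        if s.eyes_on_road:
            off_since = None
            alerted = False
        else:
            if off_since is None:
                off_since = s.t
            if not alerted and s.t - off_since >= EYES_OFF_LIMIT_S:
                alerts.append(s.t)
                alerted = True
    return alerts

if __name__ == "__main__":
    # Simulated drive: attentive for 5 s, then looking down at a phone.
    stream = [GazeSample(t / 10, t < 50) for t in range(100)]
    print(find_alerts(stream))  # [7.0]: alert fires 2 s after eyes leave the road

Even something this crude, reviewed by a human or wired to an in-cab chime, would have flagged minutes of eyes-off-road time long before the crash.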
Re: (Score:2)
This means that they can relax significantly when it comes to the safety of the testing vehicles; if an accident happens, the courts say it's the driver's fault.
Err, no, that doesn't follow from this example at all. Uber put a system in place to prevent the accident, and the one person in that system wasn't even remotely attempting to do the job they were paid to do. This would be no different in any other industry. If I watched TV instead of doing what I got paid to do and someone died, I would fully expect to be held liable.
Even if Uber is liable for not installing driver-facing "incompetent job slacking" detection cameras in their cars, the driver is still 100% at fault.
So much this (Score:2)
Liability (Score:4, Insightful)
Re: (Score:3)
You programmed it, you built it, you set it up, you put it on the roads. Not your fault though.
It certainly could have done better but the human that was watching the movie inside the vehicle was supposed to be there to prevent this kind of catastrophe. Whether it is reasonable to expect a person to sit there and do nothing for hours at a time and then act in an emergency is another question that ought to be addressed. But I am sure that Uber would be held civilly liable for this, even if the criminal justice system has decided not to do anything.
Re: (Score:2)
The person who made the decision to disable the base vehicle's collision detection/automatic braking system was supposed to be paying attention to his job. Whether it would be reasonable to expect a person to sit and do nothing while a colossally stupid design decision was made that deliberately put the non-Uber-stockholder public at risk is another question that ought to be addressed.
Re: (Score:2)
You left out the bit where the person who you paid money to do something then proceeded to kill someone.
Re: (Score:2)
You left out the bit where the person who you paid money to do something then proceeded to kill someone.
Who was driving the car? Who disabled the auto stop? Who approved this person to be the safety backup and what was their training? Who let her in the car with her phone in the first place? It's not like this "driver" took over control and purposefully ran someone over. From the video she looked to be paying about as much attention as any passenger expecting the car to do what it's been doing for however long it's been going, which is less than ideal to be sure, but to say that makes her solely responsible is just
Re: (Score:2)
Who was driving the car?
The driver. Legally as required by the government during this test.
The driver. As employed by Uber to oversee the machine under test.
Who approved this person to be the safety backup and what was their training?
Irrelevant given the evidence that it wasn't training which was lacking. Or are you suggesting everyone in the world do mandatory "don't watch TV when you're on the clock" training? That's just stupid.
Who let her in the car with her phone in the first place?
God did. That's where you're heading with this right? I mean by going back throughout all the irrelevant things that lead up this incident you'll eventually get to "First there w
Re: (Score:2)
God did. That's where you're heading with this right? I mean by going back throughout all the irrelevant things that lead up this incident you'll eventually get to "First there was nothing and then God created the universe" right?
No, I was heading in the direction of reasonable oversight and good management but you take it where you want.
Re: (Score:2)
"Please disable the 'automatic stop' feature"
"That sounds like a bad idea"
"We have other automated systems that will stop the vehicle, but anyway there's a human at the wheel"
Sorry but every fucking car I've driven has no 'automatic stop' capability, they've all relied on the human behind the wheel. So by your shitty idiotic stupid senseless logic every fucking car designer ever is a murderer.
If you hadn't guessed already, I disagree with you.
There's a story going around (Score:2)
I'm not sure how that affects things, but it probably should have been investigated more.
Re: (Score:2)
2) Put a minimum-wage human in the driver's seat to take the fall.
3) ...
4) Profit.
Drive recorder (Score:3, Interesting)
Large corporation (Score:2)
Movie IDEA (Score:2)
Let's have one where someone loses their kid in a crash involving a self-driving car (with no manual control), and they are held criminally liable and go to prison.
In prison they learn how to be a criminal. Fast forward to them getting out: after a life of McJobs and no health care, they decide they had it better in lockup, so they hunt down the CEO of the self-driving car corp and KILL his son, SAYING NOW YOU KNOW WHAT IT'S LIKE TO LOSE YOUR KID!!! AND I DON'T GIVE A DAMN, LOCK ME UP, I LIKED IT.
Do Self-Drive Cars Take Driving Tests? (Score:2)
But the real question here concerns "fitness to drive". Today, if you want to drive a vehicle on the road, unsupervised, you have to pass a driving test. You have to be able to demonstrate to an examiner that you can control the vehicle saf
This is 100% correct (Score:2)
The current laws for these tests require an operator. Ultimately it's their responsibility. This finding sets no precedent for a case that occurs once an attentive driver is no longer a requirement.
Re: (Score:2)
Re: (Score:2)
Fool Proof (Score:2)
Same thing applies here.
I have driven many times in that area at night and people are idiots when it comes to crossing the street.
Drunk, not paying attention, not using crosswalks. Arizona has one of the highest pedestrian death rates in the nation because of stupid people.
So if people are getting run over by humans on a consistent basis, it makes sense that a computer would run someone ove
Re: (Score:2)
The old adage: if you think your application is foolproof, put a fool in front of it and they will prove you wrong.
If that's true then automated cars shouldn't be at the public testing point yet. They will need to be as reliable as airplanes, doing things on their own more complex than an airplane. And look at all the rigorous testing and oversight that goes into that. A whole government department with regulations and many eyes to watch over a single plane's movement. That's what we will need from self driving cars.
The pedestrian was not a fast moving "surprise". The pedestrian was crossing at a highly visible s
Who the hell IS liable? (Score:2)
What an incredibly horrible precedent this sets. Corporations can *literally* get away with murder so long as machines do it.
Re: (Score:2)
Very special net. (Score:2)
Not her fault (Score:2)
Does anyone remember watching the video of this accident?
If not, you should go back and rewatch it. Here is what I remember...
The idiot pedestrian walked out in front of an oncoming car in the dark and wearing dark clothing without looking for oncoming traffic. The speed limit in the area was fast enough that a normal person would not be able to stop in time, even if they were paying attention.
The fault is the pedestrian's. They were stupid and walked in front of a moving vehicle. The driver of the vehicl
Re:Guess who's getting a big contribution (Score:5, Interesting)
This is the same Yavapai County Attorney that is prosecuting legal medical marijuana patients for possessing cannabis concentrates that were purchased legally in a dispensary. She recently claimed that distilling extracts from cannabis was similar to making explosives with ANFO.
Speaking of contributions, she got $500,000 from Insys Pharmaceuticals, which markets synthetic THC and fentanyl lollipops, two products which are threatened by medical marijuana.
Re: (Score:2)
How did she get the twitter @AOC name? It seems like it's as valuable as a 3-character domain. Did she pay for it? If not, then is it a gift? Is it reported? There is no way I can get a 2-character twitter name.
Re: (Score:2)
But stupidity is not a crime.
Then until we can figure out who is responsible when an AI death happens, we should stop testing on public streets immediately.
Re: Guess who's getting a big contribution (Score:2)
But stupidity is not a crime.
Yeah except that "stupidity" wasn't the event which occurred; it was an aspect of it. Criminal negligence was another aspect of it.
Re: (Score:2)
And the threat of "vehicular manslaughter" charges?
The "backup driver" is in a totally impossible position. No one except an experienced meditator of certain varieties (some kinds of zen, some kinds of "awareness" meditation) could maintain attention in that kind of situation. And there aren't many of those. (In fact, I'm just assuming that those people could maintain awareness. It could be hype.) Most people in that kind of extremely boring position will space out in one way or another. Many of the w
Re: (Score:2)
Watching a fishing rod doesn't require correct evaluation of a complex rapidly changing situation within 5 seconds. Not comparable. You can space out frequently while watching a fishing rod and never have it cause a calamity.
too many subcontractors to just blame uber. (Score:2)
too many subcontractors to just blame uber.
Re: (Score:2)
That's correct. You should blame every company that relies on a "backup driver" approach. That is an approach guaranteed to fail. Well, guaranteed to fail unless you can always guarantee at least a 5 second switch-over time period. The cars that pull over and stop are doing it reasonably. Any approach that relies on a human "driver" maintaining attention and readiness to take over while not exercising any control is guaranteed to fail. Even rested drivers in a boring environment are subject to "highwa
Re:Huh? (Score:5, Interesting)
This is the moment, this is the checkpoint where the hype meets reality. You do not want to accept the job of 'backup driver' because you are basically taking the blame, and it doesn't pay enough. And the driverless car isn't able to do this task on its own without a driver, per the evidence of this case. The law will not protect you, per this case. Does the 'backup driver' have sufficient control to avoid dangerous situations in the first place? Are there traps they can't avoid even if they were not on their phone? Did the company sufficiently explain to the driver that they were not actually in reserve for backup duty, but that they were, from the first moment, the legal and primary operator of that vehicle? If not, I could understand why they were on their phone: waiting for some timely and orderly signal to pay attention and resume operation of the vehicle. If I were on the jury for that individual, I would need to know that information. This is new and unproven technology, and quite frankly, the state allowed it. The state suddenly refers this case for prosecution of the individual?
Even if this driver wasn't watching their phone, there is a cognitive disconnect between the autopilot and the 'backup driver' who is supposed to suddenly become situationally aware in a split second. There are already numerous tragedies in trains and planes that demonstrate this problem, and those are cases that are actually simpler from an automation perspective. Additionally, Tesla's (inaptly named) Autopilot has its history.
Anybody's guess who would be responsible the moment that some states allow a truly driverless car. Will some hapless engineer be responsible? the person who assembled the car? The CEO? The person who hailed the car?
Maybe Google or Tesla, or xyz, is a different case, but I doubt it. I suspect they will suffer from the same hype of delivering 80% of the solution and claiming victory (or being perpetually just 2 years away). The last 19.999%, ensuring reasonable safety and availability under all conditions, may be nearly impossible, and I'm not hearing much talk about it. Maybe in constrained scenarios the odds are better, but who is responsible for those decisions?
In lesser cases, who gets the traffic ticket when a driverless car exceeds the speed limit? (Do driverless cars pull over if a cop car is behind them?) Will people pay for cars that refuse to speed?
Re: (Score:2)
You do not want to accept the job of 'backup driver' because you are basically taking the blame
Not at all. The driver didn't get the blame because a self driving Uber killed someone or because they weren't able to avoid an accident. The driver got the blame because they didn't do their job at all while at the same time also illegally operating a motor vehicle.
If you want to spend all day watching TV then maybe you should become a TV critic rather than an Uber backup driver.
Re: (Score:2)
Not at all. The driver didn't get the blame because a self driving Uber killed someone or because they weren't able to avoid an accident. The driver got the blame because they didn't do their job at all while at the same time also illegally operating a motor vehicle.
You can't have it both ways. A self driving car should not require a driver and conversely, if a driver is required then the car isn't really 'self driving', is it? Uber's job title for these backup drivers is 'Mission Specialist' and I would bet good money that before the accident NONE of those specialists were told outright that they were the legal operator of the vehicle they were babysitting. It's obvious that the software running these self driving cars is nowhere NEAR ready for prime time and they
Re: (Score:2)
It is an experimental self-driving car. The only reason to have a "mission specialist" is to monitor the car and ensure that its experimental control system does not fail and cause damage.
I would expect that Uber absolutely informed the "mission specialists" they were responsible for the operation of the vehicle, else why have them? There is absolutely no proof they did not.
You are trying to cover up for the incompetent behavior of the "mission specialist" in question, who was streaming a TV show while the
Re: (Score:2)
Obviously the driver has a lot of blame in this and should be punished the same as any other inattentive driver.
But for Uber to be able to throw the driver to the wolves and take no responsibility is infuriating to me.
Uber has just as much responsibility in this as the driver because it is their experiment, this wouldn't have happened if they hadn't put that car on the road.
For the buck to stop at the lowest rung really does not seem right at all.
Re: (Score:2)
Uber knew, or had reason to know, that the job asked of the "backup driver" was impossible. Perhaps the "backup driver" or "mission specialist" is *also* to blame, but the primary blame should be on Uber. She is merely an accomplice before the fact.
Re: (Score:2)
Watching TV is damning in hindsight; this person is probably SOL, since that is how people will look at this. Time to get a good lawyer. Is there room to convince a jury, hypothetically, by saying: "I don't recall Uber giving me those instructions; they said they were operating on public streets at SAE level 4 automation and the state permitted it. I was just there to collect data and drive it back if it breaks."? Is the ball back in the state's court to prove their case, not just say there was no proof
Re: (Score:2)
You can't have it both ways. A self driving car should not require a driver and conversely
I don't have it both ways. There are legal obligations currently under both the laws that govern vehicles moving on public roads and the regulations that govern the testing of self-driving vehicles. Those laws state you need a driver. That driver was there, employed for a job that she didn't do. She wasn't distracted; she outright was not doing her job and was breaking the law in the process.
This isn't having it both ways. It's having it one way. A person being paid for a task should be doing the task. If I decided
Re:Huh? (Score:4, Informative)
I think the problem is the idiotic mentality of hiring the cheapest.
Autonomous cars in Germany usually have 2 or 3 people on board: the obvious back-up driver, plus engineers. They keep each other awake and pay attention, and usually have diagnostic screens active and a laptop connected to the car's systems.
Re: (Score:2)
This is the moment, this is the checkpoint where the hype meets reality.
Yep, but the reality train of pain isn't for the job of backup driver... It's for all the idiots who believe autonomous cars mean they can sit back and watch a movie whilst the car does everything.
Because you're still in control of the vehicle, this means you're liable for what the vehicle does.
there is a cognitive disconnect between the autopilot and the 'backup driver' that is supposed to suddenly become situationally aware in a split second.
This is why autonomous cars won't be allowed for decades, let alone are right around the corner. Machines are a long way from being able to tell a leaf from a child, most do not even take into account the trajector
Re: (Score:2)
Autonomous cars don't have to be better than the best drivers. They only have to be better than the median driver, which, unfortunately, is not a high bar.
I've seen drivers on their phones totally oblivious to what was happening on the road. Several times I've seen people reading books, newspapers and magazines while driving. People are generally speaking lousy drivers.
If autonomous cars reduce the number of accidents and the cost to insurance companies they will be on the roads. Only knee jerk emotion will k
Re:Huh? (Score:5, Interesting)
> Hold up. While it's tempting to think this case is the bar we're setting, consider the released footage of the driver for a second. [theguardian.com] In any industry, if you look away from a machine in motion for that long, you're at fault for your reckless behavior. Like if I was watching a show on my phone while operating a table saw, I really wouldn't have a strong case for an injury lawsuit.
You have a point, but conversely: people are really, really, really bad at vigilance tasks, where they are supposed to monitor something that almost always goes right, look for something that almost always doesn't appear, etc. Asking someone to be a vigilant backup driver for 4 hours may be a lot harder than driving for 4 hours.
Re: (Score:2)
Bad analogy - you're not the backup operator for your table saw; it doesn't operate itself, you have to operate it, therefore you're 100% in charge of it. However, if it was an automated saw, all you were supposed to do was tell it what to cut, and you were there just to sit and watch it for hours, then it suddenly goes wrong and cuts someone in half during the 30 seconds you're not watching it, would you really be at fault?
Re:Huh? (Score:4, Interesting)
How about "life guard" or "airline pilot." If you watch life guards, then watch for like 30 minutes or so, then switch with another guard. That's because if you stare at the same area of water for long periods of time you completely tune-out. Pilots similarly have a process where the pilot and copilot say to each other something like "scanning 1 o'clock, 2 o'clock, 3'oclock..." because otherwise staring at an empty sky totally desensitizes you. Human brains are terrible watchdogs.
Re: (Score:2)
The problem here wasn't that the vehicle was experimental. It was that the safety driver was not doing their job.
How do you know that?
You have no idea what that person was hired to do. You have no idea what that person was trained to do. No idea what Uber told them their job was.
We both have an idea what the job should have entailed, but that's way different than actually knowing what it did entail. More than likely, if this goes to trial that's going to be the driver's #1 defense.
If Uber can pull out evidence that the driver was told that it was their responsibility to make sure the car didn't hit anything, and was t
Re: (Score:2)
Exactly my point. As part of states allowing driver-less car experiments, that needs to be spelled out explicitly for the parties involved: engineers, vehicle operators, and the general public. Just assuming people understand isn't enough.
Re: (Score:2)
The problem here wasn't that the vehicle was experimental. It was that the safety driver was not doing their job.
Nonsense. The problem here was that the technology was obviously not tested sufficiently to be safe on a public road. Self-driving technology that cannot discern an obstacle, and then stop for or otherwise avoid it, is absolutely not ready to be operated on a public road, with or without a back-up driver. The failure of the back-up system is a secondary problem.
Re: (Score:2)
How do you propose we know when a technology is "safe enough" to be tested on a public road, since we can expect that public road testing will reveal new defects and weaknesses?
I'm not quibbling that Uber really fucked up. I just think that it's reasonable to think that we can't get the technologies to be perfectly safe relying on recorded data of other driving and test-track scenarios.
Re: (Score:2)
Obviously extensive testing on public roads is and should be a requirement before the technology can be put into general use. But closed-course testing should have been more than sufficient to iron out whatever bugs were present that could have allowed the car to plow into an obstacle without even a hint of recognition. In this case, a failure to slow down in a smooth manner, or getting a little too close to the pedestrian as it passed, would be the kind of bugs I'd be expecting here. Absolutely not a complet
Re: (Score:2)
The Uber system detected the cyclist but miscategorized her as roadside debris -- like a plastic bag -- after categorizing objects correctly the vast majority of the time.
Look, Uber clearly had all kinds of software quality and process problems. But I guarantee you even the leading implementations like Waymo are learning a whole shit-ton from real road stuff-- exactly about this kind of misclassification problem. Synthetic test conditions reflect experimenters' bias.
Tesla has a bit of an advantage with all the vide
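For what it's worth, public reporting on the preliminary investigation described the system repeatedly re-classifying the victim in the seconds before impact. As a toy illustration only, and assuming (this is an assumption, not a statement about Uber's actual code) a design in which each re-classification discards the object's motion history, here is a sketch of why that churn delays any decision to brake:

# Toy illustration, not real AV code: if an object's track history is rebuilt
# from scratch every time its class label changes, frequent re-classification
# starves the planner of the consistent history it needs to act.

MIN_TRACK_FRAMES = 10  # assumed: frames of consistent history required before acting

def frames_until_action(class_per_frame):
    """Return the frame index at which a braking decision first becomes possible,
    under the (hypothetical) rule that history resets on every class change."""
    history = 0
    prev = None
    for i, label in enumerate(class_per_frame):
        history = history + 1 if label == prev else 1  # reset on relabel
        prev = label
        if history >= MIN_TRACK_FRAMES:
            return i
    return None  # never accumulated enough consistent history

stable   = ["bicycle"] * 30
churning = ["unknown", "vehicle", "bicycle", "unknown", "vehicle"] * 6

print(frames_until_action(stable))    # 9    -> early, confident decision
print(frames_until_action(churning))  # None -> no decision within 30 frames

The point isn't the exact rule; it's that classification stability and tracking continuity interact, which is exactly the kind of failure synthetic test conditions tend to miss.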
Re: (Score:2)
I agree, there is too much emphasis on real-world testing. It probably makes for better press releases.
Show me the documentation that they performed due diligence in first proving out all weather and lighting conditions for all foreseeable traffic obstacles. Testing shouldn't just be reactionary, but comprehensive.
You can gather real-world data without the car actually operating. i.e. turn the sensors on but truly just have a human driver that knows they are the primary driver. No automation hand-off
Re: (Score:2)
Nice try but we know that if operators are relieved of too much work, their attention will start to drift [nationalpost.com]. And there are other examples of catastrop [hbr.org]
Re: (Score:2)
Sorry, the paragraph break separated the link from the related point. Removing the break makes things clearer:
Re: (Score:2)
Bull fucking shit. It's about as disconnected as using cruise control and asking people to keep tabs on how fast or slow traffic is going. Just keep your goddamn eyes on the fucking road, it's not brain surgery.
Then why not just drive yourself? I mean if you have to pay as much attention to a vehicle you are not in primary control of then just take primary control. It's a lot easier to remain focused on the road when you are the driver and not the passenger.
Re: (Score:2)
Because this was an experimental vehicle. There was no passenger. There was a safety observer who was being paid to stay alert and monitor the car's operation.
When a fully autonomous vehicle becomes available it will not require someone to monitor it. That's why it will be autonomous and why people will pony up the money for it.
Re: (Score:2)
FWIW, there is evidence that the footage was tampered with to reduce contrast and make everything look darker. "Tampered with" may be wrong, since I don't know that the actual details, as opposed to the visual display of them, were actually hidden. But other cameras recording the same scene didn't show it as that dark at all. They weren't good enough to record the necessary detail, but they did record the general level of illumination.
So I guess it's possible that Uber was just using a grossly substandar
this backup-driver may need a $300/hr Attorney (Score:2)
This backup driver may need a $300/hr attorney if they go to court, as there is a lot of stuff in this case that can get them off, but they need to fight it out and not plea bargain. Also, take the bail bond; you don't want to get stuck in tent city jail.
Re: (Score:2)
Adaptive cruise control keeps speed and following distance, I think that was the AC's point.
Re: Huh? (Score:2)
Worth noting somewhere in here that Uber intentionally disabled all object-detection stoppage in order to "tune for a smoother ride" so the basic automatic braking was never implemented at all.
Wrong; they disabled the manufacturer's auto-braking system and replaced it with one of their own.
That would be the driver.
Re: (Score:3)
Wrong; they disabled the manufacturer's auto-braking system and replaced it with one of their own.
This is the way I understand it as well. I believe Volvo's technology would have stopped the car in plenty of time to avoid the pedestrian. Uber's didn't even try, not even when the pedestrian was only a few feet away. Not even a blip on the radar. It is outrageous that Uber put such blatantly dysfunctional technology on a public road at all.
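To put rough numbers on "plenty of time" (the figures below are illustrative assumptions, not values from the investigation reports): stopping distance grows with the square of speed and is dominated by how early the system, or driver, commits to braking. A back-of-the-envelope sketch:

# Back-of-the-envelope stopping-distance sketch. Speed, deceleration and delay
# are assumed, illustrative values -- NOT figures from the NTSB/police reports.

def stopping_distance_m(speed_mph, decel_g=0.7, delay_s=0.0):
    """Distance travelled from the moment braking is decided until standstill."""
    v = speed_mph * 0.44704          # mph -> m/s
    g = 9.81                         # m/s^2
    return v * delay_s + v * v / (2 * decel_g * g)

# A system that brakes the instant an obstacle is confirmed:
print(round(stopping_distance_m(40, delay_s=0.0), 1))  # ~23.3 m
# The same vehicle if the braking decision comes 1.5 s late:
print(round(stopping_distance_m(40, delay_s=1.5), 1))  # ~50.1 m

The only point is that a second or two of decision delay roughly doubles the distance needed at that kind of speed, which is why early, confident detection matters more than brake hardware.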
Re: (Score:2)
Untested Emergency Automatic Braking systems are incredibly dangerous and require extensive validation. You slam on your brakes and a motorcyclist behind you dies.
And as of today vehicles aren't legally required to have one, so disabling the feature doesn't make your vehicle any less safe or less legal than the hundreds of millions of older vehicles on our roads without an EAB.
Re: (Score:2)
Untested Emergency Automatic Braking systems are incredibly dangerous and require extensive validation. You slam on your brakes and a motorcyclist behind you dies.
This is complete nonsense. That's only true if the motorcyclist behind you is actively looking for a Darwin Award. Would it be better to stop even faster than the cycle could under the best of conditions by slamming into a car? Would it be better to flatten an innocent pedestrian than to test the reflexes of a motorcyclist who is following too closely and/or not paying attention? I think not.
And as of today vehicles aren't legally required to have one, so disabling the feature doesn't make your vehicle any less safe or less legal than the hundreds of millions of older vehicles on our roads without an EAB.
More nonsense. Disabling that feature may well make your vehicle less safe, if you're not a very attentive d
Re: (Score:2)
Re: (Score:2)
when Republicans try to execute one.
They tried pretty hard with Solyndra. At least, they danced on its grave.
I was thinking that. They never had a product, tho (Score:2)
I also immediately thought of Solyndra and similar green slush-fund companies. Of course, Solyndra never sold a product, and nobody ever tried to stop them, so you can't really say anyone killed Solyndra other than their own leadership. I don't know if they actually INTENDED to eventually sell a product, or to just keep taking tax money while repeatedly announcing plans to make solar panels "real soon now". Republicans sure had fun with Obama getting suckered into that, though.
They took the standard Kickst
Re: (Score:2)
Oh definitely (Score:2)
That's for sure.
Most Republicans see Planned Parenthood in one of two ways:
A. They are in the business of murdering babies. ...
B. It's not quite the same as murdering babies, let me see if I can explain the difference
If your perspective is A, Planned Parenthood is literally similar to the Nazis, and you definitely want to put a stop to it.
If your perspective is B, "not technically murdering babies, there is a subtle difference" also sounds like a pretty horrible thing!
That's what I find interesting. Some p
Re: (Score:2)
NTSB will "start digging" when an Audi or a Toyota is crashed by its driver, and "AI" is blamed. As long as it is a symbolic American company, no, no sir, they have nothing to worry about.
Re: (Score:2)