Fully Driverless Waymo Taxis Are Due Out This Year, Alarming Critics (arstechnica.com) 256
Alphabet's Waymo is launching a driverless taxi service in Phoenix in the next three months -- and it's open to the public. But self-driving cars are only lightly regulated, and many critics argue that more regulation is needed to ensure the safety of these vehicles before they roll out for public and commercial use. Ars Technica reports: If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars. Federal and state laws allow Waymo to introduce fully self-driving cars onto public streets in Arizona without any formal approval process. That's not an oversight. It represents a bipartisan consensus in Washington that strict regulation of self-driving cars would do more harm than good.
Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road." But so far these advocates' demands have fallen on deaf ears. Partly that's because federal regulators don't want to slow the introduction of a technology that could save a lot of lives in the long run. Partly it's because they believe that liability concerns give companies a strong enough incentive to behave responsibly. And partly it's because no one is sure how to regulate self-driving cars effectively. When it comes to driverless cars, "there's no consensus on what it means to be safe or how we go about proving that," says Bryant Walker Smith, a legal scholar at the University of South Carolina.
Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road." But so far these advocates' demands have fallen on deaf ears. Partly that's because federal regulators don't want to slow the introduction of a technology that could save a lot of lives in the long run. Partly it's because they believe that liability concerns give companies a strong enough incentive to behave responsibly. And partly it's because no one is sure how to regulate self-driving cars effectively. When it comes to driverless cars, "there's no consensus on what it means to be safe or how we go about proving that," says Bryant Walker Smith, a legal scholar at the University of South Carolina.
I thought... (Score:2)
...Google has been testing driverless cars for years now?
Re:I thought... (Score:5, Interesting)
They have, the difference is these won't have a backup driver to take over.
It seems to me that Waymo has the best tech and takes the safety aspect very seriously, so I don't see any cause for alarm myself.
If Uber were doing this, then there would be cause for alarm. Regulation is needed, as although Waymo takes safety seriously, some of their competitors may not.
I agree completely. As over-the-top safety-conscious as Waymo has been, taking baby steps all the way to release, if Waymo thinks they're ready I'm inclined to believe them... but like you, I wouldn't believe some of their competitors.
If Waymo is being premature with this and any harm comes to anyone, you can bet they will get their socks sued off.
Re: (Score:2)
If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe.
That's true, but they're certifying the HARDWARE.
An airplane can still be flown into a mountain by an idiot pilot no matter how federally certified it is.
What we're talking about here is replacing the human component, not the machinery, and humans aren't very good drivers.
They are in beta.... (Score:2)
... Google is typically comfortable with launching full-fledged products in beta. Don't worry, they will come out of beta in a few years.
Re: (Score:2)
Like banks that are "too big to fail," companies like Waymo are now "too big to regulate." They just lobby their way out of any legislation that gets in the way of their business model. And should they screw up and get sued, they have such deep pockets that it has a minuscule effect on their bottom line. So a couple of people get run over or are killed when a self-driving car gets T-boned in an intersection. What's the price of a life these days? $2M? $5M? Waymo can pay for that out of their coffee fund.
Re:I thought... (Score:5, Insightful)
Well, Waymo certainly has the best PR.
They also only have 9 million miles on the road, most of that effectively in a "sandbox". In the US, there's one fatality every 86 million miles. So the fact that Waymo hasn't killed anyone yet is hardly indicative of anything.
Leaving it to corporations to regulate themselves due to fears of liability has created one disaster after the next. And come on, let's not act like we can't all figure out how autonomous vehicles could be tested. Give a small fleet of them to the NHTSA and have the NHTSA spend a few weeks subjecting them to one unanticipated event after the next in an (easily reconfigurable) mock town, without a given "script" that the manufacturer could use as a cheat sheet.
Too onerous of a testing cost? At least give them a one-day battery of scenario tests. I mean, come on. We're talking about people's lives here. It's bad enough that Level 2 systems don't have to do this. But Level 5? Ugh.
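To put numbers on why 9 million fatality-free miles proves so little, here's a back-of-the-envelope check in Python. It uses only the figures quoted above; a rough sketch, not a rigorous safety analysis:

    import math

    human_fatality_rate = 1 / 86e6   # ~1 fatality per 86 million miles (US average, per the post)
    waymo_miles = 9e6                # Waymo's total autonomous miles, per the post

    # Expected fatalities if Waymo were exactly as safe as an average human driver:
    expected = human_fatality_rate * waymo_miles          # ~0.10

    # Modeling fatalities as a Poisson process, the chance of seeing zero
    # deaths in 9M miles even at merely human-level safety:
    p_zero = math.exp(-expected)                          # ~0.90

    print(f"Expected fatalities at human rate: {expected:.2f}")
    print(f"P(zero fatalities | human-level safety): {p_zero:.0%}")
    # A ~90% chance of a clean record either way: the sample is far too small
    # to distinguish Waymo from an average human driver on fatalities alone.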
Re: (Score:2)
They also only have 9 million miles on the road, most of that effectively in a "sandbox". In the US, there's one fatality every 86 million miles. So the fact that Waymo hasn't killed anyone yet is hardly indicative of anything.
I'm sure that Waymo has kept track of minor accidents, or near-accidents, both of which could provide a decent indication of the chance of real deadly accidents.
Re:I thought... (Score:5, Informative)
Sure, the FATALITY rate is roughly 1.25 per 100,000,000 miles in the US, but the ACCIDENT rate is around 600 per 100,000,000 miles.
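That distinction matters statistically. A quick sketch using just those two quoted rates (illustrative only):

    fatality_rate = 1.25 / 100e6   # fatalities per mile (US average, per the post)
    accident_rate = 600 / 100e6    # accidents per mile (US average, per the post)
    miles = 9e6                    # Waymo's autonomous miles so far

    print(f"Expected fatalities at human rates: {fatality_rate * miles:.2f}")  # ~0.11
    print(f"Expected accidents at human rates:  {accident_rate * miles:.0f}")  # ~54
    # With ~54 accidents expected at human rates, Waymo's actual accident count
    # over those miles can be compared against humans with real statistical
    # weight, even though its fatality count cannot.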
Re: (Score:2)
Sure, the FATALITY rate is roughly 1.25 per 100,000,000 miles in the US, but the ACCIDENT rate is around 600 per 100,000,000 miles.
And Waymo has been in accidents... but at least as of last year, they were not at fault in all but one of the accidents they were involved in... and in that one, the human was driving at the time.
Re: (Score:3)
Waymo used to publish a monthly list of all accidents its cars were in, but they stopped doing this in January 2017 and scrubbed their website of past lists. And the claim that it's not been at fault in accidents when under its own control is not true - for example... [businessinsider.com]
It's important to remember that for most of its miles, it's also had a human present who can take control to prevent accidents, and a large chunk of its miles have been "sandboxed" - that is, Google/Waymo tightly controlled where it
Re:I thought... (Score:5, Insightful)
... Their idea: "Just ride bikes. If you live too far out, move in closer." ...The snotty, "just ride bikes" suggestion sounds great, but in reality, it is pompous. There are many elderly people or people in wheelchairs that can't go and buy an Orbea Orca and magically be on their way.
In reality, I could ride my bike to the grocery store if just picking up a few items. I can't ride my bike to work, it would take too long (and I can't afford to move closer to work)... but there are a few other locations within a few dozen miles that I could probably use my bike for... but I won't.
Two reasons:
1) It's friggin' hot in the South 6 months out of the year... I'd arrive everywhere stinking awful. Not to mention, can you imagine all the dehydration deaths if many people did this?
2) It's not safe. I see more "white bicycle memorials" marking where cyclists are killed than I see actual bicycles on the road around here. People don't drive safely around cyclists here and there are no bike lanes. I know this would change as people got more experience around cyclists and such... but I wouldn't want to be cycling until it is the norm.
Re: (Score:2)
You must live in the nice part of the South. The part I live in is friggin' hot EIGHT months out of the year. And humid as all hell to boot - let's hear it for temps and humidity both in the mid-90s (Fahrenheit - that's mid-30s for you Celsius junkies)...
need to test in cold and rainy cities as well + snow (Score:2)
need to test in cold and rainy cities as well + snow!
Re: (Score:2)
Chicago city driving + Chicagoland highway driving is the real test as well. They better be ready to do 70+ in a 55 speed limit.
Re:I thought... (Score:5, Interesting)
No, this is not accurate.
The Waymo vehicles have been driving around Phoenix, without a driver, for most of this year. The vehicles have been limited to 'beta testers' who could hail a ride. This announcement is really just about the commercial launch of the service. At that point everyone will be able to hail one.
Also the notion that there is no 'backup driver' is false. The driver is just not in the car. If the car gets stuck it will simply stop, turn on the hazard lights, and wait for a remote operator to do something to help.
Re: (Score:2)
Oh, you think you're so clever, taking part in anonymous meta meta self criticism, as if people here wouldn't see through my ruse.
No admission or denial of having written the above post, the post it responded to, or the post that that was in response to.
This is why I'm on slashdot.
Can't wait (Score:3, Insightful)
"Currently, there's no comparable requirement for self-driving cars."
Human drivers cause 6.5 million accidents per year, killing tens of thousands of people and injuring several million.
This can only be better.
Blood on their hands. (Score:2)
Developer data acquired over the past decade of real street testing strongly indicates self-driving cars would save lives. Is a government safety certification process going to accomplish anything these companies have not already considered? What of the lives that can be saved in the meantime?
"Self-driving car advocates argue that slowing down the development of self-driving cars could ultimately cost more lives than it saves. In 2016, more than 37,000 people died from highway crashes, with many being caus
Re: (Score:3)
As previous accidents have shown, a human is a lousy backup for a self-driving car. Very few humans, if any exist at all, can maintain constant vigilance over a system that does what it's supposed to do 99.999% of the time.
We don't understand self-driving tech and their failure modes well enough to create the equivalent of a driving test. People will die before that can be developed, and the faster and more frequently they die, the faster the development. The only alternative is to let someone else de
Re:Blood on their hands. (Score:5, Insightful)
Fine for a family car with self-drive (or even Tesla's driver assistance) functionality, but we're specifically talking about self-driving taxis here. What if the humans in the taxi are using it because they are not able (legally and/or physically) to drive, and that's why they are taking the taxi in the first place? It's going to be kind of limiting for a taxi if it can only accept passengers that are able to drive it as well, especially on a Friday or Saturday night. Are they also going to have to require the designated driver to scan their driver's license and breathe into a breathalyzer before setting off?
I guess there would be slightly less chance of the floor getting covered in vomit though, so there is that.
Re: (Score:3)
Governments have safety in mind, people have career progress and short term stock prices in mind.
Waymo cars required an intervention every 5,600 miles; the average American driver drives 13,000 miles a year. The average human doesn't crash 3 times a year.
Re: (Score:2)
Governments have safety in mind, people have career progress and short term stock prices in mind.
Not this government. This government is run on career progress and short term stock prices as well.
Re: (Score:3)
Without knowing the circumstances of each "disengagement", I think it is presumptuous to assume they would have all led to accidents.
It could be analogous to a human driver pulling off to the side of the road in heavy rain - a fail-safe.
Re: (Score:2)
From what I've seen, the opposite is the case. Except under very limited, well controlled circumstances, current self-driving cars are significantly worse than the average human driver, despite all the faults of human drivers. That's not to say self-driving cars won't get better than humans eventually, just that I'm guessing that Waymo is going to severely limit the conditions under w
Yes they could be worse (Score:2)
Umm, no.... It definitely COULD be worse with driverless cars. There are a lot of numbers bigger than 6.5 million. Obviously there is no point to the endeavor if the driverless cars have a worse accident rate than human driven cars but it's certainly among the possible outcomes. And there definitely will be fatalities and injuries with driverless cars. We're just hoping for (substantially) better than current technology.
Re: (Score:2)
Who do you sue when a driverless car runs you over?
The maker of the self-driving system. Nothing else would make sense. In this case the vehicles will be owned and operated by Waymo, so there's really no question. If a self-driving vehicle were privately owned, unless the self-driving system maker can show that the owner of the vehicle did something wrong (didn't keep the software up to date, didn't maintain the mechanical functions of the car, actively interfered with the system during operation, causing the accident), then there's no entity other than the maker to hold liable.
Re: (Score:2)
And here's the cold financial logic. Let's say that the cars kill over four times as many people as human drivers - one every 20 million miles (Waymo has driven under half as many miles so far, a lot of that with heavy sandboxing). Let's say that the average wrongful death settlement is a well-above-average $2M. So $0.10/mi. The average rate for an Uber ride in the US is $2/mi, and the average car in the US costs (incl. depreciation) about $0.75/mi to operate.
Waymo can afford to kill people.
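A quick sanity check of that arithmetic in Python. Every input here is one of the post's own hypothetical numbers, not a real Waymo figure:

    fatality_every_miles = 20e6   # assumed: one death per 20 million miles
    settlement = 2e6              # assumed: $2M average wrongful-death settlement

    liability_per_mile = settlement / fatality_every_miles   # $0.10/mi

    revenue_per_mile = 2.00   # average US ride-hail fare, per the post
    cost_per_mile = 0.75      # average per-mile cost to operate a car, per the post

    margin = revenue_per_mile - cost_per_mile - liability_per_mile
    print(f"Liability cost:         ${liability_per_mile:.2f}/mi")
    print(f"Margin after liability: ${margin:.2f}/mi")
    # Liability adds $0.10/mi against a $1.25/mi gross margin -- under these
    # assumptions it's nearly a rounding error, which is the post's grim point.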
Re:Can't wait (Score:5, Insightful)
And here's the cold financial logic. Let's say that the cars kill over four times as many people as human drivers - one every 20 million miles (Waymo has driven under half as many miles so far, a lot of that with heavy sandboxing). Let's say that the average wrongful death settlement is a well-above-average $2M. So $0.10/mi. The average rate for an Uber ride in the US is $2/mi, and the average car in the US costs (incl. depreciation) about $0.75/mi to operate.
Waymo can afford to kill people.
Financially, perhaps, but if Waymo's cars are more dangerous than human-driven cars -- or even just aren't significantly safer, since every single accident will be national news -- then it will be a PR disaster that will get them shut down quickly.
Re: (Score:2)
"Who do you sue when a driverless car runs you over?"
Very clever.
This is a trick question. The answer is that you don't sue anyone. You're dead.
Re: (Score:2)
And the car owner, and the manufacturer of the car and of the software. There may be many problems with self-driving cars, but finding lawsuit-targets ain't one.
Re: (Score:2)
And the car owner, and the manufacturer of the car and of the software. There may be many problems with self-driving cars, but finding lawsuit-targets ain't one.
Sure, you can sue all of them, but if the self-driving system made a bad decision and caused an accident, the maker of that system is the only place courts could reasonably pin the liability.
Re: (Score:2)
And the car owner, and the manufacturer of the car and of the software. There may be many problems with self-driving cars, but finding lawsuit-targets ain't one.
Sure, you can sue all of them, but if the self-driving system made a bad decision and caused an accident, the maker of that system is the only place courts could reasonably pin the liability.
Wrong. The liability could also be pinned on the state government (in other words, you and me) for allowing these cars to operate on public roads without adequate testing.
Re: (Score:2)
I imagine that the vehicles will be insured for both damages and liability like every other car on the road, so fault will be the only determining factor in the suit as far as
Re: (Score:2)
Or do you think "courts" follow your idea of "reasonable"?
Yes, courts are generally very reasonable. There are exceptions, plenty of them, but not usually around liability assignment.
Testing? (Score:3, Insightful)
Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road."
What, does she assume this whole time self-driving cars have just been something in people's heads? Every company that's in on this technology brags about their logged road time... Glad she ain't my prof.
Re:Testing? (Score:5, Insightful)
It's easy to log millions of miles of road when you are the one choosing which roads to drive and when to drive.
Proper independent testing needs to find out when those cars will fail, not just confirm that they work well under expected conditions.
Re: (Score:2)
It's easy to log millions of miles of road when you are the one choosing which roads to drive and when to drive. Proper independent testing needs to find out when those cars will fail, not just confirm that they work well under expected conditions.
While that's kinda true, you run into all sorts of drivers, trucks, bikes, trikes, bicycles, pedestrians, wild animals, etc. unless you're extremely selective about where you go. I mean, the fixed road features aren't that many; sure, if you plan a route with no roundabouts or train crossings you never encounter those, but there aren't that many "fixed situations". Most of the crazy stuff happens at random due to traffic; you can't plan your way around much beyond road conditions.
Re: (Score:2)
According to Waymo's latest disengagement report for 2017 [ca.gov], their disengagement rate is one per 5,600 miles. Considering the average American drives around 13,000 miles per year based on DOT statistics, this equates to a disengagement incident every 5 months per vehicle. If the car is fully autonomous without a monitoring driver, each of those disengagement incidents would translate into an accident. Any human that had an accident every 5 months would find themselves uninsurable in short order.
I don't think t
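The 5-months figure follows directly from those two numbers. A rough sketch; note it leans on the contested assumption that every disengagement equals an accident:

    miles_per_disengagement = 5600   # Waymo's 2017 California report, per the post
    miles_per_year = 13000           # average US driver, per DOT statistics

    per_year = miles_per_year / miles_per_disengagement   # ~2.3 per year
    months_between = 12 / per_year                        # ~5.2 months

    print(f"{per_year:.1f} disengagements per vehicle-year")
    print(f"One roughly every {months_between:.1f} months")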
Re: (Score:2)
If the car is fully autonomous without a monitoring driver, each of those disengagement incidents would translate into an accident.
Not necessarily. Disengagement can occur when someone sees something unexpected. For example, in the case of the flipped Uber, the driver disagreeing with the car and causing the disengagement directly led to the accident. I also can't find the source of it, but I remember seeing a report a while back talking about things like disengagement being required when the car suddenly stopped unexpectedly and the driver took over until the situation could be analysed... which happened a few seconds later when a cyclist
Re: (Score:2)
Disengagements are a concern, but saying that without the driver they would translate to an accident is completely false.
I've definitely read about a number of cases where disengagement meant pulling over and stopping. I'd be really interested in a summary of what the end results of the disengagements are. If it's 10% accidents, that might be cause for concern because it's on the order of an accident every 4-5 years. If it's 1% accidents, that's better than human drivers, I'm guessing.
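Here's what those fractions work out to, under the same 5,600-miles-per-disengagement and 13,000-miles-per-year assumptions used earlier in the thread:

    miles_per_disengagement = 5600
    miles_per_year = 13000

    for accident_fraction in (0.10, 0.01):
        miles_per_accident = miles_per_disengagement / accident_fraction
        years_between = miles_per_accident / miles_per_year
        print(f"{accident_fraction:.0%} of disengagements -> "
              f"one accident every {years_between:.0f} years")
    # 10% -> ~4 years between accidents (roughly human-level);
    # 1%  -> ~43 years (far better than typical human drivers).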
Um... won't they always choose the roads? (Score:2)
Re: (Score:2)
This sentiment is common among people who favor heavy regulation of damn near everything. And I get it - engineering teaches you about the high price of engineering failures.
My beef with this mindset is the same problem I have with central planning - the idea that you will guess correctly and prevent failure is hilariously arrogant. It is better to regulate after there is a demonstrated need - in other words, institutionalize the process of learning from failures. If you want to be proactive, rather than ha
do you remember the fights over motorcycle helmet (Score:2)
I don't worry much about the cars themselves, but about the potential for legislation mandating we have self-driving cars. If you are old enough, you'll recall all the arguments about motorcycle helmet laws. How motorcycle accidents without helmets overloaded ERs and cost the public a lot of money in medical care. There is a clear precedent for my fears; the argument is the same. Don't forget that driving is a privilege, not a right.
Re: (Score:2)
I don't worry much about the cars themselves, but about the potential for legislation mandating we have self-driving cars. If you are old enough, you'll recall all the arguments about motorcycle helmet laws. How motorcycle accidents without helmets overloaded ERs and cost the public a lot of money in medical care. There is a clear precedent for my fears; the argument is the same. Don't forget that driving is a privilege, not a right.
Eventually it WILL be better for everyone that EVERYONE be mandated to have a self-driving car.
I don't expect that to happen anytime soon. Probably not for at least another 30 years or more... but eventually yes, people won't drive themselves... and that's a good thing.
Re: (Score:2)
Self-driving car is cool. Car that reports everywhere you go to Google, because they don't have enough information about you, is very much not cool.
Re: (Score:2)
How did you calculate 30 years? If you take into consideration exponential growth and previous examples of technology improving faster than predicted, you can't use intuition to predict the timeline for future tech.
In other words, I - and people who have a history of predicting tech correctly - think that you are wrong.
One clear proof should be Waymo. A single accident would cause huge PR damage. Do you really think they want to just see how it goes without being absolutely sure that it works? So yes, self-driving cars in good driving conditions are here today. For bad conditions it might take more time, perhaps even a year or two.
10 years to get the technology to be
a) I figure there are at least 10 years of development before the technology is ready for the masses.
b) The average car on the road is 10 years old, so even when the technology is ready there will be a long transition before the majority of cars on the road are self-driving.
c) Government will give a long grace period for people with human driven cars that are late to the self-driven technology to switch over.
It's not going to happen in less than 30 years.
Re: (Score:2)
Should say 10 years to get the technology to be ready for mass-consumer.
Re: (Score:2)
I'm all for car drivers and pedestrians wearing crash helmets. As well as offering protection from impact, making HUDs mainstream would be safer than walking or driving around looking down at smartphone screens.
Manufacturers will want regulation too (Score:5, Interesting)
I used to work for a medical device manufacturer. While having to deal with a lot of regulations was certainly annoying (mostly because they are written by lawyers and you need to be a lawyer to really understand them), the great thing about them was that once you complied, you didn't have to worry nearly as much about liability. If there were no regulations (basically a form of self-regulation), then how exactly do you prove that you were not negligent? Maybe you think all the tests you did were enough. Maybe the lawyers you hired for advice thought so too. But you'll never know until it is tested in court. With regulations it is more-or-less black and white as to whether you have done enough to absolve yourself of responsibility for unforeseen events.
Another important point is that regulation creates a powerful barrier to entry in a market. The infrastructure required (in terms of processes and procedures) is immense, and large companies can gain economies of scale for this work. While the tech is enough of a barrier to entry right now, as time goes on this will change for driverless vehicles as well.
Re: (Score:3)
If there were no regulations (basically a form of self-regulation), then how exactly do you prove that you were not negligent?
This is a much easier problem for makers of self-driving systems because one of the primary functions of such systems is to gather large amounts of data about the precise locations of the vehicle and all other vehicles, pedestrians, cyclists and other objects around it, as well as the status of any traffic signals, etc. So as long as the self-driving system stores at least a few minutes of data and retains it all in the event of an accident, the precise sequence of events and the decisions made by the self-driving system can be reconstructed.
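For illustration, a minimal Python sketch of that kind of rolling "black box" recorder. This is a hypothetical structure invented for the example, not Waymo's actual logging design:

    from collections import deque
    import time

    class EventDataRecorder:
        """Keep the last few minutes of sensor frames; freeze them on a crash."""

        def __init__(self, seconds=180, hz=10):
            # Ring buffer sized for `seconds` of frames at `hz` samples/sec;
            # old frames are discarded automatically as new ones arrive.
            self.buffer = deque(maxlen=seconds * hz)

        def record(self, frame):
            # frame: dict of vehicle pose, detected objects, signal states, etc.
            self.buffer.append((time.time(), frame))

        def on_collision(self):
            # Snapshot the buffer so the pre-crash sequence of events and
            # decisions is preserved for investigators -- ideally written to
            # tamper-evident storage, per the reply below.
            return list(self.buffer)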
black boxes need to be safe from manufacturers deleting data (Score:2)
Black boxes need to be safe from manufacturers deleting data so they can't cover up when their software fails.
Re: (Score:2)
Black boxes need to be safe from manufacturers deleting data so they can't cover up when their software fails.
Yes, I said that.
Good and bad (Score:2)
As you know, then, the FDA changed its rules on adverse-effect reporting a few years ago, putting the burden on device manufacturers to figure out if their devices are causing issues or not. These changes came with little to no guidance on how to properly report these incidents. The FDA will let you know you are doing something wrong when they fine or sue you.
There are good regulatory regimes and there are bad ones. With the government structured the way it is, it's a bit of a crap-shoot as to what you
Bad comparison (Score:2)
If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars.
How are these things comparable? We require medical devices to prove that they work because quackery [wikipedia.org] is a real thing with real consequences. As such we require medical device manufacturers to prove that they actually provide therapeutic value. It's expensive but the alternative of people not being able to trust the treatment is FAR more expensive. People who are ill and in need of treatment aren't routinely able to determine for themselves whether a device works or is well made and most medical treatmen
Re: (Score:2)
As such we require medical device manufacturers to prove that they actually provide therapeutic value.
The regulations are mostly for safety, such as proving that the device cannot deliver an electric shock from the mains supply while wires are attached to your body. Devices must be designed so that they are safe even in the case of a single failure, such as broken insulation or a faulty component.
Medical device regulations (Score:2)
The regulations are mostly for safety, such as proving that the device cannot deliver an electric shock from the mains supply, while wires are attached to your body.
Of course the regulations are for safety but there is a lot more to it than that. My company makes wiring harnesses for heart and lung machines so I'm more familiar than I really care to be with what is required. There are a lot of requirements regarding how they are made, the quality systems, traceability of materials and processes, calibration of equipment, and more. Product design is of course a piece of the puzzle too but it's not the only piece by a long shot. I've seen the FDA crawl up the ass of
There's only one way to find out what to regulate (Score:2)
What came first, the automobile or traffic laws? The airplane or the FAA?
The nature of safety regulations is such that they must be reactive. Some issues can be predicted, like liability in case of a crash, but most will only be revealed after implementation.
Re: (Score:2)
What came first, the automobile or traffic laws? The airplane or the FAA?
Wrong question. The right questions are, "How long did commercial passenger flights operate before the government started creating aircraft safety regulations?", and "Whose idea was it to have the Federal government get into the business of creating aircraft safety regulations?" By the way, the FAA wasn't created until 1958. The Department of Commerce was the original regulating authority, and actually took over some functions from the Post Office Department.
The first scheduled commercial airline flight was in 1914. The Air Commerce Act of 1926 gave the Federal government the power to regulate civil aviation. This legislation was passed at the urging of the aviation industry, whose leaders believed the airplane could not reach its full commercial potential without federal action to improve and maintain safety standards.
So, if you want to draw parallels between airplanes and self-driving cars, you should be asking why Waymo and other developers of the technology aren't asking for government regulation, not maintaining that no regulation should be required.
Re: (Score:2)
I agree that a lot of regulation will come after. We have no idea what protections will need to be in place to make it safe. There's going to be some trial and error.
I suspect a big part of the lack of regulation is that the government doesn't know how to regulate it properly. That is to say, when the government designs the test, people are rightly or wrongly going to assume that it is safe. They can put whatever disclaimers they want on it; that will be the perception of people and companies. Companies wil
Well (Score:2)
The standard shouldn't be "Is it perfectly safe" - it should be "is it safer than humans"...
Humans make shitty drivers...
I give it a year. (Score:2)
Book taxi to drive somewhere quiet for a pickup.
Put bomb in dummy on back seat while dressed head-to-toe in black.
Set taxi destination to some sensitive location, or as near as damn it.
Boom.
Wait for the fallout as they realise the only thing they have is a rural location and a pre-pay credit card to link it all back to.
liability (Score:2)
the rule should be "For the purpose of liability and insurance requirements, any self-driving car is considered as being driven by the CEO"
So when John Krafcik goes to jail for one of the cars killing a pedestrian or passenger, then maybe they'll slow down the rollout.
Re: Waymo is not Uber (Score:2)
All they need to do is kill one pedestrian, and then it will be the same shit.
Re:Waymo is not Uber (Score:5, Funny)
How many adversarial attacks have those cars been exposed to? Just because a car can safely drive down a standard road while supervised doesn't mean it can't fail in catastrophic ways when exposed to non-standard situations. You don't want to have some jokesters paint stripes down a cliff and the cars blindly driving to their doom Wile E. Coyote style.
I wouldn't trust those cars one bit until they have been shown to be able to handle freak situations in a reasonable way.
Re:Waymo is not Uber (Score:5, Insightful)
I wouldn't trust those cars one bit until they have been shown to be able to handle freak situations in a reasonable way.
That statement should apply to both self-driving and human cars: No human-driven cars should be allowed on the road until humans have been shown to handle freak situations in a reasonable way. Sadly, this is provably not the case.
Re: (Score:2)
The fact that you think people function like computers is amusing... and revealing.
When it comes to driving, there is no real functional difference except that computers don't get emotional, so they won't get themselves into an emotional state which causes them to make mistakes. Even immediately after a near-collision, the computer will still have the same lack of emotions that it had right before a near-collision. That means that it will retain control of the vehicle in situations where many humans would simply stop functioning, i.e. "panic" and "freeze up". The computer can sustain more
Re: (Score:2)
Emotions are exactly what prevent humans from entering into dangerous situations in the first place.
They do no such thing. They can dissuade some humans from getting into some situations, but they can also encourage some humans to get into some other situations. People think they can do things that they can't do because of emotions all the time, with disastrous results.
You have emphasized exactly why a person painting lines to a cliff is a valid concern.
It's a valid concern because human actors do dumb things. And the vehicles compare data from multiple sources (GPS, sensors, etc.) so they will not be easily fooled in this manner. Not just that, but the vehicle looks to see if there is a r
Re: (Score:2)
That fact that you don't think people function like computers is tragic... and revealing.
Seriously. If you don't know how dumb people are, you really should educate yourself [theweek.com].
Re: (Score:3)
I wouldn't trust those cars one bit until they have been shown to be able to handle freak situations in a reasonable way.
That statement should apply to both self-driving and human cars: No human-driven cars should be allowed on the road until humans have been shown to handle freak situations in a reasonable way. Sadly, this is provably not the case.
Humans are a few orders of magnitude better than the numbers that Waymo posted for their cars. Typical humans go hundreds of thousands of miles without needing an intervention. Waymo cars (according to their own data) need an intervention every 5,600 miles or so.
So, yeah, humans may not be perfect, but according to the numbers they're a hell of a lot better than the best SDCs available so far.
Re: (Score:2)
That statement should apply to both self-driving and human cars
No, because they're not the same at all. Self-driving cars are in many respects attractive nuisances. If you don't think they will be tempting targets for malicious interference (e.g., spray paint or laser pointers aimed at cameras), then you haven't thought about it enough.
I understand that Waymo and others have done a lot of testing and undoubtedly have their own sets of standards they think are sufficient to ensure a reasonable level of safety, but I don't have access to those standards or the te
Re: (Score:2)
Also, if a particular model of self-driving car has a flaw that causes multiple accidents, there will be massive pressure to fix it. The pressure may come from a government decertifying that model, or it may come from liability issues. I'd be happy with more government involvement here (governments are slow, but faster than the free market in these cases), but either way it will be fixed. Taking away someone's license for a year doesn't make them a better driver.
Re: (Score:2)
How many adversarial attacks have those cars been exposed to? Just because a car can safely drive down a standard road while supervised doesn't mean it can't fail in catastrophic ways when exposed to non-standard situations. You don't want to have some jokesters paint stripes down a cliff and the cars blindly driving to their doom Wile E. Coyote style.
I wouldn't trust those cars one bit until they have been shown to be able to handle freak situations in a reasonable way.
Humans fail in catastrophic ways when exposed to non-standard and standard situations all the time. Nothing will ever be perfect. That includes self-driving cars. The question is: do they have a significant potential to be a lot safer than human drivers? Is a self-driving car more likely to:
eat a burger while driving down the road?
text?
rubberneck?
weave in and out of traffic?
race its buddies on the highway?
engage in road rage?
speed?
put on makeup?
etc., etc.
compared to a human?
Re: (Score:2)
And that Arizona specialty, blasting down a freeway going the wrong way after the bars close.
Re: (Score:2)
I just had to swerve out of an exit lane, pass someone, and get back into it in front of them because they were going 30 under the speed limit at the start of it and swerving all over the place. I was damn worried that the cars behind me exiting were going to plow into the back of me if I slammed on my brakes and went his speed. As I passed the jackass, I could see him pounding down a messy sub with both hands and driving with his elbow. And this means that he pulled it out, unwrapped it, and starte
Re: (Score:2)
I have an adversarial scenario for you: turn off the traffic lights at any intersection and then watch as the brilliant, adaptable humans fail to treat it as a four-way stop and just zoom on through.
Re: (Score:2)
There's an intersection on my way to work, with traffic lights that are disabled early in the morning. There's a shocking number of cars (at least half) that no longer have any clue what to do, because they are so used to the lights working. I've seen all kinds of crazy behavior, such as cars not yielding when they should, yielding when they shouldn't, or going at 3 mph across the intersection because they realize they have no idea what they're doing.
Re: (Score:2)
At least the people yielding when they shouldn't or going 3 mph have a sensible fail safe :)
The scary ones are the people who are like, huh, lights out... I guess I should just plow through the intersection as if there was no stop light at all!
Re:It is inevitable (Score:5, Insightful)
It is inevitable that they pass laws allowing machines to kill x number of people. It can be no other way. And that will be a major devaluing of human life.
The question is, will it be more or less devaluing than the currently allowed rate of 40K+ people a year killed by human-driven cars in the US?
Insurance = valuing a human life (Score:2)
It is inevitable that they pass laws allowing machines to kill x number of people. It can be no other way. And that will be a major devaluing of human life.
The statistics for car accidents are no mystery and are objectively rather appalling, yet we seem to be largely OK with the current state of affairs. We are letting machines operated by humans kill X number of people even though we have the technical ability to reduce this number any time we want. Whether a person is killed by a programmed machine or by a mistake a human makes directly operating a machine is really of no consequence to the dead. Insurance is literally a valuation of human life and
Re: (Score:2)
Why is that inevitable? 1.3 million die in auto accidents every year. Life is already devalued. Historically, life has never been valued more highly. You used to lose dozens of people building a stinking bridge - now the largest public infrastructure project in the US has had one minor injury [chicoer.com]. The $4 billion Tappan Zee bridge was completed without any deaths or serious injuries.
Re: (Score:2)
That's bull, because if a human driver kills a human, then that human has made a mistake.
In most cases, the driver's insurance just pays the damage, and the human climbs back into a new car without personal consequences. For driverless cars, that would be the same.
Re:It is inevitable (Score:4, Interesting)
And then THEY ALL can be fixed, probably with an over-the-air update. No drunk driving PR campaign, no driver outreach or training, no expensive modification of signage or intersections, etc. This is an advantage. Only a hardware limitation would be similar to the current paradigm, and even then it would be no worse.
Re: (Score:3)
Also true of signage, intersection design, human training, etc.
Re: (Score:3)
If that is your threshold, then just go away unconvinced because that kind of analysis isn't going to happen in a slashdot discussion thread. Comparing the systemic complexity of transportation system design with and without the added variable of artificial intelligence is thesis material, at least. All you'll get out of me is that they are both very hard, subject to continuous change and improvement, and both very imperfect. No, I don't know the relative magnitudes and neither do you.
Re: (Score:2)
It's not ridiculous at all. Your assumption that the existing system is simple is ridiculous. 30,000 automotive deaths per year is a sufficiently large problem to solve such that if it were easy it would have been done by now. You are going after me for data - where's your data? We are in the same position.
over-the-air updates need to be free with no roaming fees (Score:2)
Over-the-air updates need to be free, with no roaming fees.
Or do you really want a car that needs to go to the dealer every X months for a $200 software update ($60 software + $140 labor), and maybe even BS like "the map data needs more HDD space, come to the dealer for a $260 500GB SSD + $150 labor to install it"?
Re: (Score:2)
Aren't we getting a little ahead of ourselves? :)
Re: (Score:3)
The Tesla that killed the dude by driving straight into a concrete highway barrier had just gotten an update. The previous version wouldn't have wrecked like that.
That was just lane following, a much simpler problem than general autonomous driving.