
Fully Driverless Waymo Taxis Are Due Out This Year, Alarming Critics (arstechnica.com)

Alphabet's Waymo is launching a driverless taxi service in Phoenix in the next three months -- and it's open to the public. But due to the limited regulations surrounding self-driving cars, many critics argue that more regulations are needed to ensure the safety of these vehicles before they roll out for public and commercial use. Ars Technica reports: If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars. Federal and state laws allow Waymo to introduce fully self-driving cars onto public streets in Arizona without any formal approval process. That's not an oversight. It represents a bipartisan consensus in Washington that strict regulation of self-driving cars would do more harm than good.

Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road." But so far these advocates' demands have fallen on deaf ears. Partly that's because federal regulators don't want to slow the introduction of a technology that could save a lot of lives in the long run. Partly it's because they believe that liability concerns give companies a strong enough incentive to behave responsibly. And partly it's because no one is sure how to regulate self-driving cars effectively. When it comes to driverless cars, "there's no consensus on what it means to be safe or how we go about proving that," says Bryant Walker Smith, a legal scholar at the University of South Carolina.

  • ...Google has been testing driverless cars for years now?

  • Can't wait (Score:3, Insightful)

    by nospam007 ( 722110 ) * on Thursday October 04, 2018 @05:30AM (#57423358)

    "Currently, there's no comparable requirement for self-driving cars."

    Human drivers cause 6.5 million accidents per year, killing tens of thousands of people and injuring several million.
    This can only be better.
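
    A quick sanity check on that claim (a hedged back-of-the-envelope in Python; the ~3.2 trillion vehicle-miles-traveled figure is an outside assumption based on US DOT statistics, not something from this thread):

      accidents_per_year = 6.5e6        # the parent's figure
      vehicle_miles_per_year = 3.2e12   # assumed: approximate annual US VMT

      miles_per_accident = vehicle_miles_per_year / accidents_per_year
      print(f"~one accident per {miles_per_accident:,.0f} human-driven miles")
      # ~one accident per 492,308 miles -- the rough benchmark a
      # driverless fleet would have to beat for "better" to hold.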

    • Developer data acquired over the past decade of real street testing strongly indicates self-driving cars would save lives. Is a government safety certification process going to accomplish anything these companies have not already considered? What of the lives that can be saved in the meantime?

      "Self-driving car advocates argue that slowing down the development of self-driving cars could ultimately cost more lives than it saves. In 2016, more than 37,000 people died from highway crashes, with many being caus

      • by djinn6 ( 1868030 )

        As previous accidents have shown, a human is a lousy backup for a self-driving car. Very few humans, if any exist at all, can maintain constant vigilance over a system that does what it's supposed to do 99.999% of the time.

        We don't understand self-driving tech and its failure modes well enough to create the equivalent of a driving test. People will die before that can be developed, and the faster and more frequently they die, the faster the development. The only alternative is to let someone else de...

      • by Zocalo ( 252965 ) on Thursday October 04, 2018 @06:16AM (#57423490) Homepage

        Stipulate the special scenarios where the human has to take over and let the machines handle all other scenarios.

        Fine for a family car with self-drive (or even Tesla's driver assistance) functionality, but we're specifically talking about self-driving taxis here. What if the humans in the taxi are using it because they are not able (legally and/or physically) to drive, and that's why they are taking the taxi in the first place? It's going to be kind of limiting for a taxi if it can only accept passengers who are able to drive it as well, especially on a Friday or Saturday night. Are they also going to have to require the designated driver to scan their driver's license and breathe into a breathalyzer before setting off?

        I guess there would be slightly less chance of the floor getting covered in vomit though, so there is that.

      • by Luthair ( 847766 )

        Governments have safety in mind; people have career progress and short-term stock prices in mind.

        Waymo cars required an intervention every 5,600 miles, and the average American driver drives 13,000 miles a year. The average human doesn't crash three times a year.

        • Governments have safety in mind; people have career progress and short-term stock prices in mind.

          Not this government. This government is run on career progress and short-term stock prices as well.

        • Without knowing the circumstances of each "disengagement", I think it is presumptuous to assume they would all have led to accidents.

          It could be analogous to a human driver pulling over to the side of the road in heavy rain - a fail-safe.

      • by jbengt ( 874751 )

        Developer data acquired over the past decade of real street testing strongly indicates self-driving cars would save lives.

        From what I've seen, the opposite is the case. Except under very limited, well-controlled circumstances, current self-driving cars are significantly worse than the average human driver, despite all the faults of human drivers. That's not to say self-driving cars won't get better than humans eventually, just that I'm guessing that Waymo is going to severely limit the conditions under which...

    • Umm, no... It definitely COULD be worse with driverless cars. There are a lot of numbers bigger than 6.5 million. Obviously there is no point to the endeavor if the driverless cars have a worse accident rate than human-driven cars, but it's certainly among the possible outcomes. And there definitely will be fatalities and injuries with driverless cars. We're just hoping for (substantially) better than current technology.

  • Testing? (Score:3, Insightful)

    by SCVonSteroids ( 2816091 ) on Thursday October 04, 2018 @06:04AM (#57423454)

    Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road."

    What does she assume, that this whole time self-driving cars have just been something in people's heads? Every company that's in on this technology brags about its logged road time... Glad she ain't my prof.

    • Re:Testing? (Score:5, Insightful)

      by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Thursday October 04, 2018 @06:39AM (#57423588) Homepage

      It's easy to log millions of miles of road when you are the one choosing which roads to drive and when to drive.

      Proper independent testing needs to test when those cars will fail, not just that they will work well under expected conditions.

      • by Kjella ( 173770 )

        It's easy to log millions of miles of road when you are the one choosing which roads to drive and when to drive. Proper independent testing needs to test when those cars will fail, not just that they will work well under expected conditions.

        While that's kinda true, you run into all sorts of drivers, trucks, bikes, trikes, bicycles, pedestrians, wild animals, etc. unless you're extremely selective about where you go. I mean, the road artifacts aren't that many; sure, if you plan a route with no roundabouts or train crossings you never do those, but there aren't that many "fixed situations". Most of the crazy stuff happens at random due to traffic; you can't plan your way out of much other than road conditions.

      • According to Waymo's latest disengagement report for 2017 [ca.gov], their disengagement rate is once per 5,600 miles. Considering the average American drives around 13,000 miles per year based on DOT statistics, this equates to a disengagement incident every 5 months per vehicle. If the car is fully autonomous without a monitoring driver, each of those disengagement incidents would translate into an accident. Any human that had an accident every 5 months would find themselves uninsurable in short order.

        I don't think t...
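
        A back-of-the-envelope check of those figures (a hedged Python sketch; the inputs are the 5,600- and 13,000-mile numbers quoted above):

          miles_per_disengagement = 5600   # Waymo's 2017 report, per the parent
          miles_per_year = 13000           # average annual US driver mileage

          disengagements_per_year = miles_per_year / miles_per_disengagement
          print(f"~{disengagements_per_year:.1f} disengagements per vehicle-year")
          print(f"~one every {12 / disengagements_per_year:.1f} months")
          # ~2.3 per vehicle-year, i.e. roughly one every 5.2 months, which
          # is where the "every 5 months" figure above comes from.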

        • If the car is fully autonomous without a monitoring driver, each of those disengagement incidents would translate into an accident.

          Not necessarily. Disengagement can occur when someone sees something unexpected. For example, in the case of the flipped Uber, the driver disagreeing with the car and causing the disengagement directly led to the accident. I also can't find the source for it, but I remember seeing a report a while back talking about things like disengagement being required when the car suddenly stopped unexpectedly and the driver took over until the situation could be analysed... which happened a few seconds later when a cyclist...

          • Disengagements are a concern, but saying that without the driver they would translate to an accident is completely false.

            I've definitely read about a number of cases where disengagement meant pulling over and stopping. I'd be really interested in a summary of what the end results of the disengagements were. If it's 10% accidents, that might be cause for concern, because it's on the order of an accident every 4-5 years. If it's 1% accidents, that's better than human drivers, I'm guessing.
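
            The parent's guesses are easy to quantify (a sketch; the 10% and 1% fractions are the commenter's hypotheticals, not measured data):

              disengagements_per_year = 13000 / 5600   # ~2.3, from the thread above

              for accident_fraction in (0.10, 0.01):   # hypothetical fractions
                  accidents_per_year = disengagements_per_year * accident_fraction
                  print(f"{accident_fraction:.0%} -> one accident every "
                        f"{1 / accidents_per_year:.1f} years")
              # 10% -> one accident every ~4.3 years; 1% -> every ~43 years.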

      • It's a taxi service. They can always limit service to areas they're comfortable driving in. 80% coverage would still be plenty profitable. Heck, 50% would.
    • This sentiment is common among people who favor heavy regulation of damn near everything. And I get it - engineering teaches you about the high price of engineering failures.

      My beef with this mindset is the same problem I have with central planning - the idea that you will guess correctly and prevent failure is hilariously arrogant. It is better to regulate after there is a demonstrated need - in other words, institutionalize the process of learning from failures. If you want to be proactive, rather than ha...

  • I don't worry much about the cars themselves, but about the potential for legislation mandating that we have self-driving cars. If you are old enough, you'll recall all the arguments about motorcycle helmet laws: how motorcycle accidents without helmets overloaded ERs and cost the public a lot of money in medical care. There is a clear precedent for my fears; the argument is the same. Don't forget that driving is a privilege, not a right.

    • I don't worry much about the cars themselves, but about the potential for legislation mandating that we have self-driving cars. If you are old enough, you'll recall all the arguments about motorcycle helmet laws: how motorcycle accidents without helmets overloaded ERs and cost the public a lot of money in medical care. There is a clear precedent for my fears; the argument is the same. Don't forget that driving is a privilege, not a right.

      Eventually it WILL be better for everyone that EVERYONE be mandated to have a self-driving car.

      I don't expect that to happen anytime soon. Probably not for at least another 30 years or more... but eventually yes, people won't drive themselves... and that's a good thing.

      • Eventually it WILL be better for everyone that EVERYONE be mandated to have a self-driving car.

        A self-driving car is cool. A car that reports everywhere you go to Google, because they don't have enough information about you, is very much not cool.

    • by bazorg ( 911295 )

      I'm all for car drivers and pedestrians wearing crash helmets. As well as offering protection from impact, making HUDs mainstream would be safer than walking or driving around looking down at smartphone screens.

  • by monkeyxpress ( 4016725 ) on Thursday October 04, 2018 @06:26AM (#57423534)

    I used to work for a medical device manufacturer. While having to deal with a lot of regulations was certainly annoying (mostly because they are written by lawyers and you need to be a lawyer to really understand them), the great thing about them was that once you complied, you didn't have to worry nearly as much about liability. If there were no regulations (basically a form of self-regulation), then how exactly do you prove that you were not negligent? Maybe you think all the tests you did were enough. Maybe the lawyers you hired for advice thought so too. But you'll never know until it is tested in court. With regulations it is more-or-less black and white as to whether you have done enough to absolve yourself of responsibility for unforeseen events.

    Another important point is that regulation creates a powerful barrier to entry in a market. The infrastructure required (in terms of processes and procedures) is immense, and large companies can gain economies of scale for this work. While the tech is enough of a barrier to entry right now, as time goes on this will change for driverless vehicles as well.

    • If there were no regulations (basically a form of self-regulation), then how exactly do you prove that you were not negligent?

      This is a much easier problem for makers of self-driving systems because one of the primary functions of such systems is to gather large amounts of data about the precise locations of the vehicle and all other vehicles, pedestrians, cyclists and other objects around it, as well as the status of any traffic signals, etc. So as long as the self-driving system stores at least a few minutes of data and retains it all in the event of an accident, the precise sequence of events and the decisions made by the self-driving system...
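
      A minimal sketch of that "retain the last few minutes of data" idea (a Python illustration with made-up names; this is not any real vendor's API):

        import time
        from collections import deque

        class BlackBoxRecorder:
            """Rolling buffer of telemetry frames; an incident freezes a copy."""

            def __init__(self, hz=10, minutes=5):
                # Fixed-size ring buffer: the oldest frames fall off automatically.
                self.frames = deque(maxlen=hz * 60 * minutes)

            def record(self, frame):
                """Append one frame (pose, nearby objects, signal states, ...)."""
                self.frames.append((time.time(), frame))

            def snapshot_on_incident(self):
                """Return an immutable copy for post-accident reconstruction."""
                return tuple(self.frames)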

    • As you know, then, the FDA changed its rules on adverse effect reporting a few years ago, putting the burden on device manufacturers to figure out if their devices are causing issues or not. These changes came with little to no guidance on how to properly report these incidents. The FDA will let you know you are doing something wrong when they fine or sue you.

      There are good regulatory regimes and there are bad ones. With the government structured the way it is, it's a bit of a crap-shoot as to what you...

  • If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars.

    How are these things comparable? We require medical devices to prove that they work because quackery [wikipedia.org] is a real thing with real consequences. As such we require medical device manufacturers to prove that they actually provide therapeutic value. It's expensive, but the alternative of people not being able to trust the treatment is FAR more expensive. People who are ill and in need of treatment aren't routinely able to determine for themselves whether a device works or is well made, and most medical treatments...

    • As such we require medical device manufacturers to prove that they actually provide therapeutic value.

      The regulations are mostly for safety, such as proving that the device cannot deliver an electric shock from the mains supply while wires are attached to your body. Devices must be designed so that they are safe even in the case of a single failure, such as broken insulation or a faulty component.

      • The regulations are mostly for safety, such as proving that the device cannot deliver an electric shock from the mains supply while wires are attached to your body.

        Of course the regulations are for safety, but there is a lot more to it than that. My company makes wiring harnesses for heart and lung machines, so I'm more familiar than I really care to be with what is required. There are a lot of requirements regarding how they are made, the quality systems, traceability of materials and processes, calibration of equipment, and more. Product design is of course a piece of the puzzle too, but it's not the only piece by a long shot. I've seen the FDA crawl up the ass of...

  • and that's to put the tech into widespread use.

    What came first, the automobile or traffic laws? The airplane or the FAA?

    The nature of safety regulations is such that they must be reactive. Some issues can be predicted, like liability in case of a crash, but most will only be revealed after implementation.

    • by tsqr ( 808554 )

      What came first, the automobile or traffic laws? The airplane or the FAA?

      Wrong question. The right questions are, "How long did commercial passenger flights operate before the government started creating aircraft safety regulations?", and "Whose idea was it to have the Federal government get into the business of creating aircraft safety regulations?" By the way, the FAA wasn't created until 1958. The Department of Commerce was the original regulating authority, and actually took over some functions from the Post Office Department.

      The first scheduled commercial airline flight was in 1914. The Air Commerce Act of 1926 gave the Federal government the power to regulate civil aviation. This legislation was passed at the urging of the aviation industry, whose leaders believed the airplane could not reach its full commercial potential without federal action to improve and maintain safety standards.

      So, if you want to draw parallels between airplanes and self-driving cars, you should be asking why Waymo and other developers of the technology aren't asking for government regulation, not maintaining that no regulation should be required.

    • I agree that a lot of regulation will come after. We have no idea what protections will need to be in place to make it safe. There's going to be some trial and error.

      I suspect a big part of the lack of regulation is that the government doesn't know how to regulate it properly. That is to say, when the government designs the test, people are, rightly or wrongly, going to assume that it is safe. They can put whatever disclaimers they want on it; that will be the perception of people and companies. Companies will...

  • The standard shouldn't be "Is it perfectly safe" - it should be "is it safer than humans"...

    Humans make shitty drivers...

  • Even if safety is "covered", doesn't anyone notice how often abnormal road situations occur? (Pay attention next time, you'll see what I mean.) Anything out of the norm would snarl these things and all the traffic behind them, daily.
  • Book taxi to drive somewhere quiet for a pickup.

    Put bomb in dummy on back seat while dressed head-to-toe in black.

    Set taxi destination to some sensitive location, or as near as damn it.

    Boom.

    Wait for the fallout as they realise the only thing they have is a rural location and a pre-pay credit card to link it all back to.

  • the rule should be "For the purpose of liability and insurance requirements, any self-driving car is considered as being driven by the CEO"

    So when John Krafcik goes to jail for one of the cars killing a pedestrian or passenger then maybe they'll slow down the rollout.

  • I think we're about to see just how good humans are at driving.

"All the people are so happy now, their heads are caving in. I'm glad they are a snowman with protective rubber skin" -- They Might Be Giants

Working...