NHTSA Gives Green Light To Self-Driving Cars

New submitter tyme writes: Reuters reports that the U.S. National Highway Traffic Safety Administration (NHTSA) told Google that it would recognize the artificial intelligence in a self-driving car as the "driver" (rather than any of the occupants). The letter also says that NHTSA will write safety rules for self-driving cars in the next six months, paving the way for deployment of self-driving cars in large numbers.

  • Instance or class? (Score:5, Interesting)

    by iapetus ( 24050 ) on Wednesday February 10, 2016 @10:55AM (#51479019) Homepage

    So is each individual instance of an AI a driver? Each version of the software? Each combination of hardware and software?

    If a single car is found to be doing something that would have its license revoked, does that car lose its license, or are all Google cars immediately banned from driving? Would a version tweak cause that license to be reinstated, or would Google be out of the self-driving-car business?

    • I would imagine that's part of the rules being written over the next six months. Rationally, we know the correct answer is "it depends": a software error could get the license revoked across the board until an update resolved the situation, while a mechanical error would just get the individual car fixed.

  • by Lodlaiden ( 2767969 ) on Wednesday February 10, 2016 @10:57AM (#51479031)
    If I don't have to lose an hour each way on maintaining moderate concentration, moving out of the suburbs into the country suddenly becomes feasible. Sweet! NHTSA approval is a major milestone in this becoming a reality.
    • by Ichijo ( 607641 )

      You won't even have to pay for parking, because it will drop you off at the entrance and then circle around until you're ready to leave. Or if you work where there's free 1-hour parking, just put your car into "musical chairs" mode. Take that, meter maids!

  • So if the A.I. is the driver under the law, who needs to buy the insurance? Me or the company previously known as Google?

  • by ledow ( 319597 )

    Wonder how much Google's public liability insurance premium just increased by.

    Because, sorry, but the "AI" is really just a set of rules still. A set of rules that can't take account of every situation. Sure, it can drive more carefully than a human driver, but it can also make just the same kind of dumb mistakes as a human driver too.

    But with the consequence that the first accident of note will result in all kinds of problems for EVERY instance of that model running in EVERY model of that self-driving car, rather than just a single driver being an idiot.

    • Re: (Score:3, Insightful)

      by Nemyst ( 1383049 )

      But with the consequence that the first accident of note will result in all kinds of problems for EVERY instance of that model running in EVERY model of that self-driving car, rather than just a single driver being an idiot.

      Assuming that accident is the fault of the AI, then you can reasonably expect a patch within a week, a month if the issue is extremely complicated. Good luck fixing human drivers this efficiently.

    • by PPH ( 736903 )

      Wonder how much Google's public liability insurance premium just increased by.

      Nothing. It's called Alphabet now. The self driving car subsidiary has probably been left holding zero assets to handle the possibility that a horrible accident occurs and the victims try suing the company.

    • Because, sorry, but the "AI" is really just a set of rules still. A set of rules that can't take account of every situation. Sure, it can drive more carefully than a human driver, but it can also make just the same kind of dumb mistakes as a human driver too.

      Yes, but at the heart of the algorithm is a big overriding rule of "don't drive off the road or hit anything". That's pretty cut and dried as far as rules go. The car's hardware can literally see in every direction and track everything around it, static or moving. It will react to danger and determine the best course of action before most humans even recognize there's a problem. Unless there's a really serious flaw in the system, that means at worst the car is going to come to a stop or simply avoid the hazard.
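
      A minimal sketch of the kind of overriding rule described above, assuming a simple braking model; the class, field, and parameter names and all of the numbers are made up for illustration and are not from Google's actual stack:

      # Toy "don't hit anything" override; every name and number here is hypothetical.
      from dataclasses import dataclass

      @dataclass
      class TrackedObject:
          distance_m: float  # range to the object along our planned path
          in_path: bool      # does it intersect our planned trajectory?

      def stopping_distance(speed_mps: float, decel_mps2: float = 6.0,
                            reaction_s: float = 0.1) -> float:
          """Distance needed to stop: reaction travel plus v^2 / (2 * a)."""
          return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

      def emergency_action(speed_mps: float, tracked: list) -> str:
          """Brake if anything in our path is inside the stopping envelope."""
          margin = 1.5  # safety factor on top of the computed stopping distance
          threshold = stopping_distance(speed_mps) * margin
          if any(obj.in_path and obj.distance_m < threshold for obj in tracked):
              return "BRAKE"
          return "CONTINUE"

      # At 15 m/s (~34 mph) with something 20 m ahead in our path, the check fires:
      print(emergency_action(15.0, [TrackedObject(distance_m=20.0, in_path=True)]))

      The point is that this constraint sits above whatever route planning is going on: if the check fires, the car stops regardless of what the rest of the "set of rules" wanted to do.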

  • by NEDHead ( 1651195 ) on Wednesday February 10, 2016 @11:18AM (#51479203)

    Until they can flip a bird and show unambiguous road rage

  • by laurencetux ( 841046 ) on Wednesday February 10, 2016 @11:20AM (#51479227)

    Phase 1: limit AI driven cars to say 35mph or under "network" control (in either case Hazard Lights GO ON)

    Phase 2: increase speed by 10mph and put laws in place that a car in AI mode is exempted from DWI (as long as the car is driving directly HOME or to the nearest medical facility)

    Phase 3: increase speed by another 10 MPH (or to current speed limit)

    Phase 4: AI cars allowed to not have Hazard lights ON unless otherwise needed ....

    Phase N: AI cars allowed to travel without somebody in the car and to pickup children (note we had better have KITT level AIs in cars at this point)

    • by tnk1 ( 899206 )

      Phase 2: increase speed by 10mph and put laws in place that a car in AI mode is exempted from DWI (as long as the car is driving directly HOME or to the nearest medical facility)

      I don't think that the restriction is required. The problem with DWI isn't your location, it's your ability to control the vehicle safely. If a drunk guy wants his car to drive itself to the middle of nowhere, and it is under the control of the AI the entire way, I don't see why he'd be busted for operating a vehicle under the influence. He might wonder why he was at the county landfill in the morning when he sobered up, but he would have gotten there safely.

    • Phase 1: limit AI driven cars to say 35mph or under "network" control (in either case Hazard Lights GO ON)

      No. You are going to need a new signal. Hazard lights already have a meaning, and that meaning is "I am stopped or otherwise below the minimum speed for this roadway and am obstructing it; go around me when/if it is safe".

      • Trying not to add "Yet Another Light" for folks to have on the car, but your point is valid.

        Maybe lights on either side of the third brake light? (Flash yellow for auto mode, flash red for driver "offline" / call EMS.)

  • Do I still pay for insurance or can the cost of incidents be charged to the software maker?

    Will the cost of my insurance vary depending upon the safety record of the software provider?

    Will I be unable to use my vehicle if flaws are discovered in the software? This assumes that Big Brother can disable any vehicle or class of vehicles from a central control location. Which also assumes that Small Hacker can also disable vehicles. Which also assumes that forced updates will be required and that end user modifications will be locked out.
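
    The "disable any vehicle or class of vehicles" assumption boils down to the car consulting a central revocation list before it will enter self-driving mode. A rough sketch of that check, with an entirely hypothetical URL and data format:

    # Hypothetical fleet "kill switch" check; the URL, fields, and policy are
    # invented for illustration and are not any real manufacturer's API.
    import json
    import urllib.request

    REVOCATION_URL = "https://example.com/fleet/revocations.json"  # hypothetical

    def drive_permitted(model: str, software_version: str) -> bool:
        """Refuse self-driving mode if this model/version pair has been revoked."""
        with urllib.request.urlopen(REVOCATION_URL, timeout=5) as resp:
            revoked = json.load(resp)  # e.g. {"model-x": ["1.2.0", "1.2.1"]}
        return software_version not in revoked.get(model, [])

    The worry above follows directly: the same channel that lets a regulator ground a flawed class of vehicles is also an attack surface, and it only works if updates are forced and end-user modifications are locked out.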

  • If companies can be granted quasi-personhood status, why not a car AI? Are we ready to deal with the implications of car AI rights and car AI voting?
  • The Great Race ! (Score:5, Interesting)

    by swell ( 195815 ) <jabberwock@poetic.com> on Wednesday February 10, 2016 @11:52AM (#51479529)

    'The Great Race' was a 1965 movie and also an American tradition. Competitors race from one side of the country to the other in various vehicles with various rules.

    Self driving cars will surely do the same. They will be judged on safety and speed and technicalities like choosing the best route and handling obstacles. Car buyers will want this information and car makers will struggle to optimize their software to win the next race.

  • Google is on crack (Score:3, Interesting)

    by kheldan ( 1460303 ) on Wednesday February 10, 2016 @12:05PM (#51479651) Journal
    From TFA:

    Google "expresses concern that providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking... could be detrimental to safety because the human occupants could attempt to override the (self-driving system's) decisions," the NHTSA letter stated.

    Bullshit. Vehicles must have a full set of manual controls available to the human operator at all times, and furthermore the operator must be fully educated, trained, licensed, and insured, just like always. To do otherwise is what will put people's lives at risk. Google is smoking crack and needs to be put in their place.

    • by Anonymous Coward on Wednesday February 10, 2016 @01:21PM (#51480401)

      Maybe at first, but in the long term humans will not even know what it means to drive a vehicle. Ultimately it will come down to safety... once they get all this figured out it will just be too risky to have humans manually driving. Software can be buggy but humans are reckless. Software bugs can be fixed but we have not figured out an effective way to keep people from being reckless.

    • I strip away the old debris that hides a shining car
      A brilliant red Barchetta, from a better vanished time
      I fire up the willing engine, responding with a roar
      Tires spitting gravel I commit my weekly crime
    • Sure. We can let you have some control. All liability falls on you whenever you take control of part of the driving.

    • I assume it will only work if either the driver has 100 percent control or the AI has 100 percent. Yes, you can have your controls, but it's either you or the AI, not a mix.

    • I would expect the biggest and most lucrative application for self driving vehicles would be for freight, not commuters.
  • by Joe_Dragon ( 2206452 ) on Wednesday February 10, 2016 @12:10PM (#51479691)

    Tickets / civil liability / criminal liability?

    And with tickets you have

    a parking ticket issued by a private, non-governmental parking authority patrolling an office parking lot or shopping area

    private security guards issuing live speeding / moving tickets (non-state tickets) (some HOAs and some parking lots or shopping areas)

    moving tickets from a real cop

    parking tickets

    red light tickets (parking like)

    red light tickets (moving like)

    speed camera tickets (moving like)

    speed camera tickets (parking like)

    toll violations (I can see an auto driver car with bad DB info getting one in some settings)

    • I'm curious where you live that an office parking lot or shopping area can give you a ticket. Private property (here at least) doesn't grant you the authority to fine people. It just grants you the right to ask them to leave, have their car towed, etc.

      • Not near me, but there are places where they try BS like that. But let's say Google owns the car; there may be a big uptick in $10-$40 BS tickets that Google will just pay vs. fighting them.

      • Lots of places that give you the authority to park on the property with some type of pass issue tickets. Everything from movie studios to college campuses.

    • by fizzup ( 788545 )

      First, imagine this: the price of a self-driving car will be too high to allow it to sit idle in your garage. You will never, ever own a self-driving car. You can choose to buy a car that you drive yourself, or you can choose to consume Transportation-as-a-Service (TaaS). Uber will provide it. Google will provide it. Maybe car manufacturers will provide it. You can answer all your questions yourself if you replace "self-driving car" with "taxi."

      TaaS will be in competition with car ownership, so providers will have to price it competitively.

  • Seriously... I can't think of any way shape or form that the "AI" behind a "self-driving car" is anywhere near ready for full legal responsibility for this.

    Google (and/or other tech companies trying to get this to happen) must have placed tremendous pressure on them to make this happen.

    • by kqs ( 1038910 )

      Seriously... I can't think of any way shape or form that the "AI" behind a "self-driving car" is anywhere near ready for full legal responsibility for this.

      An AI cannot have legal liability; it is a machine. Depending on how this shakes out, either the auto seller (Google, Toyota) or the auto owner will provide insurance and have legal liability.

      And since human drivers are almost universally incompetent, as long as the AI driver is more competent than the average human (a low bar), the insurance will be cheaper than insurance for human drivers.
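
      That pricing claim is just expected-loss arithmetic; with purely illustrative (not actuarial) numbers:

      # Back-of-the-envelope premium comparison; every number below is illustrative.
      def pure_premium(crashes_per_100k_miles: float, miles_per_year: float,
                       avg_claim_usd: float) -> float:
          """Expected annual claim cost = expected crashes * average claim size."""
          expected_crashes = crashes_per_100k_miles * miles_per_year / 100_000
          return expected_crashes * avg_claim_usd

      human = pure_premium(0.4, 12_000, 18_000)  # ~$864/year before overhead
      ai    = pure_premium(0.1, 12_000, 18_000)  # ~$216/year before overhead
      print(human, ai)

      Whatever the real rates turn out to be, the premium scales with the crash rate, which is the point being made above.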

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...