NHTSA Gives Green Light To Self-Driving Cars
New submitter tyme writes: Reuters reports that the U.S. National Highway Traffic Safety Administration (NHTSA) told Google that it would recognize the artificial intelligence in a self-driving car as the "driver" (rather than any of the occupants). The letter also says that NHTSA will write safety rules for self-driving cars in the next six months, paving the way for deployment of self-driving cars in large numbers.
Instance or class? (Score:5, Interesting)
So is each individual instance of an AI a driver? Each version of the software? Each combination of hardware and software?
If a single car is found to be doing something that would have its license revoked, does that car lose its license, or are all Google cars immediately banned from driving? Would a version tweak cause that license to be reinstated, or would Google be out of the self-driving-car business?
Re: (Score:2)
I would imagine that's part of the rules being written over the next six months. Rationally, we know the correct solution is "it depends". A software error could get the license revoked across the board until an update resolved the situation, while a mechanical error would get an individual car fixed.
Re:Instance or class? (Score:4, Insightful)
I wouldn't be surprised if the "driver" was considered a "minor", and the person in the driver's seat was considered to be in an overseeing position with overall responsibility for ensuring that an accident did not occur. Obviously, that would require that the "adult" have the ability to take over safely and at least get the car pulled over to the shoulder or to evade a problem. More to the point, the "adult" would have to be paying attention to some degree.
I doubt that anyone is going to allow the driver to be completely off the hook for this, although they could simply set it up so that the owner was responsible if they were not keeping their car patched.
Re: (Score:2)
Keeping their car patched may be only part of the issue.
What if Ford says your car is two years old, no more updates, buy a new car?
What about a rented or non-owned car?
The state fails to update the software on traffic monitoring systems and sends out bad info.
The firemen, in a rush to save people, fail to set the road-blocked flags the right way.
The city waterworks crew called in on short notice to fix a pipe had no system to flag the road off.
The cable subcontractor is too cheap to buy the flagging system.
Re: (Score:2)
For the "driver" to be considered a minor, they'd have to be considered a person at all first. That's a massive stretch. I don't see it happening.
Re: (Score:2)
To be sure, I think the terminology will not be "minor" vs. "adult". I meant to imply that the AI might be considered similar to a permitted learner or restricted driver who requires certain supervision. I doubt we would attempt to consider the AI a person, and that is an entirely unnecessary development for this level of computing intelligence.
Re:Instance or class? (Score:5, Informative)
Actually, I've seen answers to all of those questions.
> If I own a self driving car, is my insurance insuring the AI as the driver?
Yes. Google has stated they will assume liability. Other companies pursuing this say the same.
> Is the driving record of that AI individual to my car, or to AI's of that software version ?
This one is actually easier. The insurance industry will have much better figures on the probability of having a claim to pay for the AI drivers, since all those drivers will drive the 'same'. They will be able to say that cars of model X get into .00001 accidents per car per year (or whatever), resulting in $2000 payouts per accident on average (or whatever), and thus each car will be expected to pay .00001 x $2000 x $INDUSTRY_MARKUP for insurance. Of course it gets a lot more complicated when you have to weigh in modifiers such as the weight of the vehicle (heavier cars cause more damage), the paint job (red cars get more tickets), the environment the car is in (urban cars get hit more), and so on.
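That back-of-the-envelope pricing can be sketched in a few lines of Python. Every number here, especially the markup and the modifier factors, is a made-up illustration, not real actuarial data:

```python
def ai_premium(accidents_per_car_year, avg_payout, markup, modifiers=()):
    """Expected annual premium for one car of a given AI model:
    expected payout per car-year times an industry markup, scaled
    by risk modifiers (vehicle weight, paint job, urban use, ...)."""
    premium = accidents_per_car_year * avg_payout * markup
    for factor in modifiers:
        premium *= factor
    return premium

# Numbers from the comment: .00001 accidents/car/year, $2000 per accident.
# The 1.5x markup and the 1.4x/1.2x modifiers are invented for illustration.
base = ai_premium(0.00001, 2000, markup=1.5)              # $0.03/year
urban_heavy = ai_premium(0.00001, 2000, 1.5, (1.4, 1.2))  # ~$0.05/year
```

Since every copy of a given software version drives the 'same', the per-model accident rate could be estimated from fleet-wide data rather than individual driving records, which is the commenter's point.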
> Can I sue the AI, or am I suing the AI manufacturer. Is the AI the car, or separate from the car?
The manufacturer gets sued. The manufacturer would keep insurance and lawyers for these lawsuits.
> am I suing Google or Ford ?
You sue whoever sold you the car. One throat to choke.
Re: (Score:2)
This one is actually easier. The insurance industry will have much better figures on the probability of having a claim to pay for the AI drivers, since all those drivers will drive the 'same'.
They might always drive the 'same' but the conditions on the road are always different.
Re: (Score:2)
Shouldn't the "red car" factor disappear with A.I. drivers?
Re: (Score:2)
Re: (Score:2)
I don't see what you're accepting. If the AI is at fault, Google is paying. If the AI is not at fault, and you're not at fault, the other person's insurance is paying. It doesn't look like you're on the hook unless you personally are at fault, which is as it should be.
Re: (Score:2)
Actually, I've seen answers to all of those questions.
Yes, since all of these questions have already been answered, because SDCs are already on the road in many states. The only thing this NHTSA ruling does is help pave the way to private ownership of fully autonomous cars. But we already have corporations operating SDCs on public roads, and we already have private ownership of semi-autonomous cars, such as Tesla Autopilot, where the software is making decisions. So this ruling doesn't really change anything. It is just another incremental step.
Re: (Score:2)
I think that the only way Google could make those promises is if they operated as a livery service, where they remain the owners and maintainers of the vehicles and the people subscribe to the transportation service.
Re: (Score:2)
This is not a given. The plaintiff has the right to sue all parties the plaintiff believes contributed to the accident. It's up to the courts to dismiss the suit against the owner of the vehicle or the person determined to be in a position to take corrective action.
I don't believe the court will automatically dismiss the suit, since the person in the vehicle still played a role in the accident by not taking corrective action.
Re: (Score:2)
My bet would be that the AI will refuse to drive if it is not maintained well enough to do so safely. These systems will be redundant, so one failure should not cause an immediate crash.
Re:Instance or class? (Score:5, Funny)
You'll sue the car, it will then have to get a job as a taxi to work off the judgement.
Longer commute, here I come (Score:5, Insightful)
Re: (Score:2)
You won't even have to pay for parking, because it will drop you off at the entrance and then circle around until you're ready to leave. Or if you work where there's free 1-hour parking, just put your car into "musical chairs" mode. Take that, meter maids!
Re: (Score:2)
To DO things, perhaps? Your idea that (all? most? a lot?) work can be done with a computer and Internet connection is bizarre.
Re: (Score:2)
Well... if it was a salesperson using the car time for taking and making calls, they might still have to travel to their client sites for face-to-face meetings and demos.
The context does sound like an office worker, but it doesn't have to be.
Re: (Score:3)
Re: (Score:2)
Productive could also be "get all your online shopping done before you get to work". It doesn't necessarily mean productive for your employer.
Re: (Score:2)
Re: (Score:2)
Companies need to embrace telecommuting rather than putting up obstacles to it.
1. Be careful what you wish for. If a job can be done remotely from your house, then it can also be done remotely from Bangalore.
2. Have you ever worked for a company with many employees telecommuting? They tend to be dysfunctional, with many workers out-of-the-loop, and poor coordination. It is surprising how much companies rely on informal communication around the water cooler, or chance meetings in the break room.
Re: (Score:2)
Have you ever worked for a company with many employees telecommuting? They tend to be dysfunctional, with many workers out-of-the-loop, and poor coordination. It is surprising how much companies rely on informal communication around the water cooler, or chance meetings in the break room.
Correct you are.
The real key is communication skills. Most people don't have them, or have them just enough to get by.
Face-to-face communication is by far the best way to communicate critical or time-sensitive info. Email, phone calls, etc. will work, but they just add time and inconvenience.
Insurance? (Score:2)
So if the A.I. is the driver under the law, who needs to buy the insurance? Me or the company previously known as Google?
Re: (Score:2)
I think you don't need insurance if you are able to assume the costs of any potential liability. Most people can't do this, but it's no problem for Google.
Re: (Score:2)
They'll include 20 years of insurance?
Ouch (Score:2)
Wonder how much Google public liability insurance premium just increased by.
Because, sorry, but the "AI" is really just a set of rules still. A set of rules that can't take account of every situation. Sure, it can drive more carefully than a human driver, but it can also make just the same kind of dumb mistakes as a human driver too.
But with the consequence that the first accident of note will result in all kinds of problems for EVERY instance of that model running in EVERY model of that self-driving car,
Re: (Score:3, Insightful)
But with the consequence that the first accident of note will result in all kinds of problems for EVERY instance of that model running in EVERY model of that self-driving car, rather than just a single driver being an idiot.
Assuming that accident is the fault of the AI, then you can reasonably expect a patch within a week, a month if the issue is extremely complicated. Good luck fixing human drivers this efficiently.
Re: (Score:3)
Wonder how much Google public liability insurance premium just increased by.
Nothing. It's called Alphabet now. The self driving car subsidiary has probably been left holding zero assets to handle the possibility that a horrible accident occurs and the victims try suing the company.
Re: (Score:2)
Because, sorry, but the "AI" is really just a set of rules still. A set of rules that can't take account of every situation. Sure, it can drive more carefully than a human driver, but it can also make just the same kind of dumb mistakes as a human driver too.
Yes, but at the heart of the algorithm is a big overriding rule of "don't drive off the road or hit anything". That's pretty cut and dried as far as rules go. The car's hardware can literally see in every direction and track everything around it, static or moving. It will react to danger and determine the best course of action before most humans even recognize there's a problem. Unless there's a really serious flaw in the system, that means at worst the car is going to come to a stop or simply avoid the hazard.
I won't ever trust an AI driver (Score:5, Funny)
Until they can flip a bird and show unambiguous road rage
a way to do this "safely" (Score:3)
Phase 1: limit AI driven cars to say 35mph or under "network" control (in either case Hazard Lights GO ON)
Phase 2: increase speed by 10mph and put laws in place that a car in AI mode is exempted from DWI (as long as the car is driving directly HOME or to the nearest medical facility)
Phase 3: increase speed by another 10 MPH (or to current speed limit)
Phase 4: AI cars allowed to not have Hazard lights ON unless otherwise needed ....
Phase N: AI cars allowed to travel without somebody in the car and to pick up children (note: we had better have KITT-level AIs in cars at this point)
Re: (Score:2)
Phase 2: increase speed by 10mph and put laws in place that a car in AI mode is exempted from DWI (as long as the car is driving directly HOME or to the nearest medical facility)
I don't think that the restriction is required. The problem with DWI isn't your location, it's your ability to control the vehicle safely. If drunk guy wants his car to drive itself to the middle of nowhere, and it is under the control of the AI the entire way, I don't see why he'd be busted for operating a vehicle under the influence. He might wonder why he was at the county landfill in the morning when he sobered up, but he would have gotten there safely.
Re: (Score:2)
Phase 1: limit AI driven cars to say 35mph or under "network" control (in either case Hazard Lights GO ON)
No. You are going to need a new signal. Hazard lights already have a meaning, and that meaning is "I am stopped or otherwise below the minimum speed for this roadway which I am in, and thus obstructing it — go around me when/if it is safe".
Re: (Score:2)
I was trying not to add Yet Another Light for folks to have on the car, but your point is valid.
Maybe lights on either side of the third brake light? Flash yellow for auto mode; flash red for driver "offline" (call EMS).
legal considerations . (Score:2)
Do I still pay for insurance or can the cost of incidents be charged to the software maker?
Will the cost of my insurance vary depending upon the safety record of the software provider?
Will I be unable to use my vehicle if flaws are discovered in the software? This assumes that Big Brother can disable any vehicle or class of vehicles from a central control location. Which also assumes that Small Hacker can also disable vehicles. Which also assumes that forced updates will be required and that end user modifi
One step closer to personhood (Score:2)
The Great Race ! (Score:5, Interesting)
'The Great Race' was a 1965 movie and also an American tradition. Competitors race from one side of the country to the other in various vehicles with various rules.
Self driving cars will surely do the same. They will be judged on safety and speed and technicalities like choosing the best route and handling obstacles. Car buyers will want this information and car makers will struggle to optimize their software to win the next race.
Google is on crack (Score:3, Interesting)
Google "expresses concern that providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking... could be detrimental to safety because the human occupants could attempt to override the (self-driving system's) decisions," the NHTSA letter stated.
Bullshit. Vehicles must have a full set of manual controls available to the human operator at all times, and furthermore operators must be fully educated, trained, licensed, and insured, just like always. To do otherwise is what will put people's lives at risk. Google is smoking crack and needs to be put in their place.
Re:Google is on crack (Score:4, Insightful)
Maybe at first, but in the long term humans will not even know what it means to drive a vehicle. Ultimately it will come down to safety... once they get all this figured out it will just be too risky to have humans manually driving. Software can be buggy but humans are reckless. Software bugs can be fixed but we have not figured out an effective way to keep people from being reckless.
Oblig. (Score:2)
A brilliant red Barchetta, from a better vanished time
I fire up the willing engine, responding with a roar
Tires spitting gravel I commit my weekly crime
Re: (Score:2)
Sure. We can let you have some control. All liability falls on you whenever you take control of part of the driving.
Re: (Score:2)
I assume it will only work if the driver has 100 percent or the AI has 100 percent. Yes you can have your controls. But its either you or the AI not a mix.
Re: (Score:2)
Tickets / civil liability / criminal liability? (Score:3)
Tickets / civil liability / criminal liability?
And with tickets you have:
parking tickets issued by a private, non-governmental parking authority patrolling an office parking lot or shopping area
private security guards issuing live speeding / moving tickets (non-state tickets; some HOAs and some parking lots or shopping areas)
moving tickets from a real cop
parking tickets
red light tickets (parking-like)
red light tickets (moving-like)
speed camera tickets (moving-like)
speed camera tickets (parking-like)
toll violations (I can see an auto-driven car with bad DB info getting one in some settings)
Re: (Score:2)
I'm curious where you live that an office parking lot or shopping area can give you a ticket. Private property (here at least) doesn't grant you the authority to fine people. It just grants you the right to ask them to leave, have their car towed, etc.
Re: (Score:2)
Not near me, but there are places that try BS like that. But let's say Google owns the car; there may be a big uptick in $10-$40 BS tickets that Google will just pay rather than fight.
Re: (Score:2)
Lots of places that give you the authority to park on the property with some type of pass issue tickets. Everything from movie studios to college campuses.
Re: (Score:2)
college campuses have real cops.
Re: (Score:2)
First, imagine this: the price of a self-driving car will be too high to allow it to sit idle in your garage. You will never, ever own a self-driving car. You can choose to buy a car that you drive yourself, or you can choose to consume Transportation-as-a-Service (TaaS). Uber will provide it. Google will provide it. Maybe car manufacturers will provide it. You can answer all your questions yourself if you replace "self-driving car" with "taxi."
TaaS will be in competition with car ownership, so providers wi
Re: (Score:2)
Then who is willing to own an auto taxi without an IC driver to pass the blame to?
So how much were they bribed? (Score:2)
Seriously... I can't think of any way shape or form that the "AI" behind a "self-driving car" is anywhere near ready for full legal responsibility for this.
Google (and/or other tech companies trying to get this to happen) must have placed tremendous pressure on them to make this happen.
Re: (Score:3)
Seriously... I can't think of any way shape or form that the "AI" behind a "self-driving car" is anywhere near ready for full legal responsibility for this.
An AI cannot have legal liability; it is a machine. Depending on how this shakes out, either the auto seller (Google, Toyota) or the auto owner will provide insurance and have legal liability.
And since human drivers are almost universally incompetent, as long as the AI driver is more competent than the average human (a low bar), the insurance will be cheaper than insurance for human drivers.
Re: (Score:2)
I'm not sure what you are suggesting, or where the savings would supposedly come from. The market price for a car with insurance included would certainly be higher than the market price for a car with no insurance included.
Re: (Score:3)
I'm suggesting if Google is driving, and the passengers are passengers, then why the hell would anybody pay for things like liability insurance for an AI?
Could it be because it's still going to have a "fuck it, you drive" mode which passes responsibility to the human so Google can claim they're not responsible?
A self driving car becomes useful when I can have no controls, and be asleep in the back. I don't pay liability insurance on a bus, train or taxi ... why the hell would I pay it when something created by Google is in charge of driving it?
Re: (Score:2)
If there is a risk from self-driving cars that needs to be insured, you are going to pay for it one way or another. If you don't buy your own insurance, Google will simply have to add it to the purchase price.
It's similar to malpractice insurance: you may think you force your doctor to pay for it, but they just add it back to your bill.
In the end, it's you who pays.
Re: (Score:2)
Re: (Score:2)
Could it be because it's still going to have a "fuck it, you drive" mode which passes responsibility to the human so Google can claim they're not responsible?
Well no, if it had a "fuck it, you drive" mode then there would need to be no regulatory changes. Also, it's right in the article that Google wants to avoid the driver panicking and taking control of the car when something goes wrong.
Re: (Score:2)
I'm suggesting if Google is driving, and the passengers are passengers, then why the hell would anybody pay for things like liability insurance for an AI?
You are going to need basic liability insurance no matter what, but it should be a lot cheaper in a car that you're not allowed to drive, because you won't be able to cause an accident.
Could it be because it's still going to have a "fuck it, you drive" mode which passes responsibility to the human so Google can claim they're not responsible?
For the foreseeable future, cars are going to have a human-driven mode, so you're going to need liability insurance for that. If you're willing to let your insurer into your car's data, perhaps they will give you a discount if you don't actually use it.
A self driving car becomes useful when I can have no controls, and be asleep in the back. I don't pay liability insurance on a bus, train or taxi ... why the hell would I pay it when something created by Google is in charge of driving it?
Mechanical failure. Again, your rates should go down if you're not driving.
Re: (Score:2)
I'm suggesting if Google is driving, and the passengers are passengers, then why the hell would anybody pay for things like liability insurance for an AI?
Same reason I can lend my car to someone with a driver's license but no car and thus no insurance of his own. Google's driving, but it's still your property, and that makes you liable. Say you walk into a store and a light fixture falls on your head. Maybe it's a manufacturing error, maybe it's shoddy construction work, maybe it's sabotage (unlikely) or whatever. It doesn't matter to you, because you sue the store; the store manager can't just pass the buck. Even if they find it was a manufacturing error and recover from the manufacturer, the store still answers to you first.
Re: (Score:2)
There's also a reason it's called a self-driving car.
It's driving, or I'm driving. This isn't Schroedinger's driver.
From your link:
No, because I'm no
Re: (Score:2)
I agree.. otherwise your self driving car becomes a game of Russian roulette. Will you have accidents and have to claim responsibility for them? Maybe, maybe not is not a good enough answer.
Re:Good ... (Score:4, Insightful)
That's my major problem with this technology: there's an awful lot vague answers to specific questions.
A "self driving car" means you put little Timmy in it, send him to school, and monitor it on your cell phone to confirm he gets out in the right place and a teacher has collected him ... or it means you come out of a bar, fall into the backseat, and say "home, James" ... or it means grandpa who has lost his vision and his driver's license can get in and say "take me to my doctor's appointment".
No driver's license or legal responsibility for operating the vehicle at all. You are livestock being transported. You're not driving or operating, you simply told it your destination.
This bizarre model in which the car drives, except when it doesn't, with no clear demarcation between the two, is damned near impossible to make sense of.
If the car decides it's got no idea what to do, and it just says "you're in charge", and before you even know what's happening you're in an accident .. and the logs say "human was driving, his fault", you're screwed. Or, worse, someone builds in code which lies and just says "human was driving" 5 minutes before any crash is triggered (so they can avoid liability).
There can't be a gray area between who is in charge and who isn't. And paying for liability insurance when the computer is in charge sounds moronic to me, why would you do that? Are you accepting liability on behalf of the computer or something?
Self-driving-ish cars? Autonom-ish cars? It just seems like everybody is pretending this is a solved issue, and I don't believe it is.
Re: (Score:2)
There is no "this model"; people just don't know yet what form self-driving cars will take. There may well be several different classes, depending on speed, kind of AI, and kind of driver.
Re: (Score:2)
Johnny Cab (Score:2)
Johnny Cab is not responsible for injury or death! We hope you enjoy the ride. We are not an amusement park, so there is no operator on board.
Re: (Score:2)
Self-driving-ish cars? Autonom-ish cars? It just seems like everybody is pretending this is a solved issue, and I don't believe it is.
It isn't, but the cars aren't ready to drive themselves 100% of the time anyway. They need a lot more data. So they're going to put the cars with the technology out there, and start collecting it. Then they'll determine how it's going to work as they go along... and by "they" I mean automakers and regulators alike.
Re: (Score:3)
You're arguing a situation that won't happen.
It's simple, if the car runs into a situation it doesn't know how to handle it will come to a stop. At that time the human operator can take control. The car won't just hand off control without warning, in fact the car won't hand off control at all -- the human would have to take control.
Re: (Score:2)
You assert this as a fact. Citation? Or are you just deciding it's true? (If it's true, I'd love to know.)
If the car won't just hand off control without warning, then I should be able to be asleep in the back. If I can't be asleep in the back, then I don't believe what you say.
If it's full stop, change control, start driving ... then I shouldn't physically be in the driver's seat, to make it 100% explicit.
So far your "simple" scenario has yet to be validated by anybody, and so far all these tests require a driver in the seat ready to take controls.
Re: (Score:2)
So far your "simple" scenario has yet to be validated by anybody, and so far all these tests require a driver in the seat ready to take controls.
From all the documentation about Google cars I've seen, while a professional driver ready to take control is required, the self-driving car will continue to drive until the driver takes positive action to override it.
That's actually how one of the accidents happened: the Google car was braking to a stop, the pro disabled it and hit the gas into the back end of the car ahead of him. The car attempts to keep itself safe. Getting 'stuck' is a bigger problem than getting into an accident.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It was solved in Demolition Man. Human drives from home to the freeway on-ramp. Engages auto-pilot and the wheel stows itself. Coming up to the exit the wheel comes back out and the human disengages auto-pilot.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I am sure that the details will be hashed out over the years. You don't know what you don't know, and you can't hold back the ocean, so basically do your best, write laws that seem to make sense, and then let the courts hash out the details when real-world issues arise.
Re: (Score:2)
This bizarre model in which the car drives, except when it doesn't, and with no clear demarcation between is damned near impossible to make sense of.
If the car decides it's got no idea what to do, and it just says "you're in charge", and before you even know what's happening you're in an accident .. and the logs say "human was driving, his fault", you're screwed. Or, worse, someone builds in code which lies and just says "human was driving" 5 minute before any crash is triggered (so they can avoid liability).
Hell, they're already doing just the opposite. Remember the Hyundai Super Bowl commercial? Within a certain speed range, the car will emergency-brake itself to prevent a collision - and that's with a human driver at the wheel.
Given the VW scandal, I think that car companies are going to be under more intense scrutiny for a while. The only time I've heard about self-driving cars that will toss control to a driver were extreme-alpha builds, manned by professional drivers. Modern self-driving cars have the o
Re: (Score:2)
That article didn't read like English. What exactly was the goal of the owner of the Mercedes? What was s/he exempt from? Why was the driver surrounded by police and the plate confiscated if it was a legally obtained plate? How did that link have ANYTHING to do with driver's vs operator's license? What is it you are even trying to say with that statement?
Re: (Score:2)
You routinely buy insurance that protects you from mechanical defect or wear-and-tear. If the defect is egregious enough, lawsuits will be filed (presumably by the insurance companies or as class actions) and you will get some sort of recompense (if only in lower insurance bills than you would otherwise have gotten). You are still making the decision to use an autonomous car, so you are in some sense responsible for that choice. ("Yeah, I know Android Car is a better driver than Microsoft Car, but I got a g
Re: (Score:2)
Google isn't "making you pay" anything. It's your choice to buy or not buy a self-driving car from them and purchase the necessary insurance for it. If you impose liability on Google, they buy their own insurance and add it to the purchase price of the car. Then it's still your choice whether to buy the car or not.
It's the same when you buy a Ferrari: nobody is "making" you buy the car or insure it, but if you choose
Re: (Score:2)
Of course insurance will be different for a self-driving car than for a human-driven car, just as it differs between me and my co-workers based on our driving histories. However, the question is what the insurance is for. If I am buying liability insurance for an autonomous object, then the rationale, as with car insurance, must be that the vehicles are so likely to be in an accident that I need a method of paying off other individuals before I can recoup my losses from Google. This is akin to needing
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
That risk is very likely astronomically lower than the risk you, or any human, pose as a driver. So far as I've heard, every accident that an SDC has been involved in was clearly the fault of the human driver in the other vehicle.
Re: (Score:2)
Re: (Score:2)
Not if Google adds thousands to the car's price tag, effectively making you pay for the insurance.
Re: (Score:2)
Not if Google adds thousands to the car's price tag, effectively making you pay for the insurance.
SDCs have already driven millions of miles on public roads, and have a far better safety record than human drivers. As software improves, and hardware gets faster, their safety record will get even better. So the cost to insure them will be much lower, regardless of whether the cost of the insurance is incorporated into the car price, or purchased separately.
Consumers will indeed save billions. I would not recommend investing in auto insurance companies. Their business model is due for disruption.
Re: (Score:3)
In the UK, most tiny karate clubs carry GBP 1m of public liability insurance, and it costs a pittance each year.
The size of the number makes little difference; it's what's covered. I imagine these cars would have to cover a lot more, but even the WORST of them may be better than human drivers on average, so premiums will quickly re-balance once the risk statistics are apparent, even if companies at first only insure their own test cars.
Honestly, $100k+ liability insurance is pretty low. Even a school will have GBP 5-10 million in cover.
Re: (Score:2)
Huh? If you lease any car you have to carry a $100,000 PIP. It's certainly more expensive, but it's hardly out of reach for a normal driver. I was carrying that level of policy when I was 22 years old.
Re: (Score:2)
That's too bad; I'm sure they would rather have been killed than have a broken arm.
Also, if you're worrying about the cost of care, you have much bigger problems.
Maybe we should teach kids to bike carefully, because people in cars aren't looking for them. Helmet or no helmet, that part doesn't change.
Just Wait... (Score:3)
...until the first time AI kills someone.
1. AI Manufacturer pays millions and millions in damages?
2. AI Manufacturer finds a way to pawn off responsibility on to the owner.
3. AI Manufacturer gets a law passed capping damages, and maybe even some kind of limited indemnity for AI Manufacturers.
Re: (Score:3)
They'll still find a way to make a human responsible for this. The car makers, or Google, or whoever, will have more responsibility, but unless it was a maliciously covered-up bug, I imagine an AI failure would be treated like a brake failure: the manufacturer is responsible, but *only* if the AI was being maintained and patched according to the stated guidelines.
Re: (Score:2)
...until the first time AI kills someone.
There have already been many deaths caused by software bugs. There is plenty of legal precedent. This is nothing new.
In most past instances, the manufacturer has been held responsible. The owner of the device may be held partially or fully responsible if they were using their device irresponsibly, or had modified it in a way that caused or contributed to the failure.
Re: (Score:2)
Let's wait until we see if the cars develop sentience and start gunning for pedestrians before we think about giving AI's the ability to order nuclear weapons launches.
Re: (Score:2)
Re: (Score:2)
As General George S. Patton said to the krauts!
That was General Anthony McAuliffe, not Patton.
Re: (Score:2)
I can see this being an issue a bit at first. But as SDC's hit a critical mass it'll stop being an issue because SDC's and their correct driving behavior will be the standard. Hell, I can easily foresee a day when human drivers will only be allowed on closed tracks. Auto accidents like the one you mention will likely become a thing of the past and be a real rarity. Rush hours will likely be much less of a hassle because the SDC's will be able to handle the congestion much more efficiently and intelligently.
Re: (Score:2)
Rush hours will likely be much less of a hassle because the SDC's will be able to handle the congestion much more efficiently and intelligently.
Only up to a certain point. And since politicians would rather spend money on themselves, this will just mean less road construction until things are at least as bad as they were under human control (probably worse: the time cost is less severe when you can do other things during the trip, so there will be less pressure to fix things).