

Tesla Owner in Autopilot Crash Won't Sue, But Car Insurer May (bloomberg.com)
Dana Hull, reporting for Bloomberg: A Texas man said the Autopilot mode on his Tesla Model S sent him off the road and into a guardrail, bloodying his nose and shaking his confidence in the technology. He doesn't plan to sue the electric-car maker, but his insurance company might. Mark Molthan, the driver, readily admits that he was not paying full attention. Trusting that Autopilot could handle the route as it had done before, he reached into the glove box to get a cloth and was cleaning the dashboard seconds before the collision, he said. The car failed to navigate a bend on Highway 175 in rural Kaufman, Texas, and struck a cable guardrail multiple times, according to the police report of the Aug. 7 crash. "I used Autopilot all the time on that stretch of the highway," Molthan, 44, said in a phone interview. "But now I feel like this is extremely dangerous. It gives you a false sense of security. I'm not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn't stop -- it actually continued to accelerate after the first impact into the guardrail." Cozen O'Connor, the law firm that represents Molthan's auto-insurance carrier, a unit of Chubb Ltd., said it sent Tesla Motors Inc. a notice letter requesting joint inspection of the vehicle, which has been deemed a total loss.
Re: (Score:2)
Did the autopilot function as designed?
Re: (Score:3, Interesting)
Re: (Score:2)
Not certain. The guy claimed he was reaching into the glove box.
He's not going to sue. I find that highly suspicious. In America, everybody sues over things like spilled hot coffee, poodles in microwaves, and confusing the gas and brake pedals. So if he is not going to sue, he has something to hide.
If the driver did sue Tesla, Tesla would send in a battalion of private detectives to investigate everything with a scanning electron microscope. If the accident was his fault, he can't afford that to happen. Ditto for the insurance company suing Tesla; the driver cannot
Re: (Score:3)
He's not going to sue. I find that highly suspicious. In America, everybody sues over things like spilled hot coffee, poodles in microwaves, and confusing the gas and brake pedals. So if he is not going to sue, he has something to hide.
Or maybe, just maybe, he's an honest person who's already admitted that he wasn't paying attention, which Tesla tells you to do while using Autopilot.
If the accident was his fault, he can't afford that to happen.
It was his fault. He's practically admitted as much.
Re: Driver or Autopilot? (Score:2)
Re: (Score:2)
In America,
There's your problem... this occurred in Texas.
Re: (Score:2)
Are you serious? What can he sue for? A $500 deductible? He is not even injured; the insurance company pays for his car.
It is the insurance company that decides whether or not to sue. As the car is a bit expensive, it may decide to sue. It is not millions like in some fatal accident such as the one in FL, but still some money. Maybe not enough to warrant a lawsuit with legal expenses when the outcome is not clear, but still possible. Tesla has marketed Autopilot as mostly "autonomous" and Tesla salesmen demonstrated hands-free driving, so some le
Re: (Score:2)
What can he sue for?
People don't get sued in the US because they did something wrong . . . they get sued because they have money. Sexual harassment at a workplace . . . who gets sued? Not the perpetrator . . . he has no money. The employer company gets sued, because they have enough money to make it worthwhile for the lawyer involved.
If there is no obvious reason to sue, the lawyer will create one. The driver has already stated that Tesla's autopilot gave him a "false sense of security", or some weasel words like that.
Re: Driver or Autopilot? (Score:2)
Re: Driver or Autopilot? (Score:3)
in this case he's not dispu
Re: (Score:2)
Even the drive-by-wire ones turned out to be wrong-pedal accidents in the recent Toyota crashes. The only "fixes" they had were to remove any unsecured floor mat and a software update that shut off the gas when the brake was depressed. NASA wasn't even able to fault Toyota, as the data logs clearly showed the users flooring the gas and not the brake.
Re: (Score:2)
Re: Driver or Autopilot? (Score:2)
Re: (Score:2)
Re: (Score:1)
The insurance company should sue Mark and nobody else.
That doesn't make any sense. Are you implying that Mark has more money than Tesla?
It seems pretty clear who to blame (Score:2, Informative)
"he reached into the glove box to get a cloth and was cleaning the dashboard seconds before the collision"
The Tesla has a clear warning that "autopilot" is not "self-driving", so the driver should have been paying attention to the road, not digging through the glovebox and cleaning the dashboard.
Re: (Score:1)
Remember, insurance companies can jack up the rates on non-self-driving self-driving cars.
But they won't, because those cars are safer than human-driven cars, and insurers don't have an agenda, just statistics.
Then the Owners will demand it be disabled
I demand that you disable this thing because I don't have the willpower not to turn it on.
Re: (Score:2)
Re: (Score:1)
Wrong. Autopilots in all commercial aircraft are capable of flying a route automatically, including changes in altitude and speed along the way. Level 1 autopilots (i.e. wing levelers) are only found in low-end general aviation aircraft. Now please stop spewing misinformation.
Re: (Score:2)
Wrong. Autopilots in all commercial aircraft are capable of flying a route automatically, including changes in altitude and speed along the way. Level 1 autopilots (i.e. wing levelers) are only found in low-end general aviation aircraft. Now please stop spewing misinformation.
You mean the kind of aircraft that the vast majority of people would actually fly if they were to become private pilots?
Re: It seems pretty clear who to blame (Score:2)
Re: It seems pretty clear who to blame (Score:1)
Modded Troll. Why's that?
Re: (Score:2)
Eh, Slashdot? After all the picked nits you've had over the years?
If it isn't free software, don't call it free software.
If it's phr34k1n ur ph()()nez d00d, don't call it a hacker.
Well:
If it isn't autopilot, don't call it autopilot.
What makes you think that "autopilot" will drive your car with no input from you? The autopilot in the vast majority of airplanes will keep your airplane in straight and level flight and might fly the route you give it, but it won't avoid obstacles in the way (including terrain: if you tell it to fly at 5,000 feet, it will blindly fly you into a 6,000 ft mountain), and it won't land your plane; it'll fly until you run out of fuel. Likewise, set the autopilot in your boat and let it run unattended and it will run you r
Re: (Score:3)
What makes you think that "autopilot" will drive your car with no input from you? .
Because the "auto" in "autopilot" stands for "automatic," as in: done automatically without user interaction. Bitch all you want, that's just the fucking facts. It's also not an airplane in midair, it's a car that might hit fucking guardrails and other vehicles. Musk is a fucking idiot to think people wouldn't use it like this. Period.
Re: (Score:2)
Re: Splitting Musk's Pubic Hairs Pretty Fine There (Score:1)
A car is also known as an "automobile". Does that mean it automatically ambulates around? Or does "auto" not mean what you think it means?
Re: (Score:2)
Re: (Score:2)
Are you aware of the origin of the word "pilot"? It refers to steering an oar or a rudder - not ailerons or flaps. It is of nautical origin.
Re: (Score:1)
Last I checked, cars don't use oars or rudders either.
Re: Splitting Musk's Pubic Hairs Pretty Fine There (Score:2)
Musk is a fucking idiot to think people wouldn't use it like this.
Who's saying Musk thought anything of the sort?? Any idiot (other than you, apparently) would've been able to foresee all this; it's a disruptive technology and people are stupid (see above).
Re: Splitting Musk's Pubic Hairs Pretty Fine There (Score:2)
Re: (Score:2)
So, how does the Tesla Autopilot differ from a regular car that does not have such a feature? In both you need to pay attention to the road, presumably the same. So why use Autopilot at all?
Re:It seems pretty clear who to blame (Score:5, Insightful)
So, how does the Tesla Autopilot differ from a regular car that does not have such a feature? In both you need to pay attention to the road, presumably the same. So why use Autopilot at all?
I asked a guy I know who drives a P90 the same question, and he said he uses it for several reasons. First, he feels that it gives him a level of safety above his own capabilities: if he sneezes or is otherwise distracted, he likes knowing that the car *may* be able to take over (which is quite a bit better than a standard car, which cannot take over at all), and as he ages he feels his reflexes are getting slower, so he likes that the car is watching over his driving. Second, his uncle died after a stroke while driving with his wife; the stroke didn't kill him, but running off the road killed both him and his wife. If he'd been in a self-driving car, it's likely that the car would have just continued driving until it sensed that he was no longer in control, then pulled off the road. And last, he likes that his use of Autopilot gives Tesla real-world feedback on the system so they can improve it, so that by the time he's ready to give up driving, his car will be fully auto-drive capable. He said that because of this last point, he enables Autopilot as much as possible.
And he added that anyone that thinks it can drive the car unattended is an idiot.
Re: (Score:2)
His senses are slowing with age to the point he needs assistance, so he buys the racecar version? Nice..
When you have $100K to spend on a car, it's hard not to get one that's a racecar under the hood. The Tesla might have the power of a racecar, but it's easy to not drive it like one.
He's in his 50s now, so it's not like he's elderly, and he used to be an actual race car driver, so even if he says he feels his reflexes slowing, I'd say he's still better than 90% of drivers out there.
Re: It seems pretty clear who to blame (Score:2)
Re: (Score:1)
Good sense in your relative!
Re: (Score:2)
Lol (Score:5, Insightful)
"But now I feel like this is extremely dangerous.
No fucking shit, it always was.
It gives you a false sense of security.
Sounds like wealth redistribution - Darwin style.
I'm not ready to be a test pilot.
Well, obviously you should have been, since being an early adopter of a nascent technology that hasn't been thoroughly vetted at all to DRIVE YOUR FUCKING CAR sure sounds like being a test pilot to me.
Probably get modded down. Don't give a fuck. I think this shit will/has been pushed out the door too early because money. Wait til it kills someone else.
There are lawyers with erections they're not even sure how they got right now.
Re: (Score:2)
"But now I feel like this is extremely dangerous.
No fucking shit, it always was.
But Elon promised us it was safer than human driving!
Re: (Score:2)
But Elon promised us it was safer than human driving!
It can both be safer than humans while still not being perfect. I only expect that over time self driving cars will reduce accidents, not eliminate them. Anytime you're moving at high speed there's an element of risk.
Re: (Score:1)
I agree entirely. However, if this accident happened as described (dubious at this point), then what should have been an easy situation for the system failed and almost turned dangerous.
I've seen videos of the car driving. This sounds like a trivial case and the car had driven the route before successfully.
Something is odd.
Re: (Score:3)
But Elon promised us it was safer than human driving!
It can both be safer than humans while still not being perfect. I only expect that over time self driving cars will reduce accidents, not eliminate them. Anytime you're moving at high speed there's an element of risk.
Actually it can be buggy as hell and still be safer than these humans, because outside of the mistakes the feedback loops will be fast and accurate. Only a small percentage of human drivers have fast, accurate reflexes that are engaged for the same percentage of the time.
Re: (Score:2)
Re: (Score:2)
Even if you get that statistic, all it would mean is that we should reduce emissions, not increase second hand smoke.
Re: (Score:2)
But Elon promised us it was safer than human driving!
And why isn't it? I suppose you have done a very intense study of the accident figures and rates and can make a well-informed comparison?
Re: (Score:2)
Don't worry, the Tesla and Musk apologists will find a way to explain it away. Flat earthers and bible thumpers have nothing on them when it comes to selective reality.
Re: (Score:2)
If they had just named it better it would be a lot easier to defend them. Something with the word "cruise" in it maybe, instead of "pilot."
Re: (Score:2)
The truth doesn't sound edgy enough and doesn't contain any buzzwords.
Re: (Score:2)
Well, obviously you should have been
Well obviously you should have been because the car frigging warned you that this is an early technology you'd be testing and to not take your hands off the wheel or let your attention lapse when you first turned it on.
Operator doesn't read instructions and hurts himself. News at 11.
Re: (Score:2)
Most of us don't read instructions. Instructions are for the weak, the incapable and the badly designed UI.
Questioning... (Score:1)
This is why I was asking a question just the other day about the tech being used for the autopilot.
I've read explainers and watched videos about the tech, and it doesn't add up.
Not for this case, and not for that one which ended in a fatality.
Tesla Model S is supposed to have a camera, a bunch of ultrasonic sensors, and a radar in front of the car.
Either those sensors react too slowly (which would make them useless), the software is bad, or quality control isn't working well enough.
Because if you think about
Re: (Score:2)
Are those working at all? What sort of condition is required to make 3 different collision detection systems fail all at once?
How about: because of the way those systems work, and the way the self-driving system works, each of those systems is a single point of failure in some driving situations?
Meaning if just one of those systems goes offline or fails to do its job properly, then there are some crash risks/emergencies which will not be detected, or which the self-driving system will not notice.
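To make the single-point-of-failure argument concrete, here's a toy Python sketch. The sensor names and coverage sets are invented for illustration (this is not Tesla's actual architecture): the point is just that when each sensor covers different hazard types, "three redundant systems" can still mean one sensor is the only thing standing between you and a particular hazard.

```python
# Hypothetical coverage map: which hazard types each sensor can detect.
# These sets are assumptions for illustration, not real specifications.
COVERAGE = {
    "camera":     {"lane_markings", "curve_ahead", "vehicle"},
    "radar":      {"vehicle", "metal_obstacle"},
    "ultrasonic": {"near_obstacle"},
}

def detectable(hazard, working_sensors):
    """A hazard is detected only if some working sensor covers it."""
    return any(hazard in COVERAGE[s] for s in working_sensors if s in COVERAGE)

# With every sensor healthy, a curve ahead is covered (by the camera alone):
assert detectable("curve_ahead", {"camera", "radar", "ultrasonic"})

# If only the camera misreads the scene, redundancy does not help: radar and
# ultrasonics never covered "curve_ahead", so the system misses the bend.
assert not detectable("curve_ahead", {"radar", "ultrasonic"})
```

So nothing needs to "fail all at once": for hazards only one sensor covers, that one sensor failing is enough.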
Re: (Score:1)
You need a pretty hard impact for the airbags to deploy.
Re: An issue (Score:2)
Sounds like Tesla need a MS-style software EULA.. (Score:2)
+ Class action prohibited, and binding arbitration with Tesla for any accident that occurs as a result of you operating the vehicle, and a restriction that you may not convey any of your dispute rights or ability to sue us to any insurance company or other 3rd party; any claim must be pursued solely by you, with a sworn statement that no insurer or 3rd party will have an interest in any settlement paid to you for dispute resolution.
Re: (Score:2)
What nonsense. How can you prevent an insurance company from going to court? They paid for the damages and they can sue on their own behalf. Even if such a clause were legal and enforceable, insurance companies could always refuse to write a policy for anything with the word "Tesla" on it, and good luck selling cars with such smart ass contracts then.
Re: (Score:2)
good luck selling cars with such smart ass contracts then.
It won't be a problem.... Nobody ever reads them anyway. Also, accepting the EULA terms becomes a requirement not to own the car, but to activate the software license key which enables the Self-Driving Option.
Don't agree to the EULA, then no AutoPilot for you.
Feedback Loop (Score:3)
I am getting really sick of the media and others bashing self-driving technology when they can't see the forest for the trees - no new technology is ever perfect. When commercial air travel first started in the 20s, crashes happened all the time - it was extremely dangerous by modern standards, and even more dangerous than current car travel. Air travel is now by orders of magnitude the safest way to travel on earth - how did that come to be? It came to be because regulation ensured that accidents were investigated, root cause analysis was done, and whatever deficiency was found was addressed.
This is the exact same thing that will happen with self-driving technology, except that it will happen at an EXPONENTIALLY faster pace.
Yes, people will get into accidents with self-driving cars. Yes, people will die. Anyone who does not think this is going to happen is living behind a reality distortion field. However, what happens with self-driving technology is that every single accident gives the opportunity to push software updates out to make EVERY CAR instantly safer. This is simply not the case with human drivers - when a human driver causes an accident, there is no feedback loop that makes all other human drivers safer.
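The feedback-loop claim above can be sketched as a toy simulation. Every number here (fleet size, failure rates, time horizon) is invented purely for illustration: the point is only the structural difference between a fleet where one car's accident fixes every car, and a population with no shared learning.

```python
import random

random.seed(0)

FAILURE_RATES = (0.01, 0.005, 0.002)  # assumed per-car annual crash rates

def fleet_accidents(n_cars=1000, years=5):
    """Fleet with a feedback loop: once any car hits a failure mode,
    a software update removes that mode for every car."""
    rates = list(FAILURE_RATES)
    total = 0
    for _ in range(years):
        for r in rates[:]:
            crashes = sum(1 for _ in range(n_cars) if random.random() < r)
            total += crashes
            if crashes:          # root cause found -> push fleet-wide update
                rates.remove(r)  # that failure mode is gone for all cars
    return total

def human_accidents(n_cars=1000, years=5):
    """No shared feedback loop: the same failure modes recur every year."""
    total = 0
    for _ in range(years):
        for r in FAILURE_RATES:
            total += sum(1 for _ in range(n_cars) if random.random() < r)
    return total

print("fleet:", fleet_accidents(), "humans:", human_accidents())
```

Under these assumptions the fleet pays for each failure mode roughly once, while the unconnected population keeps paying for it year after year.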
Re: (Score:2)
Re: (Score:2)
Only that Google's method will not scale. And does it adapt to road changes? :-)
Sometimes it's like the Roomba vs. Neato arguments.
The future will tell.
Re: (Score:3)
Nobody bashes self-driving technology. People bash swindlers who sell adaptive cruise control as cutting-edge "self-driving" so that they can sell new stock for billions each year. There is nothing "self-driving" in it, but some brainwashed fanboys still imagine that they are getting close to "self-driving" with this technology and play Russian roulette with the lives of people around them; that is the whole problem.
Terrible malfunction (Score:2)