
Tesla Owner in Autopilot Crash Won't Sue, But Car Insurer May (bloomberg.com)

Dana Hull, reporting for Bloomberg: A Texas man said the Autopilot mode on his Tesla Model S sent him off the road and into a guardrail, bloodying his nose and shaking his confidence in the technology. He doesn't plan to sue the electric-car maker, but his insurance company might. Mark Molthan, the driver, readily admits that he was not paying full attention. Trusting that Autopilot could handle the route as it had done before, he reached into the glove box to get a cloth and was cleaning the dashboard seconds before the collision, he said. The car failed to navigate a bend on Highway 175 in rural Kaufman, Texas, and struck a cable guardrail multiple times, according to the police report of the Aug. 7 crash. "I used Autopilot all the time on that stretch of the highway," Molthan, 44, said in a phone interview. "But now I feel like this is extremely dangerous. It gives you a false sense of security. I'm not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn't stop -- it actually continued to accelerate after the first impact into the guardrail." Cozen O'Connor, the law firm that represents Molthan's auto-insurance carrier, a unit of Chubb Ltd., said it sent Tesla Motors Inc. a notice letter requesting joint inspection of the vehicle, which has been deemed a total loss.
  • "he reached into the glove box to get a cloth and was cleaning the dashboard seconds before the collision"

    The Tesla has a clear warning that "autopilot" is not "self-driving", so the driver should have been paying attention to the road, not digging through the glovebox and cleaning the dashboard.

    • So, how does the Tesla Autopilot differ from a regular car that does not have such a feature? In both, you presumably need to pay attention to the road just the same. So why use Autopilot at all?

      • by hawguy ( 1600213 ) on Saturday August 20, 2016 @09:34AM (#52737639)

        So, how does the Tesla Autopilot differ from a regular car that does not have such a feature? In both, you presumably need to pay attention to the road just the same. So why use Autopilot at all?

        I asked a guy I know who drives a P90 the same question; he said he uses it for several reasons. First, he feels it gives him a level of safety above his own capabilities: if he sneezes or is otherwise distracted, he likes knowing that the car *may* be able to take over (which is quite a bit better than a standard car, which cannot take over at all), and as he ages he feels his reflexes are getting slower, so he likes that the car is watching over his driving. Second, his uncle had a stroke while driving with his wife; the stroke itself didn't kill him, but running off the road killed both him and his wife. In a self-driving car, it's likely that the car would have just continued driving until it sensed he was no longer in control and pulled off the road. And last, he likes that his use of Autopilot gives Tesla real-world feedback on the system so they can improve it, so that by the time he's ready to give up driving, his car will be fully self-driving. Because of this last point, he enables Autopilot as much as possible.

        And he added that anyone who thinks it can drive the car unattended is an idiot.

        • by eionmac ( 949755 )

          Good sense in your friend!

  • Lol (Score:5, Insightful)

    by waspleg ( 316038 ) on Friday August 19, 2016 @08:22PM (#52735997) Journal

    "But now I feel like this is extremely dangerous.

    No fucking shit, it always was.

    It gives you a false sense of security.

    Sounds like wealth redistribution - Darwin style.

    I'm not ready to be a test pilot.

    Well, obviously you should have been, since wanting to be an early adopter of a nascent technology that hasn't been thoroughly vetted at all to DRIVE YOUR FUCKING CAR sure sounds like being a test pilot to me.

    Probably get modded down. Don't give a fuck. I think this shit will be/has been pushed out the door too early because money. Wait 'til it kills someone else.

    There are lawyers with erections they're not even sure how they got right now.

    • "But now I feel like this is extremely dangerous.

      No fucking shit, it always was.

      But Elon promised us it was safer than human driving!

      • But Elon promised us it was safer than human driving!

        It can be safer than humans while still not being perfect. I only expect that, over time, self-driving cars will reduce accidents, not eliminate them. Any time you're moving at high speed there's an element of risk.

        • I agree entirely. However, if this accident happened as described (dubious at this point), then what should have been an easy situation for the system failed and nearly turned dangerous.

          I've seen videos of the car driving. This sounds like a trivial case, and the car had successfully driven the route before.

          Something is odd.

        • But Elon promised us it was safer than human driving!

          It can be safer than humans while still not being perfect. I only expect that, over time, self-driving cars will reduce accidents, not eliminate them. Any time you're moving at high speed there's an element of risk.

          Actually, it can be buggy as hell and still be safer than these humans, because outside of its mistakes the feedback loops are fast and accurate. Only a small percentage of human drivers have fast, accurate reflexes, and even fewer have them engaged that high a percentage of the time.

        • Autopilots on planes also save lives and reduce accidents, but they are not allowed to be used until certified. The FAA does not allow beta software on commercial jets.
      • But Elon promised us it was safer than human driving!

        And why isn't it? I suppose you have studied the accident figures and rates intensively and can make a well-informed comparison?

    • Probably get modded down. Don't give a fuck. I think this shit will be/has been pushed out the door too early because money. Wait 'til it kills someone else.

      Don't worry, the Tesla and Musk apologists will find a way to explain it away. Flat earthers and bible thumpers have nothing on them when it comes to selective reality.

      • If they had just named it better it would be a lot easier to defend them. Something with the word "cruise" in it maybe, instead of "pilot."

    • Well, obviously you should have been

      Well, obviously you should have been, because when you first turned it on the car frigging warned you that this is an early technology you'd be testing, and not to take your hands off the wheel or let your attention lapse.

      Operator doesn't read instructions and hurts himself. News at 11.

      • by Cederic ( 9623 )

        Most of us don't read instructions. Instructions are for the weak, the incapable and the badly designed UI.

  • This is why I was asking a question just the other day about the tech used for the Autopilot.

    I've read explainers and watched videos talking about the tech, and it doesn't add up.
    Not for this case, and not for the one that ended in a fatality.

    The Tesla Model S is supposed to have a camera, a bunch of ultrasonic sensors, and a radar at the front of the car.
    Either those sensors react too slowly, which would make them useless, or the software is bad, or quality control isn't working well enough.

    Because if you think about...

    • by mysidia ( 191772 )

      Are those working at all? What sort of condition is required to make three different collision detection systems fail all at once?

      How about: because of the way those systems work, and the way the self-driving system works, each of those systems is a single point of failure in some driving situations?

      Meaning that if just one of those systems goes offline or fails to do its job properly, there are some crash risks/emergencies that will not be detected, or that the self-driving system will not notice.
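
      To make the single-point-of-failure point concrete, here is a minimal Python sketch. The sensor names and the situations each one covers are invented for illustration only; this is not Tesla's actual sensor stack or detection logic.

          # Hypothetical coverage map: each sensor is the only one that
          # covers certain situations, so each is a single point of failure
          # for those situations. Names and sets are made up.
          SITUATION_COVERAGE = {
              "camera": {"lane_markings", "curve_ahead", "stopped_vehicle"},
              "radar": {"moving_vehicle", "stopped_vehicle"},
              "ultrasonic": {"near_obstacle"},
          }

          def detectable(situation, healthy_sensors):
              """A hazard is detectable only if some healthy sensor covers it."""
              return any(situation in SITUATION_COVERAGE[s] for s in healthy_sensors)

          # All three sensors healthy: the curve ahead is seen (by the camera).
          print(detectable("curve_ahead", {"camera", "radar", "ultrasonic"}))  # True

          # Camera alone degraded: nothing else covers that situation, so
          # "three collision detection systems" fail at once for this hazard.
          print(detectable("curve_ahead", {"radar", "ultrasonic"}))  # False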

  • +Class action prohibited And Binding arbitration with Tesla for any accident that occurs as a result of you operating the vehicle, And a restriction that You may not convey any of your dispute rights or capability to sue us to any insurance company or other 3rd party; any claim must be pursued solely by you, with sworn statement that no insurer or 3rd party will have interest in any settlement paid to you for dispute resolution.

    • by rch7 ( 4086979 )

      What nonsense. How can you prevent an insurance company from going to court? They paid for the damages, and they can sue on their own behalf. Even if such a clause were legal and enforceable, insurance companies could always refuse to write policies for anything that has the word "Tesla" on it, and good luck selling cars with such smart-ass contracts then.

      • by mysidia ( 191772 )

        good luck selling cars with such smart-ass contracts then.

        It won't be a problem... nobody ever reads them anyway. Also, accepting the EULA terms becomes a requirement not to own the car, but to activate the software license key that enables the self-driving option.

        Don't agree to the EULA? Then no Autopilot for you.

  • by brunes69 ( 86786 ) <[slashdot] [at] [keirstead.org]> on Saturday August 20, 2016 @09:44AM (#52737677)

    I am getting really sick of the media and others bashing self-driving technology when they can't see the forest for the trees - no new technology is ever perfect. When commercial air travel first started in the 1920s, crashes happened all the time - it was extremely dangerous by modern standards, and even more dangerous than current car travel. Air travel is now by orders of magnitude the safest way to travel on earth - how did that come to be? It came to be because regulation ensured that accidents were investigated, root-cause analysis was done, and whatever deficiency was found was addressed.

    This is the exact same thing that will happen with self-driving technology, except that it will happen at an EXPONENTIALLY faster pace.

    Yes, people will get into accidents with self-driving cars. Yes, people will die. Anyone who does not think this is going to happen is living behind a reality distortion field. However, what happens with self-driving technology is that every single accident gives the opportunity to push software updates out to make EVERY CAR instantly safer. This is simply not the case with human drivers - when a human driver causes an accident, there is no feedback loop that makes all other human drivers safer.
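
    To illustrate the asymmetry claimed here, a minimal, purely hypothetical Python sketch; the class names and numbers are invented and say nothing about Tesla's actual update pipeline.

        # One investigated accident can patch the whole fleet at once...
        class Fleet:
            def __init__(self, cars):
                self.cars = cars
                self.known_failure_modes = set()

            def investigate_and_patch(self, failure_mode):
                self.known_failure_modes.add(failure_mode)  # root cause fixed
                return f"{self.cars} cars patched against {failure_mode!r}"

        # ...whereas a human driver's hard-won lesson stays with that driver.
        class HumanDrivers:
            def __init__(self, drivers):
                self.lessons = {d: set() for d in range(drivers)}

            def learn_from_accident(self, driver, failure_mode):
                self.lessons[driver].add(failure_mode)  # nobody else improves
                return f"1 of {len(self.lessons)} drivers learned {failure_mode!r}"

        print(Fleet(cars=100_000).investigate_and_patch("missed curve at speed"))
        print(HumanDrivers(drivers=100_000).learn_from_accident(0, "missed curve at speed"))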

    • There is a big difference between the way Google is doing it and the way Tesla is. I am completely behind Google's methodology. Tesla, not so much.
      • Except that Google's method will not scale. And does it adapt to road changes?
        Sometimes it's like the Roomba vs. Neato arguments :-)
        Time will tell.

    • by rch7 ( 4086979 )

      Nobody bashes self-driving technology. People bash swindlers who sell adaptive cruise control as cutting-edge "self-driving" so that they can sell new stock for billions each year. There is nothing "self-driving" in it, but some brainwashed fanboys still imagine that they are getting close to "self-driving" with this technology and play Russian roulette with the lives of the people around them; that is the whole problem.

  • I simply stepped into the back seat to make myself a sandwich and, being overcome with fatigue, decided to take a nap...
