US Regulators Investigating Tesla Over Use of 'Autopilot' Mode Linked To Fatal Crash (cnbc.com) 379

An anonymous reader quotes a report from CNBC: The U.S. National Highway Traffic Safety Administration said on Thursday it is opening a preliminary investigation into 25,000 Tesla Motors Model S cars after a fatal crash involving a vehicle using the "Autopilot" mode. The agency said the crash occurred in a 2015 Model S operating with automated driving systems engaged, and that the incident "calls for an examination of the design and performance of any driving aids in use at the time of the crash." The investigation is the first step before the agency could seek to order a recall if it believed the vehicles were unsafe. Tesla said Thursday the death was "the first known fatality in just over 130 million miles where Autopilot was activated," compared with one fatality for roughly every 60 million miles driven worldwide. The electric automaker said it "informed NHTSA about the incident immediately after it occurred." The May crash occurred when a tractor trailer drove across a divided highway in front of a Tesla traveling with Autopilot engaged. The Model S passed under the trailer, and the bottom of the trailer struck the Tesla's windshield. Separately, Tesla quietly settled a lawsuit with a Model X owner who claimed his car's doors would open and close unpredictably, smashing into his wife and other cars, and that the Model X's Autopilot feature poses a danger in the rain.
  • by friedmud ( 512466 ) on Thursday June 30, 2016 @05:53PM (#52422991)

    It was bound to happen sooner or later.

    Luckily for Tesla this sounds like it couldn't have been avoided in any way.

    There will be more... but, like Tesla says, their Auto-pilot system has thus far proven VERY safe. What remains to be seen is how the world reconciles the fact that there will always be outliers...

    • by Anonymous Coward on Thursday June 30, 2016 @05:59PM (#52423011)

      Luckily for Tesla this sounds like it couldn't have been avoided in any way.

      On the contrary, this seems like exactly the type of collision that auto-pilot systems should offer vastly improved protection from:

      "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

      Seems like a pathetically primitive excuse for an "electronic eye" to me.

      • Re: (Score:3, Interesting)

        by mrspoonsi ( 2955715 )
        This death happened in the part of the car Tesla boasted about when a roof crush test broke the testing machine, the roof apparently withstanding the weight of four cars. With a trailer moving sideways into the car, you would think that if the roof were that strong the car would be pushed sideways instead of going under it.
        • This death happened in the part of the car Tesla boasted about when a roof crush test broke the testing machine, the roof apparently withstanding the weight of four cars. With a trailer moving sideways into the car, you would think that if the roof were that strong the car would be pushed sideways instead of going under it.

          From another story I read, the Tesla was at speed when it hit the trailer that was across the road. Momentum caused the bottom edge of the trailer to shear off the top of the car (and probably the top of the driver) as it passed underneath. It's not like the car was parked and the trailer rolled over it.

          • I first read it as the lorry being in a second lane and moving into the Tesla's lane, trapping the car in the middle as it changed lanes. If the trailer was actually jackknifed across the road, then there is little the car could do; the radar would see right through the gap. It points to trailers requiring side rail protectors, which the radar could then potentially spot.
          • Re: (Score:2, Informative)

            by Anonymous Coward

            Yes that is what appears to have happened.

            Washington Post has the actual Florida Hwy Patrol traffic diagram from the accident:

            https://www.washingtonpost.com/news/the-switch/wp/2016/06/30/tesla-owner-killed-in-fatal-crash-while-car-was-on-autopilot/

        • by Nemyst ( 1383049 ) on Thursday June 30, 2016 @06:28PM (#52423151) Homepage
          You might want to read up on the difference between compressive strength and shear strength.
        • The trailer didn't crush the roof, it cut it right off at mid-windshield, where support is minimal on all sides due to the windows. It probably took the driver's head off as well.

        • by sjames ( 1099 ) on Thursday June 30, 2016 @06:51PM (#52423327) Homepage Journal

          Crush is not the same as shear.

      • by Anonymous Coward

        Sounds to me like whatever beam they use to look for things in the way passed under the trailer; they'd need a wider vertical range to ensure low-hanging objects are detected.
        If that's the problem, I'd call that negligence in design.

        • Yeah, I totally want my car to apply the emergency brakes every time I go under a bridge!

          Maybe the answer is to fix the trucks, not mess with a system that has half the accident rate of human drivers.

      • by Ichijo ( 607641 ) on Thursday June 30, 2016 @10:44PM (#52424313) Journal

        "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

        Why couldn't it see the tires, undercarriage, and side reflectors? And if the image was so washed out that it couldn't make out the outline, then the car couldn't see clearly and shouldn't have been moving so fast. It violated the Basic Speed Law just as surely as if it had been driving at the speed limit in heavy fog, and that's a programming error.

        It would also help to upgrade the camera to one with a wider dynamic range and/or more resolution so the image is less likely to get washed out again.

        So there's a software fix and a hardware fix that will prevent this from happening again in the future. Unavoidable, my foot!
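A rough back-of-envelope sketch of the "Basic Speed Law" argument above: if the perception system can only reliably resolve obstacles out to some distance, speed should be capped so that the total stopping distance fits inside it. The deceleration, reaction time, and detection ranges below are illustrative assumptions, not Tesla figures.

```python
# Sketch: cap speed so that stopping distance fits within reliable detection range.
# All numbers are illustrative assumptions, not Tesla specifications.
import math

def max_safe_speed(detection_range_m, decel_mps2=7.0, reaction_s=0.5):
    """Largest v (m/s) such that v*t_react + v**2/(2*a) <= detection_range."""
    a, t = decel_mps2, reaction_s
    # Positive root of v**2/(2*a) + v*t - d = 0
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * detection_range_m)

for rng in (30, 60, 100, 150):          # metres of reliable detection
    v = max_safe_speed(rng)
    print(f"{rng:>4} m of visibility -> about {v * 2.237:.0f} mph max")
```

With those assumptions it prints roughly 39, 57, 76 and 95 mph for 30, 60, 100 and 150 m of reliable detection, which is the sense in which "the image was too washed out to see" implies "it was going too fast for conditions."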

      • by Kkloe ( 2751395 )
        However the car is supposed to detect something this trivial (people here seem to assume detection is trivial), it is more idiotic that the driver wasn't paying enough attention to see it himself.
        Too bad that he died, but good that this moron didn't take anyone else with him: a moron who put his faith in an opt-in beta feature instead of following the instructions.
      • by AmiMoJo ( 196126 )

        It does seem like a bit of an oversight to have distance sensors that only work at grille height, not right up to the height of the roof. When Nissan recently demoed their auto-pilot system, it had sensors mounted at the top corners of the windscreen.

    • by bartle ( 447377 ) on Thursday June 30, 2016 @06:06PM (#52423045) Homepage
      I don't know if I agree that the accident was unavoidable. The implication of the article is that the driver wasn't paying any attention at all and had surrendered driving completely to the car.

      My opinion is that Tesla's self-driving system is not nearly as safe as they claim. One doesn't have to look very hard to find videos like this [youtube.com] one where the driver has to react to prevent the auto-pilot from causing a crash. I question how long, realistically, a production Tesla can stay on the highway before a human needs to intercede to prevent an accident.

      Given enough time, and enough lawsuits, I think that Tesla will shut off their self-driving feature. It needs to be a lot more robust than it currently is. I can't say with any expertise, but it seems like their competitors are taking their autonomous vehicle research far more seriously, with plans to install more sophisticated sensor packages on their cars.

      • by sjames ( 1099 )

        Depending on how close the truck was when it left its lane, standing on the brake might not have helped.

        • by Jeremi ( 14640 ) on Thursday June 30, 2016 @07:21PM (#52423495) Homepage

          Depending on how close the truck was when it left its lane, standing on the brake might not have helped.

          The truck wasn't in a lane -- it was crossing the highway (and perpendicular to the lane the Tesla was driving in) while making a left turn (presumably from a road that intersects the highway).

          • by sjames ( 1099 )

            Looking again, I see that. Yeah, if the driver had been paying attention at all rather than relying on autopilot, he could have stopped.

      • by fluffernutter ( 1411889 ) on Thursday June 30, 2016 @07:19PM (#52423483)
        What is the point of having a car with a self-driving feature if you have to pay attention? Either the car drives for you or it doesn't. If it doesn't, the only way to ensure you're fully involved in the drive is to drive.
        • by JaredOfEuropa ( 526365 ) on Thursday June 30, 2016 @07:41PM (#52423583) Journal
          Agreed. If the car is mostly driving itself, the driver's attention will soon wander. That seems more dangerous than having the driver do all or most of the work (still having cruise control). We will have self driving cars one day, but at the current state of the art it seems more prudent to let the autopilot keep an eye on the driver rather than the other way around.
        • Tesla doesn't _HAVE_ a self-driving feature.

          They have auto-pilot feature, which just like in real planes requires the pilot / driver to retain situational awareness at all times.

      • Even if the driver was paying attention (in which case, what is the point of the autopilot?), how is he supposed to know that this is the one time the computer is going to fail to stop the car? Similar situations, but with other cars or pedestrians, may have happened before and the computer stopped the car. So one may believe that if the computer can see a pedestrian, it can most definitely see a lorry, which is many times bigger than a pedestrian. By the time you notice that the computer is not going to stop, it may already be too late.

      • by Whibla ( 210729 )

        I too am not sure I agree, but there's one thing I'm particularly curious about, and that's the role of the truck driver in the crash.

        I'm not familiar with the 'rules of the road' in the US, but I did look at the police sketch of the accident scene. The truck turned left, crossing oncoming traffic. The truck driver is reported as saying "The Tesla was moving so fast I didn't even see it" or words to that effect.

        Yet, as far as I can tell, there's no indication that the Tesla was exceeding the speed limit for that stretch of road.

    • Sure it could have been: you can't just look for objects at pavement level, you need to have your radar looking for any object up to the vehicle's full clearance (plus a few inches) at minimum. This exposes a hole in their algorithm: the software isn't looking at objects above a certain height, which is foolish.
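A toy geometry check of the point above about vertical coverage: a forward radar mounted low in the grille with a narrow vertical beam stops covering an object whose bottom edge sits above the beam once the object gets close. All heights and angles here are made-up illustrative numbers, not actual Tesla sensor specs.

```python
# Toy sketch: how close can a high-riding trailer get before it leaves a narrow,
# low-mounted radar beam? All values are illustrative assumptions, not Tesla specs.
import math

sensor_height_m  = 0.6    # assumed grille-level mounting height
trailer_bottom_m = 1.2    # assumed ground clearance of the trailer's side
speed_mps        = 29.0   # roughly 65 mph

for half_beam_deg in (2.5, 5, 10, 20):   # assumed vertical half-angle of the beam
    # Below this distance, the top of the beam passes under the trailer's floor.
    blind_from_m = (trailer_bottom_m - sensor_height_m) / math.tan(math.radians(half_beam_deg))
    print(f"half-angle {half_beam_deg:>4} deg -> trailer underside invisible inside "
          f"{blind_from_m:4.1f} m ({blind_from_m / speed_mps:.2f} s at 65 mph)")
```

Under these made-up numbers the blind zone is only the last dozen metres or so, which suggests the principle (cover everything up to roof height) matters more than beam geometry alone as an explanation for this particular crash.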

    • My first thought, only killed one person? How many is GM or Chrysler up to this year alone?

  • by sims 2 ( 994794 ) on Thursday June 30, 2016 @05:55PM (#52422995)

    That's still pretty impressive if it's twice as safe as letting a human drive.

    Even more so after seeing all the videos on youtube with people in the back of the car letting tesla drive.

    • That's still pretty impressive if it's twice as safe as letting a human drive.

      I would bet that factlet wouldn't stand up to much scrutiny.

    • by AvitarX ( 172628 )

      I'd like to know how many highway miles are typical before saying it's impressive.

      The real issue right now is that self driving cars seem to cause accidents.

      The Google cars are in far more not-at-fault accidents relative to a normal driver, which implies they're actually quite bad drivers, doing weird shit.

      • IIRC the law requires all incidents involving self-driving cars to be reported, so a large majority of the cases are minor incidents that would not normally be reported. If someone gets a little close and gives you a tiny bump in traffic, or cuts you off and you scrape the curb, or something else happens that doesn't cause any property damage or injury, people are not required to report it and usually don't bother.
    • That's still pretty impressive if it's twice as safe as letting a human drive.

      Even more so after seeing all the videos on youtube with people in the back of the car letting tesla drive.

      It's not clear to me that the Tesla system is safer than a human based on the quoted numbers. First, fatal incidents are very rare for either humans or Teslas, so it's not clear that once in 130 million miles versus once in 60 million miles is statistically different. Second, the populations behind the two numbers are definitely different. Tesla owners are clearly not representative of the general population, so the more apples-to-apples comparison would be between Autopilot and manual driving for the same type of drivers.
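The first point (that one event pins down the rate very loosely) can be made concrete with the exact Poisson interval for a single observed fatality. A minimal sketch using only the mileage figures quoted above; scipy is used just for the chi-square quantiles.

```python
# Minimal sketch: uncertainty on a fatality rate estimated from one event.
# Uses only the figures quoted in the thread; not a full safety analysis.
from scipy.stats import chi2

observed = 1          # fatalities observed with Autopilot engaged
miles = 130e6         # miles quoted by Tesla
alpha = 0.05

# Exact (Garwood) 95% confidence interval for a Poisson count of 1.
lo = chi2.ppf(alpha / 2, 2 * observed) / 2
hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2

print(f"95% CI for expected fatalities in {miles/1e6:.0f}M miles: {lo:.3f} to {hi:.2f}")
print(f"-> rate between 1 per {miles/hi/1e6:.0f}M and 1 per {miles/lo/1e6:.0f}M miles")
```

With these inputs the 95% interval runs from roughly one fatality per 23 million miles to one per 5 billion miles, so both the worldwide figure (about 1 per 60 million) and the US figure (about 1 per 94 million) sit comfortably inside it; a single event cannot distinguish Autopilot from ordinary driving either way.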

      • You realize that traffic fatalities are a multiple-times-daily occurrence in the USA alone, right? That's not some fuzzy guesstimate, it's about as statistically sound as you could hope for. 94M miles (the number Tesla gives per fatal accident in the US, which is a better comparison than the one the idiot submitter and CNBC author chose to display) is nothing in a country with over 2.5 times that many vehicles. The worldwide rate is, if anything, possibly less well established, just because it's hard to collect accurate data.

    • The problem is that the computer crashed the car in a very stupid way that most likely would have been prevented if a human was driving the car. I mean the computer failed to notice a huge lorry. I would have noticed it even without my glasses.

      If the accident was something more like the ones sober people cause, it would not be such big news. However, if a human driver caused such an accident, people would think that he was really inattentive (searching for his phone on the floor, texting etc) or drunk.

      So, a

      • The problem is that the computer crashed the car in a very stupid way that most likely would have been prevented if a human was driving the car. I mean the computer failed to notice a huge lorry. I would have noticed it even without my glasses.

        I remember talking with someone who shared an experience she'd had a few weeks earlier. She was a road construction worker holding a stop sign, and there were a few cars stopped. A big tractor trailer came to a stop; a little bit later the back of the trailer jumped up and went back down. She radioed in that she thought something was wrong and went to investigate. A motorcycle had come to a complete stop behind the trailer, but the pickup truck driver behind the motorcycle wasn't paying attention, didn't stop, and shoved the motorcycle under the back of the trailer.

        • Of course, if the driver is busy writing a text message or browsing the net (or drunk, or asleep) then he may not even notice a cargo ship.

          However, when this happens, the driver is (correctly) blamed and ridiculed for it, since normally you may not notice a cyclist or a pedestrian (dressed in black clothes at night with no reflectors), but a tractor trailer is kinda obvious.

          This is why it is bad if the computer makes a mistake most human drivers wouldn't. I am sure it normally has much better visibility and reaction times than a human driver.

          • by Cederic ( 9623 )

            If course, if the driver is busy writing a text message or browsing the net (or drunk, or asleep) then he may not even notice a cargo ship.

            In the UK, being an island nation, we've honed our cargo ship detection capabilities and spot them on the roads even when we're drunk, texting or turned to talk to the kids in the back.

            This explains the absence of large container and freight vessel related injuries on the UK roads.

  • by mrspoonsi ( 2955715 ) on Thursday June 30, 2016 @05:57PM (#52423001)
    "the first known fatality in just over 130 million miles where Autopilot was activated," while a fatality happens once every 60 million miles worldwide. Autopilot is only allowed on highways, whereas I am sure they are comparing 60 million miles against normal driving which is inherently more dangerous than all cars heading in the same direction with barriers between the traffic flow. Apples and Oranges.
    • by godrik ( 1287354 ) on Thursday June 30, 2016 @06:27PM (#52423143)

      Well, comparing to the worldwide accident rate might not be reasonable. Some countries have very high rates of accidents and fatalities. One should compare to the accident rate in the same locations.

      According to Wikipedia [1], the fatality rate for driving accidents in the US is about 15 per billion miles, which is about 1 per 67 million miles.

      [1] https://en.wikipedia.org/wiki/... [wikipedia.org]

      • Well, comparing to the worldwide accident rate might not be reasonable. Some countries have very high rates of accidents and fatalities. One should compare to the accident rate in the same locations.

        According to Wikipedia [1], the fatality rate for driving accidents in the US is about 15 per billion miles, which is about 1 per 67 million miles.

        [1] https://en.wikipedia.org/wiki/... [wikipedia.org]

        The comment you answered was about highways. Highways are typically much safer than any other type of road, so you can't compare a highway-only figure to the general average.

    • OK, some numbers from the UK: http://www.racfoundation.org/m... [racfoundation.org] In 2014, the majority of injured casualties occurred on built-up roads (72 per cent of total casualties), but the majority of fatalities (just over half) occurred on non-built-up roads. Although motorways carry around 21 per cent of traffic, they account for only 5.4 per cent of fatalities and 4.7 per cent of injured casualties. So the other 79 per cent of miles travelled are on non-motorway roads, and those roads account for 94.6 per cent of fatalities.
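Completing the arithmetic that comment starts, using only the RAC Foundation shares it quotes (motorways carry about 21% of traffic but 5.4% of fatalities). A quick sketch; it says nothing about how driver or vehicle mix differs between road types.

```python
# Crude per-mile risk index from the shares quoted above (fatality share / traffic share).
motorway_traffic, motorway_fatalities = 0.21, 0.054
other_traffic,    other_fatalities    = 1 - 0.21, 1 - 0.054

motorway_risk = motorway_fatalities / motorway_traffic   # ~0.26x the all-road average
other_risk    = other_fatalities / other_traffic         # ~1.20x the all-road average

print(f"motorway risk index:   {motorway_risk:.2f}")
print(f"other-road risk index: {other_risk:.2f}")
print(f"non-motorway roads are about {other_risk / motorway_risk:.1f}x more deadly per mile")
```

By these shares, non-motorway roads come out roughly 4.7 times more deadly per mile than motorways, which is exactly the apples-and-oranges problem with comparing Autopilot's motorway-style miles against an all-roads fatality rate.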
    • We don't even know yet if it's apples we're comparing against oranges. Meaning that 1 fatality in 130 million miles doesn't constitute much of a data set, and statistically leaves a huge uncertainty in that presumed fatality rate.
    • "the first known fatality in just over 130 million miles where Autopilot was activated," while a fatality happens once every 60 million miles worldwide..

      It is quite disingenuous, as it compares US high-end-vehicle driver statistics with worldwide statistics, including third-world countries where driving can be borderline suicidal. As a quick comparison via Google: the Insurance Institute for Highway Safety [iihs.org] reports that as of 2014 (the last year for which stats are available, including all vehicle types) there were 32,675 vehicle crash-related fatalities. By state, that ranges between one fatality per 68 million miles driven (South Dakota) and one per 161 million miles driven (Vermont).

      • Wow, I messed that up by somehow picking the second worst and second best. The worst case is 1 in 60.6 million miles (South Carolina), the best case is 1 in 175.4 million miles (Massachusetts). These figures include fatalities of motorcyclists/bicyclists/pedestrians as well as fatally injured drivers with blood alcohol content (BAC) >= 0.08.

  • by Razed By TV ( 730353 ) on Thursday June 30, 2016 @05:58PM (#52423003)
    If my car is in autopilot, and I take control of the vehicle just before dying in an accident, is it considered an autopilot fatality?
  • don't worry guys (Score:4, Informative)

    by Anonymous Coward on Thursday June 30, 2016 @05:59PM (#52423013)

    Tesla's autopilot mode is still in beta, so it isn't a big deal. Because it's apparently OK to sell cars that have only been beta tested at most.

    Tesla noted that customers need to acknowledge that autopilot "is new technology and still in a public beta phase" before they can turn it on. Drivers also acknowledge that "you need to maintain control and responsibility for your vehicle."

  • >bottom of the trailer hit the Tesla vehicle's windshield

    aka the driver got decapitated, Smokey and the Bandit style :o

  • by kheldan ( 1460303 ) on Thursday June 30, 2016 @06:04PM (#52423027) Journal
    And some of you want us all to get into a so-called 'self driving car' with no manual controls that would at least give you a chance to save yourself. I'll just keep driving myself the old-fashioned way, thanks anyway, because I want to live. Ask me again in 50 years. I'll still say 'Hell, NO!', but at least then you'll have decades of data instead of a measly few years' worth.
  • by Ecuador ( 740021 ) on Thursday June 30, 2016 @06:04PM (#52423029) Homepage

    According to the article this was not something the driver could see and avoid, and while the autopilot could not see it either, based on the data from this crash it *could* see it the next time. Drivers learn from their own experience and fatal crashes terminate their learning experience, while autopilots learn from ALL autopilots on the road, and there are no "fatalities".
    Of course it is always the fault of humans in the end, in this case the tractor trailer was not supposed to be there, so we'll only have perfect records when we get rid of all the drivers and have all the cars on autopilot.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

      The driver likely didn't notice because he wasn't paying attention. This is something drivers can see when they're actually driving. You can see the front of the truck cross in front of you, you understand the sky doesn't suddenly shift colors, you understand previously seen road doesn't suddenly turn into sky, and if all that escapes you, you can still see the truck's wheels on the road. The car probably saw them too and figured it would fit between them. Why didn't the radar see the truck? Too angled towards the road surface?

      • by Jeremi ( 14640 ) on Thursday June 30, 2016 @07:16PM (#52423451) Homepage

        Why didn't the radar see the truck?

        The radar saw the truck's trailer, but misidentified it as an overhead sign, because it was so high off the ground.

        Did the driver die?

        Yes.

        If so, how do we know he didn't notice the truck?

        If he had noticed the truck, he presumably would have applied the brake. (we'll have to assume the driver wasn't feeling suicidal)
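If the explanation above is right (the radar return was classed as an overhead structure), the failure mode is easy to caricature: a filter intended to stop the car braking for bridges and gantries also throws away a high-riding trailer whose underside clears the filter's cutoff but not the car's roof. A toy sketch with made-up heights and thresholds, not Tesla's actual classification logic.

```python
# Toy caricature of an "ignore overhead objects" filter discarding a high-riding trailer.
# Heights and thresholds are illustrative assumptions, not Tesla's actual logic.
VEHICLE_ROOF_M = 1.45        # assumed Model S roof height
OVERHEAD_CUTOFF_M = 1.0      # assumed naive "ignore anything whose bottom edge is above this" rule

radar_returns = {
    "passenger car ahead":  0.25,   # estimated height of the return's bottom edge, metres
    "overhead sign gantry": 5.20,
    "semi-trailer side":    1.20,   # typical trailer floor height
}

for name, bottom_m in radar_returns.items():
    naive   = "brake" if bottom_m <= OVERHEAD_CUTOFF_M else "ignore"
    correct = "brake" if bottom_m <= VEHICLE_ROOF_M else "ignore"   # anything below roof height can strike the car
    flag = "OK" if naive == correct else "MISSED"
    print(f"{name:22s} naive={naive:7s} correct={correct:7s} [{flag}]")
```

The obvious fix in this toy version is to compare against the vehicle's own roof height rather than a fixed "overhead" cutoff, at the cost of more false brake events when road crests or sensor noise make genuinely overhead structures look lower than they are.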

        • by Xenna ( 37238 )

          If I understand correctly he was an enthusiastic early adopter, putting YouTube videos up that showed how well the computer in his Tesla did. He was probably reading a book or something, just to show off his smart car.

      • What's the point of having Autopilot if you have to stay fully involved? Might as well drive then and avoid the autopilot disengaging at some unfortunate moment.
        • by lgw ( 121541 )

          Tesla is clear in their manual and click-through agreement that autopilot is not a self-driving car, but a sophisticated driver assist. Most people won't read anything, though, so I suspect the majority of Tesla owners don't understand this.

          What's the point? Most things people like about cars are subjective. I like the (fairly limited) driver assists in my car a lot. The intelligent cruise control is great for limiting your top speed when a cop is around, and the "beep if you're in a dangerous situation" feature is useful too.

          • It's because the autopilot as implemented in a Tesla car just doesn't make sense. It's bizarre to have a system that purports to let you not pay attention to driving, yet still requires you to pay attention to driving. Things that don't make sense tend not to have a place in the way people understand and use things. It isn't ready, it doesn't work with human psychology, and it should never have been put on the market in its current state.
    • No, this was something the driver did not avoid. We don't know if/when the driver saw it, or if the driver could see it.
      It is likely the driver was staring at their phone and not looking at the road because they assumed the car's autopilot mode worked.

      • by tlhIngan ( 30335 )

        No, this was something the driver did not avoid. We don't know if/when the driver saw it, or if the driver could see it.
        It is likely the driver was staring at their phone and not looking at the road because they assumed the car's autopilot mode worked.

        Autopilot mode isn't "hands off" or true autonomous driving. In fact, Tesla's implementation doesn't even use the GPS. It's really a more sophisticated lane keeping and cruise control system. It can change lanes, but you have to command it to do so.

        In fact, if

    • It was a frigging tractor-trailer crossing the street. How can anybody in their right mind claim that this was impossible to see??? If you, as the driver, cannot see a tractor-trailer crossing the street (no matter whether it's white or any other color), then you shouldn't be driving a car. If the car's AI cannot distinguish it either, it's not fit to control your vehicle. Very simple. You can be pretty sure that any human who pays attention while driving would notice a truck crossing the street and hit the brakes.
      • If this had been a Google car, there would have been no accident. Google has much better radar which maintains a model of all vehicles in the vicinity before they turn into your direction, and it's high enough up that it would not miss a trailer.

    • by Xenna ( 37238 )

      "so we'll only have perfect records when we get rid of all the drivers and have all the cars on autopilot."

      And all the cyclists and all the children and all the senior citizens and all the animals....

      Basically the only way to get perfect records is to create roads that are dedicated to driverless cars. Then a very complicated problem suddenly becomes fairly easy to solve.

  • by Camel Pilot ( 78781 ) on Thursday June 30, 2016 @06:09PM (#52423055) Homepage Journal

    This would not have occurred I suspect.

  • by Moof123 ( 1292134 ) on Thursday June 30, 2016 @06:31PM (#52423171)

    Until "autonomous" means exactly that, we will have people lulled into not paying attention, and a driving system that cannot handle everything that is thrown at it. The result will be crashes.

    No amount of EULAs or cries of "beta" will get around that predictable result.

    Expecting human nature to change to match your product's limitation is a fool's journey.

    • Expecting human nature to change to match your product's limitation is a fool's journey.

      Tell that to the collectivists who are in favor of various styles of socialist or communist forms of government.

      Strat

  • It would be a high priority to look for vehicles crossing unexpectedly. That is a major cause of accidents. And not that hard to do with just stereo vision.

    However, the article contains no useful information. Did the AutoPilot actually see it? Did it interpret it correctly? Did it try to slow down at all? At what point after the truck started moving (but before it entered the other road) did the AutoPilot react?

    Those are the critical questions.

    I think it would be safe to assume that the auto pilot did slow down once the truck was in front of the car, but that, of course, would be too late.

    • Never hit the brakes. Saw the trailer, thought it was a sign. Forward radar is aimed at road level, didn't detect trailer and apparently thought the tractor was clear.

    • I think it would be safe to assume that the auto pilot did slow down once the truck was in front of the car, but that, of course, would be too late.

      From TFA:

      "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

  • by CanadianMacFan ( 1900244 ) on Thursday June 30, 2016 @09:05PM (#52423947)

    It sounds like the truck crossed the lane without leaving enough time for the oncoming cars to stop, but all we hear is how the autopilot is at fault. I can understand how the sensors missed the trailer, and that is going to be something all developers will have to add to their tests (when seeing a rig with a gap behind it, check for tires).

    We are going to see cases like this come up now and again with self-driving cars, but there won't be a need for a recall. What should happen is that an alert goes out to the owners of the cars while the manufacturers check their systems. If their cars pass the tests, they can send out messages to their customers. If not, they create an update, test it, verify it, and send it out. Until owners of the cars hear that the system has been verified, they need to be extra vigilant when such an event happens.

  • Trailer design (Score:5, Informative)

    by Martin S. ( 98249 ) on Friday July 01, 2016 @05:47AM (#52425275) Journal

    There seems to be a big flaw in the design of the trailer that allowed this to happen.

    In the UK HGV trailers are required to have side and rear run-under prevention to stop this very thing from happening.

    http://www.transportsfriend.or... [transportsfriend.org]
