AI Government Transportation

Report That Tesla Autopilot Cuts Crashes By 40% Called 'Bogus' (arstechnica.com) 77

Remember when America's National Highway Traffic Safety Administration reported Tesla's Autopilot reduced crashes by 40%? Two years later the small research and consulting firm Quality Control Systems (QCS) finally obtained the underlying data -- and found flaws in the methodology "serious enough to completely discredit the 40 percent figure," reports Ars Technica, "which Tesla has cited multiple times over the last two years."
The majority of the vehicles in the Tesla data set suffered from missing data or other problems that made it impossible to say whether the activation of Autosteer increased or decreased the crash rate. But when QCS focused on 5,714 vehicles whose data didn't suffer from these problems, it found that the activation of Autosteer actually increased crash rates by 59 percent...

NHTSA undertook its study of Autopilot safety in the wake of the fatal crash of Tesla owner Josh Brown in 2016. Autopilot -- more specifically Tesla's lane-keeping function called Autosteer -- was active at the time of the crash, and Brown ignored multiple warnings to put his hands back on the wheel. Critics questioned whether Autopilot actually made Tesla owners less safe by encouraging them to pay less attention to the road. NHTSA's 2017 finding that Autosteer reduced crash rates by 40 percent seemed to put that concern to rest. When another Tesla customer, Walter Huang, died in an Autosteer-related crash last March, Tesla cited NHTSA's 40 percent figure in a blog post defending the technology. A few weeks later, Tesla CEO Elon Musk berated reporters for focusing on stories about crashes instead of touting the safety benefits of Autopilot....

[T]hese new findings are relevant to a larger debate about how the federal government oversees driver-assistance systems like Autopilot. By publishing that 40 percent figure, NHTSA conferred unwarranted legitimacy on Tesla's Autopilot technology. NHTSA then fought to prevent the public release of data that could help the public independently evaluate these findings, allowing Tesla to continue citing the figure for another year.... NHTSA fought QCS' FOIA request after Tesla indicated that the data was confidential and would cause Tesla competitive harm if it was released.

Last May NHTSA finally clarified that its study "did not assess the effectiveness of this technology." Ars Technica also points out that the data focused on version 1 of Autopilot, "which Tesla hasn't sold since 2016."
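For a sense of the arithmetic behind a before/after claim like "reduced crashes by 40 percent": the comparison boils down to airbag-deployment crashes per million miles before and after Autosteer installation. Below is a minimal Python sketch of that calculation using invented numbers, not the (largely non-public) NHTSA data.

    # Toy before/after crash-rate comparison. All numbers are invented for
    # illustration; they are NOT the NHTSA/Tesla figures.
    def crash_rate(crashes, miles):
        """Crashes per million miles driven."""
        return crashes / miles * 1_000_000

    pre_crashes, pre_miles = 100, 60_000_000    # hypothetical pre-Autosteer exposure
    post_crashes, post_miles = 50, 50_000_000   # hypothetical post-Autosteer exposure

    pre_rate = crash_rate(pre_crashes, pre_miles)      # ~1.67 per million miles
    post_rate = crash_rate(post_crashes, post_miles)   # 1.00 per million miles
    change = (post_rate - pre_rate) / pre_rate
    print(f"pre {pre_rate:.2f}, post {post_rate:.2f}, change {change:+.0%}")  # -40%

A figure like -40% is only as trustworthy as the mileage denominators; the criticisms discussed in the comments below are almost entirely about how those denominators were filled in.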
  • I read Slashdot at the time, and that was pretty much the consensus back then.
    • Re: (Score:2, Funny)

      by Anonymous Coward

      I read Slashdot at the time, and that was pretty much the consensus back then.

      Aside from people named Rei or those who had mod points.

      Clearly Elon stole the Reality Distortion Field Steve Jobs pioneered.

      Now we have

      1. The NSA really is spying on everyone.
      2. Bitcoin is a nonviable Ponzi scheme.
      3. Tesla autopilot is neither autopilot nor safe.

      Anyone care to take bets on what is next? Mine is that solar and wind are cheaper than fossil fuels. We have heard it many times, but the numbers don't quite match yet.

      • 3 is wrong, though. Autopilot is autopilot; that is an obvious tautology. And regarding wind and solar... why don't the numbers match? Do Chileans lie? Or Lazard?
  • Who are QCS? (Score:3, Interesting)

    by 15Bit ( 940730 ) on Saturday February 16, 2019 @02:48PM (#58131878)
    Not saying this is wrong, but judging from their website, QCS appears to be a husband-and-wife consulting outfit. Which raises the question: who checked over their research and OK'd this analysis of the statistics?
    • They are a sham outfit for sure, but the raw NHTSA data isn't a great metric for the impact of Autopilot on safety, since it includes both miles where Autopilot can be used and miles where it can't. The miles where it can't be used generally have higher accident rates, so of course the rate is going to be lower when comparing against miles where Autopilot is engaged.

      • Also I don't believe they compared to other non-Autopilot cars in the same price range. They just compared to all other cars.
        • by AmiMoJo ( 196126 )

          All other cars on all roads. AP is only for use on highways, which are safer anyway.

          Also when they introduced AP they introduced automatic emergency braking. So it's impossible to say which is responsible for any effects.
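          The road-mix confound described in this sub-thread can be made concrete with a small hypothetical: if Autopilot miles are mostly highway miles, and highways have a lower baseline crash rate per mile, a naive comparison against all driving flatters Autopilot even if it changes nothing. A sketch with invented rates:

              # Hypothetical illustration of the road-mix confound (all rates invented).
              highway_rate = 1.0   # crashes per million miles on highways
              city_rate = 3.0      # crashes per million miles on city streets

              # Fleet-wide mix without Autopilot: half highway, half city.
              fleet_rate = 0.5 * highway_rate + 0.5 * city_rate   # 2.0

              # Suppose Autopilot is highway-only and changes nothing about highway risk.
              ap_rate = highway_rate                               # 1.0

              print(f"{(ap_rate - fleet_rate) / fleet_rate:+.0%}")
              # Prints -50% even though, by construction, Autopilot had zero effect.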

    • Re:Who are QCS? (Score:5, Interesting)

      by Entrope ( 68843 ) on Saturday February 16, 2019 @03:29PM (#58131996) Homepage

      You don't have to take QCS's, or anyone else's, word about their work. Their report and the Ars article describe the statistically inappropriate decisions in the NHTSA study.

      For example, the study did not have enough data to tell when Autosteer was enabled for about two thirds of the vehicles considered, so they approximated. That's not inherently bad, but the approximation was that those roughly 29,000 vehicles had zero pre-Autosteer miles, which is very questionable. That mistake was seriously aggravated by including 18 accidents from these cars in the pre-Autosteer group, which obviously moves miles and/or accidents in a direction that made Autosteer look better.
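      To see why that bookkeeping matters, here is a toy example (invented numbers, not QCS's or NHTSA's) of how assuming zero pre-Autosteer miles, while leaving pre-Autosteer crashes in the "before" bucket, inflates the before rate and deflates the after rate:

          # Toy numbers showing how the mileage assumption skews both rates.
          rate = lambda crashes, miles: crashes / miles * 1_000_000

          # Suppose these vehicles really drove 10M miles before Autosteer and 10M
          # after, with 20 crashes in each period -- i.e. no real change.
          print(rate(20, 10_000_000), rate(20, 10_000_000))   # 2.0 vs 2.0

          # Now apply the questionable approximation: most pre-Autosteer miles are
          # unknown and treated as zero, so they get counted as post-Autosteer
          # miles -- while the pre-Autosteer crashes stay in the "before" bucket.
          assumed_pre_miles, assumed_post_miles = 2_000_000, 18_000_000
          print(rate(20, assumed_pre_miles), rate(20, assumed_post_miles))  # 10.0 vs ~1.1
          # An apparent ~89% improvement created purely by where the miles landed.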

      • Their report and the Ars article describe the statistically inappropriate decisions in the NHTSA study.

        While introducing a lot of inappropriate analysis themselves. In order to decide that Autopilot was bad, he (not they, QCS is only one guy) discarded 90% of the available data, which resulted in something that completely failed statistical significance tests. So forgive me for not following the outrage here.

        Now what really should happen is that Tesla should open up all their *current* data. They have over a billion miles of data now to work with and that may actually be usable to draw a valid conclusion.

    • by 110010001000 ( 697113 ) on Saturday February 16, 2019 @03:49PM (#58132046) Homepage Journal
      Who checked the NHTSA analysis (or Tesla's)? No one.
    • Our 'specialist on the ground' Msmash, no doubt
    • by Rei ( 128717 )

      Not saying this is wrong, but judging from their website, QCS appears to be a husband-and-wife consulting outfit. Which raises the question: who checked over their research and OK'd this analysis of the statistics?

      Absolutely nobody.
      Also, their funding? Who the heck knows
      Peer review? Hahahaha, why bother with that?
      Tesla's response? Why bother to post that?
      The fact that if you read over this "analysis" most of the data appears to validate the NHTSA but the author dismisses it (all but 5714 out of 43781 vehicl

    • The guy's name is Randy Whitfield. Based on his resume he's at least got a statistical background and a paper or two to his name, and he seems to have been completely silent since the late 80s.

      The NHTSA study had its flaws, but this guy needed to cut 90% of the data to get to his conclusion, and the first 6 pages of his report seem to be a legal hit piece and a whine about the NHTSA not happily just handing over everything they've done and refusing to talk to him after he sued them (surprise).

      But ad hominem

  • funding secured, brah

    (takes long drag off of a doobie)

  • by Dan East ( 318230 )

    So the new number crunching by QCS ignores 18 crashes that happened before Autosteer was rolled out, because the total mileage of those vehicles prior to the crash was not known. Yet... those crashes most certainly occurred. In fact, it's quite possible that the crash itself resulted in enough destruction to the vehicle that the odometer could not be read and thus the exact mileage is not known and reported in the data.

    With Autosteer enabled, more data is collected about the vehicle, so the exact mileage i

    • by Anonymous Coward

      that a mere 18 cars in the original 'investigation' is enough to swing the results nearly 100 percentage points (+40 to -59).

      Seems like yet another badly executed examination with a woefully undersized data set.

      At least in this case it can be somewhat justified as the number of accidents involving Tesla cars is extremely small since there aren't that many of them out on the roads in their brief existence as a production vehicle.

      After all, this whole thing started over one incident of a stupid driver misusin

      • Personally I would have counted slamming into a concrete barrier as 1000 accidents, but that's just me.
      • by Anonymous Coward

        that a mere 18 cars in the original 'investigation' is enough to swing the results nearly 100 percentage points (+40 to -59).

        Those 18 cars are enough to prove that the data is flawed, because those 18 cars indicate both that they were never driven without autopilot, and that they had accidents that occurred before autopilot was installed. The NHTSA assumed that if there was no "miles driven before autopilot" then the car always had autopilot installed, yet these 18 cars all indicated no miles driven without autopilot AND an autopilot install date after their crash.

        This is why they ended up removing a ton of cars from the dataset.
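        A sketch of the kind of consistency check being described, using pandas. The column names here are hypothetical (the actual fields in the Tesla/NHTSA data set differ); this only shows the logic of flagging a crash that happened before Autosteer was installed on a car that supposedly has zero non-Autosteer miles.

            import pandas as pd

            # Hypothetical records; column names are made up for illustration.
            df = pd.DataFrame({
                "vin": ["A", "B", "C"],
                "miles_before_autosteer": [0, 1200, 0],
                "crash_odometer": [5000, 800, 3000],
                "autosteer_install_odometer": [0, 2000, 4000],
            })

            # Contradictory: zero reported pre-Autosteer miles, yet the crash
            # odometer reading is below the reading at Autosteer installation.
            contradictory = df[
                (df["miles_before_autosteer"] == 0)
                & (df["crash_odometer"] < df["autosteer_install_odometer"])
            ]
            print(contradictory)   # flags vehicle "C"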

    • Re: (Score:3, Insightful)

      by whoever57 ( 658626 )

      This new report narrows down the pool of cars and crashes so much that there isn't enough data (there were not enough crashes) to get meaningful results from the analysis.
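      One way to make the "not enough crashes" point concrete is a rough confidence interval for a crash-rate ratio: with only a handful of crashes on each side, the interval is so wide it is consistent with almost anything. The counts and mileages below are invented for illustration.

          import math

          # Invented small-sample counts, roughly the scale at issue.
          crashes_before, miles_before = 6, 2_000_000
          crashes_after,  miles_after  = 9, 2_000_000

          rate_ratio = (crashes_after / miles_after) / (crashes_before / miles_before)
          se_log = math.sqrt(1 / crashes_after + 1 / crashes_before)   # normal approx.
          lo = rate_ratio * math.exp(-1.96 * se_log)
          hi = rate_ratio * math.exp(+1.96 * se_log)
          print(f"rate ratio {rate_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
          # ~1.50 with a CI of roughly (0.5, 4.2): compatible with anything from
          # "crashes nearly halved" to "crashes quadrupled".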

  • by fluffernutter ( 1411889 ) on Saturday February 16, 2019 @03:16PM (#58131968)
    Was the original study even adjusted for age? There will be fewer young, inexperienced drivers in a Tesla.
    • You would think that older, more-experienced drivers are better drivers, but from what I've seen, most experienced drivers simply drive as badly as they did when they weren't experienced.

      My conclusion: most young idiots develop into older idiots.

      • Yes yes, humans suck at driving, etc etc meat puppet blah blah.
      • You would think that older, more-experienced drivers are better drivers, but from what I've seen, most experienced drivers simply drive as badly as they did when they weren't experienced.

        My conclusion: most young idiots develop into older idiots.

        Your conclusion is based on unsystematic, anecdotal information. In contrast, the actuaries who work for insurance companies have, and use, extensive data on accidents, cost and fault, all correlated with many driver characteristics including age, training, history of accidents and moving violations, and more. They adjust insurance premiums based on what they learn from systematic statistical analysis of this mountain of data. So, even without access to their data, we can make conclusions about their conc

        • Your conclusion is based on unsystematic, anecdotal information.

          More of an observation than a conclusion: I put 40 to 50 thousand [city!] miles on the odometer annually; I therefore get the opportunity to observe more of my fellow drivers than virtually anyone else.

          In a nutshell, I've found that the vast majority of older drivers are still making the same mistakes younger drivers do. Hence said conclusion.

          Perhaps the essence of what I was attempting to convey was simpler than you realized.

          • FYI, as I replied to the AC, I wasn't referring to driving style; older drivers are clearly calmer and more attentive... and thus statistically safer. That does not make them more skilled.
  • Autopilot (Score:4, Insightful)

    by 110010001000 ( 697113 ) on Saturday February 16, 2019 @03:24PM (#58131980) Homepage Journal
    Stop calling it AutoPilot. It is just driver assist, just like what every other car manufacturer puts in their high-end vehicles. Now every jerk thinks they can fall asleep in their Tesla while cruising down the highway.
    • Fall asleep at the wheel in your Tesla? That's stupid! I use the AutoPilot so I can peruse Slashdot and answ()&Y%*&P(Y*PPIHSFP(UBBIB:k3018yjnnu233^%%^*
      • Hey everybody, I think this is a fake post, or a joke. Obviously, when the internet connection is unexpectedly broken, the post ends with "NO CARRIER".

    • I doubt there's much "thinking" going on, but we might have different ideas of the meaning of the word. ;)
    • Sounds to me more like a total lack of understanding of the word autopilot. Autopilot has been in planes for years. If you got on a plane and the pilot and co-pilot were both hanging out in first class drinking, would we not all freak out? Pilots are paid around 130k a year, spend years training, and we've always known and expected them to be alert and focused on the skies even when the plane is on autopilot. Yet somehow when we put it on the ground we assume it means "no driver needed".
    • We'll stop calling this autopilot when you stop calling a plane's autopilot autopilot.

      It is just drivers assist, just like every other car manufacturer puts in their high end vehicles.

      Why not simply write: "I have driven neither a Tesla nor a modern high end vehicle from any other manufacturer." It would hide your ignorant conclusion.

  • It turns out they were using those old Pentium chips with the math co-processor error. It was supposed to be 0.45%!!!!
    https://en.wikipedia.org/wiki/... [wikipedia.org]
  • You can always find 1% which is at odds with the main body of facts.

    Nice sampling ;)
  • Autopilot would be great if it actively enforced driver attention. Then you get the best of both worlds. The car can perform mundane actions and react faster than any human when it needs to emergency brake, etc. But the human is overseeing the car and road conditions and so is able to prevent emergencies from developing in the first place. Sadly Tesla treated driver attention as an afterthought. Drivers are allowed to become inattentive and cannot intervene in time when the car does something dumb. The need to
  • As one of the lucky Tesla owners (and Tesla Hacker too), I'm sure it has made my driving significantly safer and increased my life expectancy.

    As for the numbers... well, this story was already debunked last week on electrek [electrek.co].

    Short version: the one-person company who made the analysis doesn't understand statistics that well and bases his conclusion on 1% of the total Autopilot-driven distance...

    I can only think of one reason for disinformation around this subject: stock price manipulation!
