Report That Tesla Autopilot Cuts Crashes By 40% Called 'Bogus' (arstechnica.com)
Remember when America's National Highway Traffic Safety Administration reported Tesla's Autopilot reduced crashes by 40%? Two years later the small research and consulting firm Quality Control Systems (QCS) finally obtained the underlying data -- and found flaws in the methodology "serious enough to completely discredit the 40 percent figure," reports Ars Technica, "which Tesla has cited multiple times over the last two years."
The majority of the vehicles in the Tesla data set suffered from missing data or other problems that made it impossible to say whether the activation of Autosteer increased or decreased the crash rate. But when QCS focused on 5,714 vehicles whose data didn't suffer from these problems, it found that the activation of Autosteer actually increased crash rates by 59 percent...
NHTSA undertook its study of Autopilot safety in the wake of the fatal crash of Tesla owner Josh Brown in 2016. Autopilot -- more specifically Tesla's lane-keeping function called Autosteer -- was active at the time of the crash, and Brown ignored multiple warnings to put his hands back on the wheel. Critics questioned whether Autopilot actually made Tesla owners less safe by encouraging them to pay less attention to the road. NHTSA's 2017 finding that Autosteer reduced crash rates by 40 percent seemed to put that concern to rest. When another Tesla customer, Walter Huang, died in an Autosteer-related crash last March, Tesla cited NHTSA's 40 percent figure in a blog post defending the technology. A few weeks later, Tesla CEO Elon Musk berated reporters for focusing on stories about crashes instead of touting the safety benefits of Autopilot....
[T]hese new findings are relevant to a larger debate about how the federal government oversees driver-assistance systems like Autopilot. By publishing that 40 percent figure, NHTSA conferred unwarranted legitimacy on Tesla's Autopilot technology. NHTSA then fought to prevent the public release of data that could help the public independently evaluate these findings, allowing Tesla to continue citing the figure for another year.... NHTSA fought QCS' FOIA request after Tesla indicated that the data was confidential and would cause Tesla competitive harm if it was released.
Last May the NHTSA finally clarified that its study "did not assess the effectiveness of this technology." Ars Technica also points out that the data focused on version 1 of Autopilot, "which Tesla hasn't sold since 2016."
Uh.. (Score:1)
Re: (Score:2)
Re: (Score:2, Funny)
I read Slashdot at the time, and that was pretty much the consensus back then.
Aside from people named Rei or had mod points.
Clearly Elon stole the Reality Distortion Field Steve Jobs pioneered.
Now we have
1. The NSA really is spying on everyone.
2. Bitcoin is a nonviable Ponzi scheme.
3. Tesla autopilot is neither autopilot nor safe.
Anyone care to take bets on what is next? Mine is that solar and wind are cheaper than fossil fuel. We have heard it many times, but the numbers don't match quite yet.
Re: Uh.. (Score:2)
Re: Uh.. (Score:2)
Who are QCS? (Score:3, Interesting)
Re: (Score:2)
They are a sham outfit for sure, but the raw NHTSA data isn't a great metric for the impact of Autopilot on safety, since it mixes miles where Autopilot can be used with miles where it can't. The miles where it can't be used generally have higher accident rates, so of course the rate is going to look lower when comparing against miles where Autopilot is engaged.
Re: (Score:3)
Re: (Score:2)
All other cars on all roads. AP is only for use on highways, which are safer anyway.
Also when they introduced AP they introduced automatic emergency braking. So it's impossible to say which is responsible for any effects.
Re:Who are QCS? (Score:5, Interesting)
You don't have to take QCS's, or anyone else's, word about their work. Their report and the Ars article describe the statistically inappropriate decisions in the NHTSA study.
For example, the study did not have enough data to tell when Autosteer was enabled for about two thirds of the vehicles considered, so they approximated. That's not inherently bad, but the approximation was that those roughly 29,000 vehicles had zero pre-Autosteer miles, which is very questionable. That mistake was seriously aggravated by including 18 accidents from these cars in the pre-Autosteer group, which obviously moves miles and/or accidents in a direction that made Autosteer look better.
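The direction of that bias can be shown with a toy calculation. All numbers below are invented for illustration (they are not from the NHTSA dataset); the point is only that counting crashes in the numerator without counting the matching miles in the denominator inflates the pre-Autosteer rate:

```python
# Toy illustration of the bias described above: crash rate measured as
# crashes per million miles, before and after Autosteer activation.
# All figures here are hypothetical, not from the NHTSA data.

def rate_per_million_miles(crashes, miles):
    return crashes / miles * 1_000_000

# Suppose both exposure periods cover 100M miles, with 40 crashes before
# Autosteer and 35 after.
pre_crashes, pre_miles = 40, 100_000_000
post_crashes, post_miles = 35, 100_000_000

honest_pre = rate_per_million_miles(pre_crashes, pre_miles)
honest_post = rate_per_million_miles(post_crashes, post_miles)

# Now mimic the flaw: vehicles with unknown exposure are assigned zero
# pre-Autosteer miles, yet their pre-Autosteer crashes stay in the count.
extra_crashes = 18  # crashes added to the "pre" numerator with no miles
flawed_pre = rate_per_million_miles(pre_crashes + extra_crashes, pre_miles)

print(f"honest pre rate: {honest_pre:.2f} crashes / M miles")
print(f"flawed pre rate: {flawed_pre:.2f} crashes / M miles")
print(f"post rate:       {honest_post:.2f} crashes / M miles")
# The flawed pre-Autosteer rate is higher than the honest one, so the
# apparent improvement from Autosteer is exaggerated.
```

With these made-up numbers the honest comparison shows a modest drop (0.40 to 0.35), while the flawed bookkeeping shows a much larger one (0.58 to 0.35).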
Re: (Score:2)
Their report and the Ars article describe the statistically inappropriate decisions in the NHTSA study.
While introducing a lot of inappropriate analysis themselves. In order to conclude that Autopilot was bad, he (not they; QCS is only one guy) discarded 90% of the available data, which produced a result that completely failed statistical significance tests. So forgive me for not following the outrage here.
Now what really should happen is that Tesla should open up all their *current* data. They have over a billion miles of data now to work with and that may actually be usable to draw a valid conclusion.
Re:Who are QCS? (Score:4)
Re: Who are QCS? (Score:2)
Re: (Score:2)
Absolutely nobody.
Also, their funding? Who the heck knows
Peer review? Hahahaha, why bother with that?
Tesla's response? Why bother to post that?
The fact that if you read over this "analysis," most of the data appears to validate the NHTSA, but the author dismisses it (all but 5,714 out of 43,781 vehicles).
Not who "are", who "is". (Score:2)
The guy's name is Randy Whitfield. Based on his resume he's at least got a statistical background and a paper or two to his name, and he seems to have been completely silent since the late '80s.
The NHTSA study had its flaws, but this guy needed to cut 90% of the data to get to his conclusion, and the first six pages of his report read like a legal hit piece and a whine about the NHTSA not happily handing over everything they've done and refusing to talk to him after he sued them (surprise).
But ad hominem
Reports That Tesla's CEO is a Pedo Called 'Factual (Score:1)
funding secured, brah
(takes long drag off of a doobie)
Re: (Score:3)
Re: (Score:2)
The best cons are when the mark never knows he was conned.
Re: (Score:2)
Calling out flaws in the original study is acceptable.
Publishing another flawed study by cutting out 90% of the data and cherry-picking the outcome you want is not acceptable.
Still flawed (Score:2, Funny)
So the new number crunching by QCS ignores 18 crashes that happened before Autosteer was rolled out, because the total mileage of those vehicles prior to the crash was not known. Yet... those crashes most certainly occurred. In fact, it's quite possible that the crash itself resulted in enough destruction to the vehicle that the odometer could not be read and thus the exact mileage is not known and reported in the data.
With Autosteer enabled, more data is collected about the vehicle, so the exact mileage is known.
So what you're saying is... (Score:2, Insightful)
that a mere 18 cars in the original 'investigation' is enough to swing the results nearly 100 percentage points (+40 to -59).
Seems like yet another badly executed examination with a woefully undersized data set.
At least in this case it can be somewhat justified as the number of accidents involving Tesla cars is extremely small since there aren't that many of them out on the roads in their brief existence as a production vehicle.
After all, this whole thing started over one incident of a stupid driver misusing it.
Re: (Score:1)
Re: (Score:1)
that a mere 18 cars in the original 'investigation' is enough to swing the results nearly 100 percentage points (+40 to -59).
Those 18 cars are enough to prove that the data is flawed, because those 18 cars indicate both that they were never driven without autopilot, and that they had accidents that occurred before autopilot was installed. The NHTSA assumed that if there was no "miles driven before autopilot" then the car always had autopilot installed, yet these 18 cars all indicated no miles driven without autopilot AND an autopilot install date after their crash.
This is why they ended up removing a ton of cars from the dataset
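The contradiction the comment above describes amounts to a simple consistency check. A minimal sketch of such a check, with invented field names and dates (the actual NHTSA data layout is not public in this form):

```python
from datetime import date

# Hypothetical sanity check in the spirit of the comment above: flag any
# record that claims zero pre-Autosteer miles (implying the car always had
# Autosteer) yet reports a crash dated before the Autosteer install date.
# Field names and values are invented for illustration.
records = [
    {"vin": "A", "pre_autosteer_miles": 0,
     "crash_date": date(2016, 1, 10), "autosteer_installed": date(2015, 6, 1)},
    {"vin": "B", "pre_autosteer_miles": 0,
     "crash_date": date(2015, 3, 5), "autosteer_installed": date(2015, 9, 1)},
]

def is_inconsistent(rec):
    # Zero pre-Autosteer miles contradicts a crash that predates the
    # Autosteer install date: the car must have been driven before install.
    return (rec["pre_autosteer_miles"] == 0
            and rec["crash_date"] < rec["autosteer_installed"])

flagged = [r["vin"] for r in records if is_inconsistent(r)]
print(flagged)  # → ['B']
```

Records that fail a check like this can't be cleanly assigned to either the pre- or post-Autosteer exposure group, which is why dropping them shrinks the usable dataset so drastically.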
Re: (Score:3, Insightful)
This new report narrows down the pool of cars and crashes so much that there isn't enough data (there were not enough crashes) to get meaningful results from the analysis.
XP (Score:3)
Re: XP (Score:2)
My conclusion: most young idiots develop into older idiots.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Period. End of story
Virtually everything can be distilled down to an oversimplification if you ask a numbers person... but you don't get what the fuck I'm talking about: Sure, older people are statistically safer drivers; after all, they're generally calmer (age) and more attentive (experience). Fuck's sake, that's obvious.
I didn't say anything about driving style; I was talking about skills.
Yes, there's a difference.
Re: (Score:2)
You would think that older, more-experienced drivers are better drivers, but from what I've seen, most experienced drivers simply drive as badly as they did when they weren't experienced.
My conclusion: most young idiots develop into older idiots.
Your conclusion is based on unsystematic, anecdotal information. In contrast, the actuaries who work for insurance companies have, and use, extensive data on accidents, cost, and fault, all correlated with many driver characteristics including age, training, history of accidents and moving violations, and more. They adjust insurance premiums based on what they learn from systematic statistical analysis of this mountain of data. So, even without access to their data, we can draw conclusions about their conclusions.
Re: (Score:2)
Your conclusion is based on unsystematic, anecdotal information.
More of an observation than a conclusion: I put 40 to 50 thousand [city!] miles on the odometer annually; I therefore get the opportunity to observe more of my fellow drivers than virtually anyone else.
In a nutshell, I've found that the vast majority of older drivers are still making the same mistakes younger drivers do. Hence said conclusion.
Perhaps the essence of what I was attempting to convey was simpler than you realized.
Re: (Score:2)
Re: (Score:2)
they
...theM. :/
Autopilot (Score:4, Insightful)
Re: (Score:3)
Re: (Score:2)
Hey everybody, I think this is a fake post, or a joke. Obviously, when the internet connection is unexpectedly broken, the post ends with "NO CARRIER".
Re: (Score:2)
Re: Autopilot (Score:2)
Re: (Score:2)
Re: (Score:2)
We'll stop calling this autopilot when you stop calling a plane's autopilot autopilot.
It is just drivers assist, just like every other car manufacturer puts in their high end vehicles.
Why not simply write: "I have driven neither a Tesla nor a modern high end vehicle from any other manufacturer." It would hide your ignorant conclusion.
It turns out.... (Score:1)
https://en.wikipedia.org/wiki/... [wikipedia.org]
5000 out of 500000 - interesting sampling (Score:1)
Nice sampling
Hardly surprising (Score:2)
As a lucky Tesla owner... (Score:2)
As for the numbers... well, this story was already debunked last week on electrek [electrek.co].
Short version: the one person company who made the analysis doesn't understand statistics that well and bases his conclusion on 1% of the total amount of autopilot driven distance...
I can only think of one reason for disinformation around this subject: stock price manipulation!