Felony Charges Are 1st In a Fatal Crash Involving Autopilot (go.com) 142
X2b5Ysb8 shares a report from ABC News: California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light, slammed into another car and killed two people in 2019. The defendant appears to be the first person to be charged with a felony in the United States for a fatal crash involving a motorist who was using a partially automated driving system. Los Angeles County prosecutors filed the charges in October, but they came to light only last week. The driver, Kevin George Aziz Riad, 27, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.
The misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve notice to drivers who use systems like Autopilot that they cannot rely on them to control vehicles. The criminal charges aren't the first involving an automated driving system, but they are the first to involve a widely used driver technology.
Early adopters... (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Which is exactly what's happening here so "the system" worked as it should.
Re:Not quite guinea pigs. (Score:5, Informative)
For the uninitiated:
TACC: Traffic Aware Cruise Control is just like Mercedes-Benz or Lexus radar-guided cruise control. The cars had a radar and a camera with which to gauge the distance between your car and the vehicle in front of you, and to slow your cruise control set point so you don't collide with the car in front of you.
Active Lane Keeping: is just as the name suggests; the car would keep to the lane of travel as long as it sensed the lane markings. If the car couldn't detect a lane, it would alert you to take over and slow down until you did.
Navigate on Autopilot: At the time, the most advanced feature. On a freeway, the two blue lane lines from Active Lane Keeping would merge into one path the car would take, and the car was able to merge from one lane of travel to the next if it recognized a lane with a dashed line and no cars in the way of the merge. It would do this if you had a destination set, and it could merge into exit lanes or freeway-to-freeway interchange lanes. When you exited the freeway, the car would come to a stop and disengage Nav on Autopilot, because it did not do city streets at all.
It sounds to me like this guy left his car in TACC or Lane Keeping and let it run into an intersection on a freeway exit, something Teslas at the time could not handle. Sometime in 2020 they rolled out a feature to detect traffic lights and stop signs. The car now stops even at green lights when TACC or Lane Keeping is engaged, until it detects it can proceed (by watching a car in front of it proceed), sees a green light, AND has the driver acknowledge it is safe to do so.
Long story short, the driver put the car in a situation where it was certain to fail, because in 2019 these cars did not stop for red lights. They do now. Reliably so.
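To make the TACC behavior concrete, here is a minimal sketch of time-headway following logic in Python. Every name and constant here is an illustrative assumption, not Tesla's actual code:

```python
# Minimal sketch of traffic-aware cruise control (TACC) following logic.
# Illustrative only: the constants, names, and control law are assumptions,
# not Tesla's implementation.
from typing import Optional

def tacc_target_speed(set_point_mps: float,
                      lead_distance_m: Optional[float],
                      lead_speed_mps: Optional[float],
                      headway_s: float = 2.0) -> float:
    """Return the speed the car should hold this control cycle."""
    if lead_distance_m is None or lead_speed_mps is None:
        # No vehicle detected ahead: hold the driver's set point.
        return set_point_mps
    # Desired gap grows with the lead car's speed (time-based headway).
    desired_gap_m = max(5.0, headway_s * lead_speed_mps)
    if lead_distance_m < desired_gap_m:
        # Too close: drop slightly below the lead car's speed to open the gap.
        return max(0.0, min(set_point_mps, lead_speed_mps - 1.0))
    # Enough gap: match the lead car or the set point, whichever is lower.
    return min(set_point_mps, lead_speed_mps)

# Cruising at 30 m/s with a car 40 m ahead doing 25 m/s: the desired gap
# is 50 m, so the car eases off to 24 m/s until the gap opens back up.
print(tacc_target_speed(30.0, 40.0, 25.0))  # 24.0
```

Note what is missing: with no lead vehicle detected, the logic simply holds the set speed, which is exactly why a 2019-era car with no traffic-light detection would sail through a red light.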
Re: (Score:2)
There is probably some liability for Tesla too. The system is designed to monitor the driver to make sure they are paying attention, but it is trivial to defeat and can't detect things like the driver being asleep with their hand resting on the wheel.
When Tesla first introduced Autopilot, it only needed to detect your hand on the wheel when entering corners below a certain radius. As the accidents and regulator interest mounted up, they first made it so that it needed to detect a hand on the wheel once every
Re: (Score:3)
I am not following your logic.
Tesla had found out its initial settings were not safe enough, so they changed them to make it safer, before a set of lawsuits and politicians who have stock in other self-driving companies made a big fuss about it.
I do not remember any time when it was set to half an hour, or ten minutes; please provide sources for this default behavior.
However (maybe I'm a bit biased), it seems that the autopilot feature is actually safer than not [cleantechnica.com]. So I don't think it is a case that Tesla is being n
Re: (Score:2)
The changes appear to have been in response to regulatory interest and lawsuits. The fact that they made those changes can be used to argue that the original system was unsafe and they should reasonably have known that before releasing it.
Re: (Score:2)
I am a lawyer but this is not legal advice. Pay my retainer if you want advice.
>The fact that they made those changes can be used to argue that
>the original system was unsafe and they should reasonably have
>known that before releasing it.
This is simply not true.
The inadmissibility of "subsequent remedial measures" is a longstanding legal principle in Anglo-American law.
To *not* do so would rather strongly *discourage* learning from experience and making changes.
Re: (Score:2)
In every case, it was the driver not paying attention. One was watching a movie, others texting, some talking on phone, etc.
Re: (Score:2)
even there, there would be shades of distinction on the level of culpability, particularly intent.
Answering a phone might "merely" be regular negligence, texting probably a higher form of negligence, and the movie watching would be, I suppose, at least somewhat less culpable than not even being behind the wheel.
I wouldn't want to be the defense attorney on *any* of them . . .
Re: (Score:2)
Tesla had found out its initial settings were not safe enough, so they changed them to make it safer, before a set of lawsuits and politicians who have stock in other self-driving companies made a big fuss about it.
False. They decreased the time to shut up whiners. Nothing more. When you have 10 deaths associated with AP over 6 years, that is pretty impressive. NO OTHER CARS are that good, with human or automated driving.
Re: (Score:2)
I'm not an expert on this area of US law, but isn't "unreasonably dangerous" a thing?
Re: (Score:2)
Re: (Score:2)
If it had worked, it would have deterred the stupid bastard from doing it in the first place. I mean, it was only two people killed, wasn't it?
Re: (Score:2)
So if I rent a car, the car company is responsible for everything I do with the car, since they are the registered owner? If you lease a car, the bank is the registered owner, so you're in the clear for the things you mention; is that what you were taught? Someone steals your car and runs over some pedestrians, you believe that you are responsible? I think someone made some stuff up or you didn't pay attention in class. Even Tesla, in their autopilot fine print, clearly states that the driver is re
Re: (Score:2)
New York traffic law: "Every owner of a vehicle used or operated in this state shall be liable and responsible for death or injuries to person or property resulting from negligence in the use or operation of such vehicle, in the business of such owner or otherwise, by any person using or operating the same with the permission, express or implied, of such owner."
The 2005 federal Transportation Equity Act specifically exempts rental and leasing companies from this, but in general if you own the vehicle you are res
Re: (Score:2)
Replace the word autopilot with cruise control (Score:2)
Re: (Score:2)
Aside from the name and the extra features, I can't see a difference between the two car features.
Adaptive cruise control [wikipedia.org] will automatically maintain a safe distance between your car and the car in front of it. But you still steer the car.
Tesla Autopilot does much more than that. Autopilot has lane control, will pass slower cars on a multilane road, and will follow map directions.
The small print says "auto assist" (Score:2)
Re: (Score:2)
Indeed. And in particular, when controlling dangerous machinery (as a car clearly is), the human is required to read and understand the fine print. There is a reason you need to get licensed in order to be allowed to control a car.
Re: (Score:2)
Well, that article was underwhelming. (Score:2)
I don't get from the article why this rises to a felony level offense. Was the driver merely inattentive or is there more there? The author goes into plenty of background involving *other* "abuses" of autopilot, but those don't directly speak to what this driver did or didn't do.
Re: (Score:2)
Running a red light leading to an accident that kills someone might be the felony level offense. Not paying attention to the road, causing you to run the red light to the point of getting in an accident that kills someone bumps it u
Re: (Score:2)
Probably some "criminally negligent" path here. In the end, the car did run a red light and killed people while it clearly was the responsibility of the driver to prevent that. He did not. And it was reasonably clear that something like that could happen. At least to somebody without mindless trust in technology. Also the instructions said the driver needs to be ready to take back control at any time.
Re: (Score:2)
The historic intent standard for murder is "reckless disregard for human life," while for manslaughter it is criminal negligence.
Criminal negligence is more than regular negligence, but not all the way to actual awareness of the risk imposed.
Operating a driver-assistance system that requires driver attention and deliberately defeating the system that monitors that attention would probably reach at least criminal negligence, if not reckless disregard.
Re: (Score:2)
Unless I'm missing something from the article, this wasn't charged as criminally negligent, this was charged as the higher level manslaughter where one intended the act but not the outcome. Most states have this distinction although I will admit I have not looked at California.
Re: (Score:2)
"criminally negligent" isn't a crime. Rather, it's the required mental state for a small number number of crimes, particularly manslaughter.
Common Law recognized murder, voluntary manslaughter, and involuntary manslaughter.
Murder required a reckless disregard for human life (roughly awareness of the risk), voluntary manslaughter was a reduced form of murder with reached culpability by provocation (classic case being finding spouse in bed with another), and involuntary manslaughter required only criminal ne
Re: (Score:2)
Alright. But inattentive driving isn't usually charged at this level per se. If this were a regular car and the person were, for instance, looking at their phone and this happened, the driver wouldn't be charged with this level of felony (if even charged at all).
What is the difference here? (I have no idea)
People WILL NOT watch the car and intervene (Score:2)
It's impossible for a human to do that long term, reliably. Unless you're autistic or something. It's harder and more stressful for a human to sit and be vigilant in case the automation fails in some way, than it is to just drive the car. So it isn't any wonder that human nature just ramps down the stress and the (stupid) person stops paying attention, eventually becoming neglectful. Unless the technology is perfect, this will happen over and over again. Either perfect the technology or make the human drive
Re: (Score:2)
Re: (Score:2)
I don't agree.... I mean, sure - a LOT of people out there are irresponsible, and others get overly confident in technology like Autopilot after driving with it enabled for long enough. But my experience using it on highways for road trips is, it makes it a *little* easier than "just driving the car yourself". In those scenarios, you're often driving many dozens of miles without even passing another vehicle, so the "vigilant watching" is only minimally stressful. Meanwhile, you're saving yourself that con
Re: People WILL NOT watch the car and intervene (Score:2)
It won't stress you because you'll be using your phone for most of it.
Trying to stay attentive for miles of empty road is the problem. Manual steering is unforgiving; manual steering with primitive drift correction is uncomfortable enough that you'll want to prevent it from kicking in. Miles of comfortable automated lane keeping with a stopped Honda Civic at the end, which the car won't react to, and then there's a problem.
Re: (Score:2)
I tend to agree. Monitoring a car constantly is probably quite a bit more stressful than actually driving it. Still, that only means people should not use the "self-driving" features in the first place. You start the car, things are your responsibility, no excuses. Tesla may share some blame for marketing a product only suitable for experts to regular people, but that one is tricky. There are quite a few high-end cars, for example, that regular people cannot drive safely. Still no blame on the manufacturers
source code or you must acquit! (Score:2)
source code or you must acquit!
Detroit? (Score:2)
Is "Detroit" supposed to be shorthand for "car stuff"? Does someone from Detroit want to make sure a negative article about Telsa gets coverage?
Sounds adequate to me (Score:2)
Tesla never had a fully autonomous system and probably will not get there for quite a while yet. While their ads and feature naming imply differently, if you look at the actual docs they tell you that you need to monitor what the car is doing constantly. And if you do not, and the car kills somebody, that is indeed manslaughter.
Criminalizing driver-assist paradigm (Score:2)
Technology is on trial. Driver-assist technologies are enabling distracted driving. What good is technology that's just going to put me behind bars? Why buy it in the first place? Why use it? Why pay for it? That's how this is going to play out going forward.
Driving through a red light? Stupid human tricks kill. People die. Error-prone wetware, distracted drivers, and attractive nuisances such as cellphones, in-car entertainment, and technology that a driver is also in charge of while a car is movi
Re:What color pants was the driver wearing? (Score:5, Informative)
It "Involved" autopilot because the driver was using it.
It didn't cause the crash though, the driver did that by being a fucking moron.
Re: What color pants was the driver wearing? (Score:3)
The system ignores human nature; in a system where the human is the last line of defence against highly frequent system failure, that isn't excusable. The system induced inattentiveness. If the Teslas at the time had at least been able to reliably react to oncoming static objects, maybe it would be excusable, but they were not, so it was not.
I blame regulators too; they should have required lane keeping to be uncomfortable. Lane keeping should, shortly after losing user input, start staying in lane by swerving rather
Level 4 can be unsafe / fail = PRISON if this case (Score:2)
Level 4 can be unsafe / Level 4 fail = PRISON if this case goes the wrong way.
Re: (Score:2)
Or... the right way? Isn't that what courts do with the application of laws?
Re: Level 4 can be unsafe / fail = PRISON if this (Score:2)
If the level 4 car could only react to static objects with the reliability of past Autopilot software, I think prison for anyone using it would be appropriate. For the people marketing and certifying it, too.
The reliance on radar and its inability to react to or warn of static objects was a known property at the time, a fact to which many firetrucks could attest. The system should have been designed around that fact and some common-sense understanding of human nature.
Re: (Score:2)
Re:What color pants was the driver wearing? (Score:4, Interesting)
It "Involved" autopilot because the driver was using it.
It didn't cause the crash though, the driver did that by being a fucking moron.
Moronism(?) is becoming more and more prevalent though. Perhaps a look at what makes so many more people morons is in order. Might even be some discussions on /. in the future.
Re: (Score:2)
Perhaps a look at what makes so many more people morons is in order.
https://news.slashdot.org/stor... [slashdot.org]
Re: (Score:2)
This crash has nothing to do with Autopilot, but that does not matter does it? The car has autopilot that is good enough to be exploitable by the anti-Tesla vultures.
This isn't the first time Autopilot [go.com] has been involved in a death. And Tesla apparently wanted to sweep it under the rug without telling anyone.
Exposing flaws in "self-driving" software and holding people accountable is the bare minimum that needs to be done. Or are you saying you're perfectly fine with vehicles flinging themselves into the path [businessinsider.com] of other vehicles and preventing the driver from taking control [thenextweb.com]?
Re:What color pants was the driver wearing? (Score:5, Insightful)
In Q4 2021, Tesla recorded only one crash for every 4.31 million miles driven with Autopilot engaged, as per the company’s recently-released Safety Report. Tesla also recorded one crash for every 1.59 million miles when Autopilot was not engaged. Based on Tesla’s report, there is a significant difference between the results when Autopilot was activated and when drivers did not use it.
For comparison with overall automobiles in the US, the National Highway Traffic Safety Administration’s (NHTSA) most recent data stated there was one automobile crash every 484,000 miles driven in the United States. The NHTSA also reported that vehicle fatalities increased in the first half of 2021. According to the agency’s report, an estimated 20,160 people died in US motor vehicle crashes from January to June 2021, up 18.4% over 2020. The agency noted the 20k-estimate was the largest number of projected fatalities since 2006. Note that 20k deaths was in six months, it will likely be 40k for the year. 40,000 killings by humans with their vehicles.
TESLA SAVES LIVES. That cannot be disputed by anyone who still has rationality in their thought process.
Are you saying you're ok with not reducing the number of accidents by 10 times?
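Taking the quoted figures at face value, the implied ratios are quick to check (these are the numbers cited above; the replies below explain why they are confounded):

```python
# Crash-rate ratios implied by the figures quoted above.
miles_per_crash_ap = 4_310_000     # Autopilot engaged (Tesla Q4 2021 report)
miles_per_crash_no_ap = 1_590_000  # Autopilot not engaged (same report)
miles_per_crash_us = 484_000       # NHTSA overall US figure

print(miles_per_crash_ap / miles_per_crash_us)     # ~8.9x vs. the US average
print(miles_per_crash_ap / miles_per_crash_no_ap)  # ~2.7x vs. Tesla without AP
```

So the "10 times" claim is really closer to 9x against the US average, and 2.7x against Teslas driven manually.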
Re:What color pants was the driver wearing? (Score:5, Informative)
In Q4 2021, Tesla recorded only one crash for every 4.31 million miles driven with Autopilot engaged, as per the company’s recently-released Safety Report. Tesla also recorded one crash for every 1.59 million miles when Autopilot was not engaged
A major flaw in the reasoning here is that engaging autopilot is self-selected. If autopilot was randomly on some miles and off others, then it would be valid. But it's not random. People choose to have it on or not based on conditions. And it's far more often on for easy miles, like freeway driving when not merging or frequently changing lanes, and off for harder miles, like city traffic with many intersections and driveways. The confounding factor - some miles are safer than others - can't be separated out from the autopilot factor.
It would be like a drug trial in which healthy and sick people get the drug, and you let them choose the drug or the placebo, which of course they do based largely on whether they are sick or not! The drug looks bad because only the sick people took it, and they didn't do as well as the healthy people who took the placebo.
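A toy calculation with invented numbers shows how large this self-selection effect can be; none of these rates are real, only the structure of the argument matters:

```python
# Toy demonstration of the self-selection confounder described above.
# All rates are invented; only the structure of the argument matters.

# Suppose highway miles are inherently safer than city miles, and the
# same car with the same driver crashes at the same rate either way.
crash_rate_highway = 1 / 5_000_000  # crashes per mile (invented)
crash_rate_city = 1 / 500_000       # crashes per mile (invented)

# Autopilot gets engaged mostly on easy highway miles; manual driving
# covers the hard city miles.
ap_rate = 0.95 * crash_rate_highway + 0.05 * crash_rate_city
manual_rate = 0.40 * crash_rate_highway + 0.60 * crash_rate_city

# Autopilot looks ~4.4x "safer" even though it changed nothing about
# the per-mile risk of either road type.
print(manual_rate / ap_rate)  # ~4.4
```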
Re: (Score:3)
Re:What color pants was the driver wearing? (Score:5, Interesting)
If autopilot is on, the car gets in trouble, the driver intervenes (which shuts off autopilot), but it's too late to stop a crash... does Tesla say the autopilot was on or off during the crash?
I don't trust anything a company tells me anymore. All data can be massaged into anything they want, especially "telemetry".
Re: (Score:2)
I feel that autopilot and similar systems are the way to go in the long run, but I'm really curious why it is that Teslas have one crash per 1.59 million miles when driven manually and other cars have one every 484 thousand miles. It makes me wonder if they're actually measuring the same thing, or if it's stuff like nobody lets a 16-year-old drive a Tesla, and they have more of the crashes. Do you have a pointer to the NHTSA stats? All I can find is fatality reports, and one table that says police reported 6.75 milli
Re: What color pants was the driver wearing? (Score:2)
Re: (Score:2)
Re: (Score:3)
We don't know how often drivers crash in the situations in which Autopilot is engaged.
My autopilot will disable most features as soon as it starts raining. So if we're comparing 4.31 million miles on controlled freeways without traffic lights during fair weather to downtown Manhattan in a rainstorm at night... yeah, you're going to get fewer crashes.
That's not to say that it's impossible that autopilot is safer. It's just to say that we have no idea, because we don't have any useful data to say one way or anothe
Re: (Score:2)
That's surprising. I say that because Advanced AutoPilot was very willing to run in pretty heavy rain and generally did better than I did at picking out white semi-truck trailers in the heavy spray on I-880. I could pick them out, but AutoPilot was normally about a second quicker in detecting them.
I suspect the biggest problem AutoPilot has is that the majority of the early miles were driven in California and so they have an outsized amount of data for California weather and roads which are skewing the mo
Crappy stats (Score:4, Insightful)
Hopefully a first year student would be able to explain that you have to account for confounding factors. AP will only be engaged in good weather on good quality roads. It switches off if it can't find white lines. It switches off if the weather is bad. It shouldn't be used where there are hairy traffic conditions. Once you correct for all those then no doubt you'll find it isn't particularly dangerous, but not particularly safe either. Tesla of course have access to that data but Musk prefers to make a fool of himself by using uncorrected stats.
Re: (Score:2)
Re: (Score:2)
you are comparing apples and oranges (Score:2)
"1.59 million miles when Autopilot was not engaged"
"one automobile crash every 484,000 miles driven"
Those do not represent the same dataset. Tesla Autopilot is not self-driving; it is enhanced steering. As such it can only steer/brake/accelerate in a lane, in other words only in circumstances where you are going straight ahead. THAT alone eliminates a lot of the crashes ("36 percent of all automobile accidents happen at intersections. More than 480,00
Re: (Score:2)
What BS? Have you provided any evidence that autopilot, overall, is more dangerous than not having it enabled?
Re: (Score:2)
Re: Still BS (Score:2)
I have numbers in evidence showing the strong possibility that autopilot is safer. You have speculation as to why the numbers are what they are. Now you have theories as to why those numbers may not be proof of autopilot being better than a human driver, yet you act like it is a proven fact that autopilot is more dangerous than a human driver. You need to either prove that your speculation is true or stop trying to block autopilot.
Re: (Score:2)
Re: (Score:3)
Or are you saying you're perfectly fine with vehicles ...
No fatal accident is "perfectly fine", but let's keep some context here: Human-driven cars kill over 100 people every day in America and about 3000 per day worldwide.
It is not reasonable to expect self-driving cars to be perfect. As long as SDCs are killing fewer people per mile than HDCs, they should stay on the road, and we should continue to work to improve them.
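For the per-mile comparison suggested above, the human baseline works out roughly like this (ballpark public figures, treated here as assumptions):

```python
# Rough per-mile fatality baseline for human-driven cars in the US,
# using widely cited ballpark figures (assumptions, not exact data).
deaths_per_year = 40_000         # approx. annual US road deaths
vehicle_miles_per_year = 3.2e12  # approx. annual US vehicle miles traveled

baseline = deaths_per_year / vehicle_miles_per_year
print(baseline * 1e8)  # ~1.25 deaths per 100 million miles

# The parent's test: an SDC earns its place on the road when its audited
# deaths-per-mile figure comes in below this human baseline.
```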
Re: (Score:2)
Your last link is total horseshit. Even the author questions it while reporting it.
Any time that Autopilot is engaged, including the FSD beta, sufficient pressure on the steering wheel in either direction, or a tap of the brakes will cause the system to disengage completely. There is NO mode where it will fight for control unless you're a 2-year-old who can't muster the strength to move the steering wheel past the threshold where it disengages Autopilot.
And 2-year-olds should not be driving cars.
Re: What color pants was the driver wearing? (Score:2)
Comfortable automatic lane keeping combined with the car's inability to react to stopped objects at high speed was a lethal combination. It's a step too far in automation that adds nothing to safety; pretending society must allow the entire package of automation features or none of it is a false dichotomy.
Far more than Tesla being targeted, it's the Tesla reality distortion field, which hides the above fundamental truth, that killed these people.
Re: What color pants was the driver wearing? (Score:2, Interesting)
PS: it also rammed a lot of police cars and fire trucks. Tesla got lucky that the system induced the killing of two people in a cheap car and not a cop. Even their reality distortion field wouldn't have been able to save them from that.
Re: What color pants was the driver wearing? (Score:2)
That is the false dichotomy of whole package or none of it. Killed two people here.
Re: What color pants was the driver wearing? (Score:5, Insightful)
In Q4 2021, Tesla recorded only one crash for every 4.31 million miles driven with Autopilot engaged, as per the company’s recently-released Safety Report. Tesla also recorded one crash for every 1.59 million miles when Autopilot was not engaged. Based on Tesla’s report, there is a significant difference between the results when Autopilot was activated and when drivers did not use it.
You're assuming the miles driven with autopilot are under similar conditions to the non-autopilot miles. If people are disproportionately engaging autopilot for low-risk driving scenarios like highway driving then one would expect a big delta regardless of whether autopilot worked.
Without knowing the conditions it's entirely possible that auto-pilot increases accidents.
Tesla saves lives, that cannot be disputed by anyone who still has rationality in their thought process.
Those numbers you have were accidents, not serious accidents or fatalities. Again, it's possible that autopilot reduces fender benders but when a failure does happen, due to driver distraction, it's far more likely to result in a fatality.
We simply don't have the data.
Re: (Score:2)
> We simply don't have the data.
You have fallen victim to the fallacy fallacy. "Nobody knows!" is a complete cop-out given the absolutely ridiculous amount of data that is collected and aggregated about auto accidents.
The data absolutely exists to compare AP vs. human performance in identical circumstances. Just because it doesn't show up in the "marketing number" doesn't mean that it cannot be done; it's the NTSB's job to do it and determine if the AP feature should be permitted on public roads. They employ a
Re: (Score:2)
This crash has nothing to do with Autopilot, but that does not matter does it? The car has autopilot that is good enough to be exploitable by the anti-Tesla vultures.
FTA:
In the Tesla crash, police said a Model S was moving at a high speed when it left a freeway and ran a red light in the Los Angeles suburb of Gardena and struck a Honda Civic at an intersection on Dec. 29, 2019. Two people who were in the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene. Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.
Criminal charging documents do not mention Autopilot. But the National Highway Traffic Safety Administrat
Re: (Score:2)
It is. Blame marketing.
Re: (Score:2)
It isn't.
Autopilot in aircraft is more like cruise control in a car than any other feature. It does not detect and avoid obstacles, and it will happily fly into a mountain. That people think autopilot in aircraft is more than it actually is isn't even a marketing fail; it is an education fail. Tesla's Autopilot is much more than aircraft autopilot.
Re: (Score:2)
Re: (Score:2)
Assuming what autopilot is can be found on the driver's license tests?
Re: (Score:2)
No, it has everything to do with it. (Score:2)
This isn't about autopilot, but about how people will use autopilot.
Right now, autopilot-enabled cars are a relative rarity, but when they become more common, we may find the manner in which people use autopilot, expecting it to drive for them, creates more problems than the technology itself. Anti-lock brakes, for example, which gave idiot drivers more control in slippery situations, actually lengthened stopping distances. My sister found herself in an accident in broad daylight on dry pavement bec
Re:No, it has everything to do with it. (Score:4, Insightful)
For the average driver, who doesn't know how to brake better than ABS without locking up and skidding, it's not a reduction in effectiveness.
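For anyone curious what ABS is actually doing under the pedal, here is a minimal sketch of the slip-ratio idea; the numbers and structure are illustrative, not any vendor's algorithm:

```python
# Minimal sketch of the slip-ratio control idea behind ABS.
# Illustrative only: real systems modulate pressure many times per second
# per wheel, with far more sophisticated estimation than this.

TARGET_SLIP = 0.2  # peak tire grip sits near ~20% slip on dry pavement

def abs_brake_pressure(vehicle_speed_mps: float, wheel_speed_mps: float,
                       requested_pressure: float) -> float:
    """Modulate brake pressure to keep wheel slip near the grip peak."""
    if vehicle_speed_mps < 1.0:
        return requested_pressure  # too slow for slip control to matter
    slip = (vehicle_speed_mps - wheel_speed_mps) / vehicle_speed_mps
    if slip > TARGET_SLIP:
        # Wheel is heading toward lock-up: release pressure to regain grip
        # (and, critically, steering authority).
        return requested_pressure * 0.5
    return requested_pressure  # grip available: apply what was asked

# Car at 30 m/s, braking wheel at 20 m/s (33% slip): pressure is halved.
print(abs_brake_pressure(30.0, 20.0, 100.0))  # 50.0
```

The design point is that a wheel held near the grip peak both stops the car and keeps steering, which is what a panicked driver standing on the pedal cannot do.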
Re: (Score:3)
It bears mentioning that when ABS was allowed in Formula-1, every driver used it. I guess that's because Ayrton Senna was a pretty mediocre driver, right?
Re: (Score:2)
Re: (Score:2)
A modern ABS system with traction control outperforms all but the expert professional driver.
I owned a number of Lotus Elise/Exige cars. The first one had no electronics at all: no ABS, no power steering. The brakes and accelerator pedal were directly connected, no modern "by wire" as in all cars these days. I once came over a hill to find the traffic stopped in my lane; I hit the brakes and locked up, and when it was obvious I wouldn't stop in time, I had to release the brakes to stop skidding an
Re: (Score:2)
You're acting like this is the first inattentive driver to crash into something.
Not even close. This guy was operating the vehicle in an unsafe manner by not paying attention to the road while in the driver's seat and the vehicle was moving. It doesn't matter how many driver assistance techno-gizmos he had turned on at the time. He was negligent, and now must face the consequences for that negligence.
under some DUI LAWS just having an app can = DUI a (Score:3)
Under some DUI laws, just having an app can = a DUI, as you are considered in control / have the keys on you.
Re: (Score:2)
Re: (Score:2)
The only way to get there is forward. We cannot put the technology on a shelf and expect it to mature. Autopilot will always be involved in accidents. The real question is "are there fewer accidents per mile" with autopilot vs. human drivers.
Re: (Score:2)
Re: (Score:2)
You have no information to back up that claim.
Re: can someone translate (Score:2)
Re: (Score:2)
Just like both your comments.
I always find it somewhat amusing how much one can infer about someone from so few words: salty, offended, grammatically lazy, inarticulate, absolute statements from self applied authority, lacks conviction to post under their own account...
Ah-ha!! Hey: you wouldn't happen to be one of those BuzzFeed 'journalists' who 'learned to code,' by chance?
Re: (Score:2)
For the first time, a driver who was using autopilot and got into a fatal crash has been charged with a felony. (I gotta wonder how many such crashes there have been. The ones I remember, the driver was killed, so charges would have been pointless...)
Re: (Score:2)
Re: (Score:3)
Easier explanation (Score:2)
We already know from decades of study that younger drivers are disproportionately more likely to crash. They are also much less likely to be wealthy enough to afford a Tesla.
What Tesla is doing is skewing their driver base toward older, more experienced, and safer drivers, which makes their safety record look good, when it may merely be the result of their owner demographic.
Re: (Score:2)
Re: (Score:2)
Based on Tesla’s report, there is a significant difference between the results when Autopilot was activated and when drivers did not use it.
TESLA SAVES LIVES. That cannot be disputed by anyone who still has rationality in their thought process.
Well, unless that rationality considers the possibility of confounding variables. For example, is it possible that Autopilot tended to be engaged in more benign situations?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
10 of these involved AP/FSD.
Of the 231 deaths, 105 were in the Tesla.
There is no safer car on the road, and apparently, with AP, it is even safer.
wrong (Score:2)
With that said: 1M cars, 6+ years of AP, and only 10 deaths while using AP. Hmmmm. Sounds impressive to me.