US Regulators Investigating Tesla Over Use of 'Autopilot' Mode Linked To Fatal Crash (cnbc.com)
An anonymous reader quotes a report from CNBC: The U.S. National Highway Traffic Safety Administration said on Thursday it is opening a preliminary investigation into 25,000 Tesla Motors Model S cars after a fatal crash involving a vehicle using the "Autopilot" mode. The agency said the crash involved a 2015 Model S operating with automated driving systems engaged and "calls for an examination of the design and performance of any driving aids in use at the time of the crash." The investigation is the first step before the agency could seek to order a recall if it believed the vehicles were unsafe. Tesla said Thursday the death was "the first known fatality in just over 130 million miles where Autopilot was activated," while worldwide a fatality happens about once every 60 million miles. The electric automaker said it "informed NHTSA about the incident immediately after it occurred." The May crash occurred when a tractor trailer drove across a divided highway on which a Tesla in Autopilot mode was traveling. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla's windshield. Separately, Tesla quietly settled a lawsuit with a Model X owner who claimed his car's doors would open and close unpredictably, smashing into his wife and other cars, and that the Model X's Autopilot feature poses a danger in the rain.
There had to be a first case... (Score:4, Insightful)
It was bound to happen sooner or later.
Luckily for Tesla this sounds like it couldn't have been avoided in any way.
There will be more... but, like Tesla says, their Auto-pilot system has thus far proven VERY safe. What remains to be seen is how the world reconciles the fact that there will always be outliers...
Re:There had to be a first case... (Score:5, Insightful)
Luckily for Tesla this sounds like it couldn't have been avoided in any way.
On the contrary, this seems like exactly the type of collision that autopilot systems should offer vastly improved protection against:
"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.
Seems like a pathetically primitive excuse for an "electronic eye" to me.
Re: (Score:3, Interesting)
Re: (Score:3)
This death happened in the very area of the car Tesla bragged about when the test machine broke during a roof crush test, the roof apparently withstanding the weight of four cars. With a trailer moving sideways crushing the car, you would think that if the roof was that strong, the car would be pushed sideways instead of going under.
From another story I read, the Tesla was at speed when it hit the trailer that was across the road. Momentum caused the bottom edge of the trailer to shear off the top of the car (and probably the top of the driver) as it passed underneath. It's not like the car was parked and the trailer rolled over it.
Re: (Score:2)
Re: (Score:2, Informative)
Yes that is what appears to have happened.
Washington Post has the actual Florida Hwy Patrol traffic diagram from the accident:
https://www.washingtonpost.com/news/the-switch/wp/2016/06/30/tesla-owner-killed-in-fatal-crash-while-car-was-on-autopilot/
Re:There had to be a first case... (Score:5, Insightful)
Re: (Score:3)
The trailer didn't crush the roof, it cut it right off at mid-windshield, where the support is minimal on all sides due to the windows. It probably took the driver's head off as well.
Re:There had to be a first case... (Score:4, Insightful)
Crush is not the same as shear.
Re: There had to be a first case... (Score:2, Interesting)
Sounds to me like whatever beam they use to look for things in the way passed under the trailer; they'd need a wider vertical range to ensure low-hanging objects are detected.
If that's the problem, I'd call that negligence in design.
Re: (Score:3)
Yeah, I totally want my car to apply the emergency brakes every time I go under a bridge!
Maybe the answer is to fix the trucks, not mess with a system that has half the accident rate of human drivers.
Re: There had to be a first case... (Score:4, Insightful)
Sounds like this was near the crest of a hill. The problem Tesla identified happens because radar doesn't map the road. The radar could (and it sounds like it did) report that the object was, say, 15 feet above the car's elevation, but with radar alone there is no way to know whether the "sign" is past the crest of the hill. Because the Tesla was climbing, a return that was 20 feet above the car at 100 feet away might be 12 feet above it at 50 feet away, and only 4 feet above it at 20 feet away. It sounds like Tesla's software categorized this return once and, because it never moved, never re-evaluated it; otherwise it would at least have hit the brakes in the last few feet.
Lidar also maps the road, so the moment it got a scan of the road surface under the truck, it could calculate the height of the "sign" above the road and know whether the car would clear it. We ran both lidar and radar: lidar has issues with rain/snow/fog, while radar handles those conditions but lacks precision (it misses smaller, dark objects). In this case, though, the lidar would know the road height, so even if the truck had been all black, the radar hit would have been fused with the lidar data and its elevation known to be not far above the road surface.
Re:There had to be a first case... (Score:5, Insightful)
Why couldn't it see the tires, undercarriage, and side reflectors? And if the image was so washed out that it couldn't make out the outline, then the car shouldn't have been moving so fast in the first place. It violated the Basic Speed Law just as surely as if it had been driving at the speed limit in heavy fog, and that's a programming error.
It would also help to upgrade the camera to one with a wider dynamic range and/or more resolution so the image is less likely to get washed out again.
So there's a software fix and a hardware fix that will prevent this from happening again in the future. Unavoidable, my foot!
Re: (Score:2)
Too bad that he died, but good that this moron didn't take anyone else with him; a moron who put his faith in an opt-in beta function instead of following the instructions.
Re: (Score:2)
Re: (Score:3)
It does seem like a bit of an oversight to have distance sensors that only work at grille height, not right up to the height of the roof. When Nissan recently demoed their auto-pilot system, it had sensors mounted at the top corners of the windscreen.
Secret faster than Ludicrous speed mode (Score:5, Funny)
In addition to the Tesla having Ludicrous speed mode, it appears there's a secret Kevorkian speed mode.
Re:There had to be a first case... (Score:5, Insightful)
Having the car on autopilot, but requiring the driver to "pay attention" and be ready to take over within seconds is the worst combination possible.
If the car did not have autopilot, the driver would have been more attentive, since he would have been driving the car.
If the car was completely on autopilot, then one hopes the computer would detect objects on the road, at least the bigger ones, like lorries and tanks.
What is the reason one would switch the car to autopilot? Most likely so that they can be less attentive to the road. If I need to be as attentive as when driving an older car, then I will not use the autopilot. The reason is that the constant minor adjustments I usually have to make (the road is not straight, after all) help me keep my attention on the road.
An autopilot that leaves the driver with nothing to do most of the time, but requires him to react really fast when something bad happens, is a problem, because boredom reduces attentiveness.
Also, let's say the driver was paying attention to the road. The car is on autopilot, he sees that a lorry is getting closer to him. How is he supposed to know that this is the time the computer will fail to notice a huge lorry (it noticed much smaller cars and pedestrians with no problems before) and take over?
Re: (Score:3, Insightful)
Having the car on autopilot, but requiring the driver to "pay attention" and be ready to take over within seconds is the worst combination possible.
No. Having the human in full control is worse. Despite this fatality, Autopilot still has a far better safety record than human drivers.
Need more data (Score:5, Insightful)
Autopilot still has a far better safety record than human drivers.
There is insufficient data to take that claim seriously at this point. Autopilot features are promising but need a LOT more miles in real world conditions before it is safe to make generalizations like that. I know a lot of people have high hopes for autopilot (myself included) but let's not let our expectations get in the way of scientific evidence.
One thing I do feel comfortable stating is that more people are going to die before autopilot features become truly safe. There are lots of corner cases that we're going to have a hard time predicting and that we'll only learn about after some accidents occur. This is the case with any new transportation technology. Airlines are very safe these days, but early on they were significantly less so because there were problems we didn't know about yet. Airliner windows are rounded because we learned the hard way about stress fractures from sharp-cornered windows, which wasn't obvious at the time. People had to die to learn that lesson. This won't be the last time someone dies making autopilot safe.
Re:There had to be a first case... (Score:5, Insightful)
The balance is that Tesla will learn from this accident. They will change the software on the existing vehicles to try to detect this situation better, and they will undoubtedly outfit the next generation of cars with improved sensors to avoid this specific accident.
In contrast, in a human-driven car, the only one who learned anything is dead, so the next person who gets in the same situation will likely react the same way and end up just as dead. At best, there might be a slight change to driver education because of it, but it isn't worth adding e.g. an extra lesson to the curriculum to avoid one specific accident.
Re:There had to be a first case... (Score:5, Interesting)
1 in 100 million will still result in hundreds of deaths a year, considering how often cars are used...
How is that worse than the one in a million caused by human drivers?
YouTube is full of videos of people driving along minding their own business when somebody else falls asleep at the wheel and drifts into their lane.
Just because tech isn't perfect doesn't mean it isn't better than the existing system.
Attention vs autopilot (Score:3)
Having the car on autopilot, but requiring the driver to "pay attention" and be ready to take over within seconds is the worst combination possible.
Not in all cases. We've had cruise control for many years, which is a crude form of autopilot. It's been my experience that (surprisingly) it doesn't result in paying less attention to the traffic around you. If anything, my experience has been the opposite: I tend to pay as much or more attention when the cruise control is on. I've spoken with other drivers who have experienced the same thing. What you are saying has merit, but there is some nuance there too. I don't think it is quite as simple as that.
Re:There had to be a first case... (Score:5, Insightful)
My opinion is that Tesla's self-driving system is not nearly as safe as they claim. One doesn't have to look very hard to find videos like this [youtube.com] one where the driver has to react to prevent the auto-pilot from causing a crash. I question how long, realistically, a production Tesla can stay on the highway before a human needs to intercede to prevent an accident.
Given enough time, and enough lawsuits, I think that Tesla will shut off their self-driving feature. It needs to be a lot more robust than it currently is. I can't say with any expertise, but it seems like their competitors are taking their autonomous vehicle research far more seriously, with plans to install more sophisticated sensor packages on their cars.
Re: (Score:2)
Depending on how close the truck was when it left its lane, standing on the brake might not have helped.
Re:There had to be a first case... (Score:4, Informative)
Depending on how close the truck was when it left its lane, standing on the brake might not have helped.
The truck wasn't in a lane -- it was crossing the highway (and perpendicular to the lane the Tesla was driving in) while making a left turn (presumably from a road that intersects the highway).
Re: (Score:3)
Looking again, I see that. Yeah, if the driver had been paying attention at all rather than relying on autopilot, he could have stopped.
Re:There had to be a first case... (Score:5, Insightful)
Re:There had to be a first case... (Score:5, Insightful)
Re:There had to be a first case... (Score:5, Interesting)
That seems more dangerous than having the driver do all or most of the work
It may seem that way, but nevertheless there is overwhelming evidence that Autopilot improves safety. You should look at actual data rather than relying on gut feelings about what "seems" to be true.
Re: (Score:3)
Tesla doesn't _HAVE_ a self-driving feature.
They have an autopilot feature, which, just like in real planes, requires the pilot/driver to retain situational awareness at all times.
Re: (Score:3)
Even if the driver was paying attention (in that case, what is the point of the autopilot?), how is he supposed to know that this is the one time the computer is going to fail to stop the car? I mean, similar situations, but with other cars or pedestrians, may have happened before and the computer stopped the car. So, one may believe that if the computer can see a pedestrian, then it most definitely can see a lorry which is many times bigger than a pedestrian. By the time you notice that the computer is not going to stop, it is too late.
Re: (Score:3)
I too am not sure I agree, but, there's one thing I'm particularly curious about, and that's the role of the truck driver in the crash.
I'm not familiar with the 'rules of the road' in the US, but I did look at the police sketch of the accident scene. The truck turned left, crossing oncoming traffic. The truck driver is reported as saying "The Tesla was moving so fast I didn't even see it" or words to that effect.
Yet, as far as I can tell, there's no indication that the Tesla was exceeding the speed limit for that road.
Re: (Score:2)
Sure you could: you can't just look for objects on the pavement, you need to have your radar looking for any object up to the full clearance height (plus a few inches) at minimum. This exposes a hole in their algorithm: the software isn't looking at objects above a certain height, which is foolish.
Re: (Score:2)
My first thought: only one person killed? How many is GM or Chrysler up to this year alone?
Re: (Score:2)
"How safe are driverless cars going to be if Roombas don't even work right yet? "
How safe are nuclear pressure vessels if a PET bottle with water in it will explode when heated in a microwave? Clearly we're using alien zero point technology and not water boilers for power production.
Re: (Score:2)
What kind of sandwich runs on thorium?
Re: (Score:2)
What kind of sandwich runs on thorium?
Here's one [nih.gov]
Re: (Score:2)
Re: (Score:2)
The problem is that replacing subways and trains with self-driving cabs will make the cost of public transportation skyrocket. It's telling that the people who are pushing this
Re: (Score:2)
The problem is that replacing subways and trains with self-driving cabs will make the cost of public transportation skyrocket.
I don't think so. If I exclude parking, it is cheaper for me to drive than to take a bus or train. Poor people don't use public transit because it is cheaper than driving, they use it because they can't afford to own a car. But with self-driving-taxis, they don't need to own it, and the cost of the car is amortized across many more people. SDCs will likely be cheaper, and certainly far more convenient, than current mass transit. Even poor people value their time.
So twice as safe then? (Score:3, Insightful)
That's still pretty impressive if it's twice as safe as letting a human drive.
Even more so after seeing all the videos on youtube with people in the back of the car letting tesla drive.
Re: (Score:2)
I would bet that factlet wouldn't stand up to much scrutiny.
Re: (Score:2)
Divided highway driving is relatively safe. Autopilot is basically highway driving.
They compare their highway fatality rate to everyone else's total worldwide fatality rate.
How many Teslas have been sold in places like India, Africa, or South America, where driving death statistics are insane?
Re: (Score:2)
And it's not twice as safe as careful drivers; it's twice as safe as all drivers, including those who fall asleep, drive drunk, are on drugs, etc. Not a good standard to match.
"They" who? (Score:2)
If you read the Tesla blog post that the (biased?) article is talking about, instead of the (clearly biased) summary, you'll see that Tesla actually points out that the US fatality rate is 1 per 94M miles (though the blog post does also mention the 60M worldwide number). Blame CNBC for picking the worse comparison point (and the submitter for choosing that article, I guess), not Tesla. What "they" are you talking about here? [teslamotors.com]
Re: (Score:2)
I'd like to know how many highway miles is typical before saying it's impressive.
The real issue right now is that self-driving cars seem to cause accidents.
The Google cars are in far more not-at-fault accidents relative to a normal driver, which implies they're actually quite bad drivers, doing weird shit.
Re: So twice as safe then? (Score:2)
Re: (Score:2)
That's still pretty impressive if it's twice as safe as letting a human drive.
Even more so after seeing all the videos on youtube with people in the back of the car letting tesla drive.
It's not clear to me that the Tesla system is safer than a human based on the quoted numbers. First, the incidents are very unlikely for either human or Tesla, so it's not clear that once in 130e6 or once in 60e6 miles is statistically different. Second, the populations for the two numbers are definitely different. Tesla owners are clearly not representative of the general population, so the more apples-to-apples comparison is between Autopilot and manual driving for the same type of drivers, probably ch
Re: (Score:3)
You realize that traffic fatalities are a multiple-times-daily occurrence in the USA alone, right? That's not some fuzzy guesstimate; it's about as statistically sound as you could hope for. 94M miles (the number Tesla gives per fatal accident in the US, which is a better comparison than the idiot submitter and CNBC author chose to display) is nothing in a country with over 2.5 times that many vehicles. The worldwide rate is, if anything, possibly less well-established just because it's hard to collect accurate data.
Re: (Score:2)
The problem is that the computer crashed the car in a very stupid way that most likely would have been prevented if a human was driving the car. I mean the computer failed to notice a huge lorry. I would have noticed it even without my glasses.
If the accident was something more like the ones sober people cause, it would not be such big news. However, if a human driver caused such an accident, people would think that he was really inattentive (searching for his phone on the floor, texting etc) or drunk.
So, a
Re: (Score:2)
The problem is that the computer crashed the car in a very stupid way that most likely would have been prevented if a human was driving the car. I mean the computer failed to notice a huge lorry. I would have noticed it even without my glasses.
I remember talking with someone who shared an experience she had had a few weeks earlier. She was a road construction worker holding a stop sign, and there were a few cars stopped. A big tractor trailer came to a stop; a little bit later the back of the trailer jumped up and went back down. She radioed in that she thought something was wrong and went to investigate. A motorcycle had come to a complete stop behind the trailer, but the pickup truck driver behind the motorcycle wasn't paying attention and didn't stop in time.
Re: (Score:2)
Of course, if the driver is busy writing a text message or browsing the net (or drunk, or asleep), then he may not even notice a cargo ship.
However, when this happens, the driver is (correctly) blamed and ridiculed for it, since normally you may not notice a cyclist or a pedestrian (dressed in black clothes at night with no reflectors), but a tractor trailer is kinda obvious.
This is why it is bad if the computer makes a mistake most human drivers wouldn't. I am sure it normally has much better visibility and reaction times than a human driver.
Re: (Score:3)
Of course, if the driver is busy writing a text message or browsing the net (or drunk, or asleep), then he may not even notice a cargo ship.
In the UK, being an island nation, we've honed our cargo ship detection capabilities and spot them on the roads even when we're drunk, texting or turned to talk to the kids in the back.
This explains the absence of large container and freight vessel related injuries on the UK roads.
Not twice as safe I feel (Score:5, Insightful)
Re:Not twice as safe I feel (Score:5, Informative)
Well, comparing to the worldwide accident rate might not be reasonable. Some countries have very high accident and fatality rates. One should compare to the accident rate in the same locations.
According to Wikipedia [1], the fatality rate for driving accidents in the US is about 15 per billion miles, which is about 1 per 65 million miles.
[1] https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Well, comparing to the worldwide accident rate might not be reasonable. Some countries have very high accident and fatality rates. One should compare to the accident rate in the same locations.
According to Wikipedia [1], the fatality rate for driving accidents in the US is about 15 per billion miles, which is about 1 per 65 million miles.
[1] https://en.wikipedia.org/wiki/... [wikipedia.org]
The comment you answered asked about fatalities on highways. Highways are typically much safer than any other type of road, so you can't compare highway-only numbers to the general average.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
"the first known fatality in just over 130 million miles where Autopilot was activated," while a fatality happens once every 60 million miles worldwide..
It is quite disingenuous, as it compares US high-end-vehicle driver statistics with worldwide statistics that include third-world countries where driving can be borderline suicidal. As a quick comparison via Google: the Insurance Institute for Highway Safety [iihs.org] reports that as of 2014 (the last year for which stats are available, including all vehicle types), there were 32,675 vehicle crash-related fatalities. By state, that ranges between one fatality per 68 million miles driven (South Dakota) and one per 161 million miles driven (Vermont).
Re: (Score:2)
Wow... I messed that up by picking the second worst and second best somehow. Worst case is 1 in 60.6 million miles (South Carolina); best case is 1 in 175.4 million miles (Massachusetts). These figures include fatalities of motorcyclists/bicyclists/pedestrians as well as fatally injured drivers with blood alcohol content (BAC) >= 0.08.
Re: (Score:2)
No statistic is, especially when it comes from the PR department...
What's that saying? Lies, damned lies, and statistics? No, the other one... Figures never lie, but liars figure.
Autopilot fatalities? (Score:5, Interesting)
Re:Autopilot fatalities? (Score:5, Interesting)
If my car is in autopilot, and I take control of the vehicle just before dying in an accident, is it considered an autopilot fatality?
Depends on whose lawyer you ask. You can bet the counsel for the automobile manufacturer is going to blame the dead person...
Re: (Score:2)
Re: (Score:2)
In this particular scenario: You're dead. You don't care.
don't worry guys (Score:4, Informative)
Tesla's autopilot mode is still in beta, so it isn't a big deal. Because it's apparently OK to sell cars that have only been beta tested at most.
Tesla noted that customers need to acknowledge that autopilot "is new technology and still in a public beta phase" before they can turn it on. Drivers also acknowledge that "you need to maintain control and responsibility for your vehicle."
Re: (Score:2)
EULAs... Gotta have them, I guess.
I'm surprised it's not a 45-page legal waiver of liability you have to acknowledge every time you hit the button.
Re: (Score:3)
bottom of the trailer hit the Tesla vehicle's wind (Score:2)
>bottom of the trailer hit the Tesla vehicle's windshield
a.k.a. the driver got decapitated, Smokey and the Bandit style :o
Rushing things to market that can KILL YOU (Score:3, Insightful)
Re: (Score:2)
IF we are talking about probabilities.... Why on earth did you put guns on your list. Gun deaths are pretty rare overall, even with the weekly tallies from places like Chicago... I believe you are more likely to drown than get shot...
Re: (Score:2)
IF we are talking about probabilities.... Why on earth did you put guns on your list. Gun deaths are pretty rare overall, even with the weekly tallies from places like Chicago... I believe you are more likely to drown than get shot...
Well, in the USA, fatalities due to traffic and guns are about equal (around 30K/year each, or about 2 per state per day). Note that gun deaths are about 1/3 homicide and 2/3 suicide.
Re: (Score:2)
Actually this is a good thing for the autopilot. (Score:5, Insightful)
According to the article this was not something the driver could see and avoid, and while the autopilot could not see it either, based on the data from this crash it *could* see it the next time. Drivers learn from their own experience and fatal crashes terminate their learning experience, while autopilots learn from ALL autopilots on the road, and there are no "fatalities".
Of course it is always the fault of humans in the end, in this case the tractor trailer was not supposed to be there, so we'll only have perfect records when we get rid of all the drivers and have all the cars on autopilot.
Re: (Score:2, Insightful)
"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.
The driver likely didn't notice because he wasn't paying attention. That is something drivers can see when driving: you can see the front of the truck cross in front of you, you understand the sky doesn't suddenly shift colors, you understand previously seen road doesn't suddenly turn into sky, and if all that escapes you, you can still see the truck's wheels along the road. The car probably saw them too and figured it would fit between them. Why didn't the radar see the truck? Too angled towards the road?
Re:Actually this is a good thing for the autopilot (Score:5, Informative)
Why didn't the radar see the truck?
The radar saw the truck's trailer, but misidentified it as an overhead sign, because it was so high off the ground.
Did the driver die?
Yes.
If so, how do we know he didn't notice the truck?
If he had noticed the truck, he presumably would have applied the brake. (we'll have to assume the driver wasn't feeling suicidal)
Re: (Score:2)
If I understand correctly, he was an enthusiastic early adopter, putting up YouTube videos that showed how well the computer in his Tesla did. He was probably reading a book or something, just to show off his smart car.
Re: (Score:2)
Re: (Score:2)
Tesla is clear in their manual and click-through agreement that autopilot is not a self-driving car, but a sophisticated driver assist. Most people won't read anything, though, so I suspect the majority of Tesla owners don't understand this.
What's the point? Most things people like about cars are subjective. I like the (fairly limited) driver assists in my car a lot. The intelligent cruise control is great for limiting your top speed when a cop is around, and the "beep if you're in a dangerous situation" feature is handy too.
Re: (Score:2)
Re: (Score:3)
Sure, just be perfect. Great plan for humans.
Re: (Score:2)
No, this was something the driver did not avoid. We don't know if/when the driver saw it, or if the driver could see it.
It is likely the driver was staring at their phone and not looking at the road because they assumed the car's autopilot mode worked.
Re: (Score:2)
Autopilot mode isn't "hands off" or true autonomous driving. In fact, Tesla's implementation doesn't even use the GPS. It's really a more sophisticated lane keeping and cruise control system. It can change lanes, but you have to command it to do so.
In fact, if
Re: (Score:2)
Re: (Score:3)
If this had been a Google car, there would have been no accident. Google has much better radar, which maintains a model of all vehicles in the vicinity before they turn into your path, and it's mounted high enough that it would not miss a trailer.
Re: (Score:2)
"so we'll only have perfect records when we get rid of all the drivers and have all the cars on autopilot."
And all the cyclists and all the children and all the senior citizens and all the animals....
Basically the only way to get perfect records is to create roads that are dedicated to driverless cars. Then a very complicated problem suddenly becomes fairly easy to solve.
Re: (Score:2)
Nah, it's not self-preservation, it's the car. If I die, I'm dead, shrug. If I wreck the car, there's a ton of insurance paperwork, I've got to sort out a rental replacement, I need to search for a new car, there are delays, there's hassle, it's expensive.
Easier to just avoid having an accident.
What if the tractor trailer was on autopilot? (Score:5, Interesting)
This would not have occurred I suspect.
Welcome to the creepy valley (Score:3)
Until "autonomous" means exactly that, we will have people lulled into not paying attention, and a driving system that cannot handle everything that is thrown at it. The result will be crashes.
No manner of EULA, or cries of BETA will get around that predictable result.
Expecting human nature to change to match your product's limitation is a fool's journey.
Re: (Score:2)
Expecting human nature to change to match your product's limitation is a fool's journey.
Tell that to the collectivists who are in favor of various styles of socialist or communist forms of government.
Strat
Of course the AutoPilot would see the truck (Score:2)
It would be a high priority to look for vehicles crossing unexpectedly. That is a major cause of accidents. And not that hard to do with just stereo vision.
However, the article contains no useful information. Did the AutoPilot actually see it? Did it interpret it correctly? Did it try to slow down at all? At what point after the truck started moving (but before it entered the other road) did the AutoPilot react?
Those are the critical questions.
I think it would be safe to assume that the auto pilot did slow down once the truck was in front of the car, but that, of course, would be too late.
Re: (Score:2)
Never hit the brakes. Saw the trailer, thought it was a sign. Forward radar is aimed at road level, didn't detect trailer and apparently thought the tractor was clear.
Re: (Score:2)
I think it would be safe to assume that the auto pilot did slow down once the truck was in front of the car, but that, of course, would be too late.
From TFA:
"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.
Why isn't it the trucks fault (Score:5, Interesting)
It sounds like the truck crossed the highway without leaving enough time for the oncoming cars to avoid it, but all we hear is how the autopilot is at fault. I can understand how the sensors missed the trailer, and that is going to be something all developers will have to add to their tests (when seeing a rig with a space behind it, check for tires).
We are going to see cases like this come up now and again with self-driving cars, but there won't be a need for a recall. What should happen is that an alert goes out to the owners of the cars while the manufacturers check their systems. If their cars pass the tests, then they can send out messages to their customers. If not, then they create an update, test it, verify it, and send it out. Until owners of the cars hear that the system has been verified, they need to be extra vigilant when such an event happens.
Re:Why isn't it the trucks fault (Score:5, Informative)
I wonder how the car got under the trailer. Is there no regulation for trailer impact protection?
In Europe, trailers are required to have strong bars on the sides and back to prevent cars from getting underneath. The sides must also have flat covers, which improve visibility.
Here's an example: https://de.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:3)
According to the diagram of the crash (In this WaPo article linked from another reply in this thread) [washingtonpost.com] it looks like the truck was making a left turn onto a side street, across the path of the Tesla coming from the other direction. The accident seems to have occurred here at US27A and NE 140 Ct. [google.com]
So it was an un-signaled intersection, at a typical grade crossing of a rural 4-lane US highway divided with a grass median. This meant that the truck apparently crossed when it was not safe to do so. I don't see any
Trailer design (Score:5, Informative)
There seems to be a big flaw in the design of the trailer that allowed this to happen.
In the UK, HGV trailers are required to have side and rear run-under prevention to stop this very thing from happening.
http://www.transportsfriend.or... [transportsfriend.org]
Re: (Score:2)
Tbrakes.sys has caused a system error. Hold down start to reboot.
Re: (Score:2)
The article may or may not be a hit piece, but the regulator will determine whether the current Tesla software is safe enough for public roads.
I don't think it is, but I'm not the regulator.