Third Tesla Crashes Amid Report of SEC Investigation (usatoday.com) 297

An anonymous reader writes: Tesla hasn't had the best month so far as not one, not two, but a total of three crashes have been reported with the car's Autopilot self-driving system engaged at the time -- two of which resulted in fatalities. In addition, The Wall Street Journal is reporting today that the Securities and Exchange Commission is investigating whether Tesla violated securities law by failing to disclose more quickly a fatal accident in Florida in May involving a Tesla Model S that was in self-driving mode. The SEC didn't comment on the report, and Tesla issued a statement saying it has "not received any communication from the SEC regarding this issue." As for the Autopilot crash that was reported today, the driver said he activated Autopilot mode at the beginning of his trip. Tesla is looking into the crash and has yet to confirm whether or not Autopilot was a factor. Tesla CEO Elon Musk teased a "Top Secret Tesla Masterplan, Part 2" via Twitter that he is "Hoping to publish later this week."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday July 11, 2016 @07:49PM (#52493387)

    Tesla's marketing department needs a better term -- "Autopilot" implies something the car is incapable of. Just call it "cruise control" and they'd shield themselves from liability.

    • Back Seat Driver? (Score:3, Insightful)

      by Anonymous Coward

      Or maybe "co-pilot", to make it clear the driver is the captain of the car?

    • That would work if cars didn't already have a thing called cruise control, and if Autopilot weren't a completely different thing.
      • by AK Marc ( 707885 ) on Monday July 11, 2016 @08:59PM (#52493795)
        Others have a substantially similar thing. They call it cruise control. Subaru calls theirs "EyeSight" but Toyota calls theirs "cruise control with lane keeping assist". Honda calls theirs "lane keeping assist" as well.

        In general, they are adding modifiers to cruise control, and Subaru was the only other one I saw in a quick search that used a completely new name.
        • It's good to have "assist" in the name. Makes it clear that it doesn't do it for you.

    • Agreed! Surely they know how stupid people can be.

  • Autodrive cars may have to be at the FAA level of software testing / code review.

    • Autodrive cars may have to be at the FAA level of software testing / code review.

      While you bring up a valid point here, the average automobile these days doesn't get a new coat of paint every few years in order to rack up half a million miles before being retired. It would be nice if they did, but today's distribution chain would never stand for it.

  • by Anonymous Coward on Monday July 11, 2016 @07:56PM (#52493435)

    Tesla hasn't had the best month so far as not one, not two, but a total of three crashes have been reported with the car's Autopilot self-driving system engaged at the time -- two of which resulted in fatalities.

    The article about the most recent crash contradicts the summary poster's statement that two of the crashes resulted in fatalities. Only one of the crashes has resulted in fatalities.

    • by Rei ( 128717 ) on Monday July 11, 2016 @07:58PM (#52493447) Homepage

      That's exactly what I was thinking. I can't find any evidence that there has been a second fatality.

      And really, have only three Tesla vehicles crashed, period, while on Autopilot in 130 million miles? If so, that's bloody impressive. More impressive than just a statistic of 1 fatality in 130 million miles.
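      A minimal rate sketch in Python, using only the figures quoted in this comment (130 million Autopilot miles, three reported crashes, one confirmed fatality); the comparison baseline mentioned in the trailing code comment is a rough assumption, not an official statistic:

      # Back-of-the-envelope rates from the figures in the comment above.
      autopilot_miles = 130_000_000   # miles driven on Autopilot, as quoted
      crashes = 3                     # crashes reported this month, per the summary
      fatalities = 1                  # confirmed fatalities, per the thread

      per_100m = 100_000_000
      print(f"{crashes / autopilot_miles * per_100m:.2f} reported crashes per 100 million Autopilot miles")
      print(f"{fatalities / autopilot_miles * per_100m:.2f} fatalities per 100 million Autopilot miles")
      # Prints roughly 2.31 crashes and 0.77 fatalities per 100 million Autopilot miles.
      # Whether that is impressive depends on the baseline: overall US fatality rates
      # are on the order of ~1 per 100 million vehicle miles (rough assumption here),
      # and a like-for-like comparison would need matched road types and conditions.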

      • In fact, in the submission about the second crash, someone questioned whether the claim of Autopilot involvement was the product of someone's rectal extraction.
      • They meant to say "non-fatal."

        That's a problem because anybody who engages the autopilot shouldn't be breeding.

      • by fluffernutter ( 1411889 ) on Monday July 11, 2016 @08:33PM (#52493629)
        Considering Autopilot only activates on the safest sections of road, three crashes in a month is pretty damn bad.
    • There has been only one fatality according to the linked articles. Mod parent up.
    • And let's not forget, it was never confirmed that Autopilot was on in the second incident; according to Tesla's data it wasn't, but the driver hadn't responded to Tesla's calls for more information. In this case too we don't know if Autopilot was actually on; the driver says so, but that doesn't mean it was. But then again, looking at the crash photos of the third report, it seems to be a road which is on the Tesla 'problem list' (no road divider etc.), but then again, the car should notify the driver th
  • Slippery slope? (Score:4, Interesting)

    by Pezbian ( 1641885 ) on Monday July 11, 2016 @07:59PM (#52493455)

    When one attempts to make something idiot-proof, nature builds a better idiot. Not necessarily true, but we live in a world where innovators are hampered by the chance of being sued by idiots who just-don't-listen.

    "Fire is hot", "peanuts may contain peanuts", "online play not rated", "cruise control is not auto-pilot", "autopilot is experimental", etc.

    Beta-testing is work.

    What I wonder is whether the automated steering fights the driver if said driver takes over to correct a computational error.

    • Re:Slippery slope? (Score:5, Interesting)

      by larryjoe ( 135075 ) on Monday July 11, 2016 @11:16PM (#52494425)

      When one attempts to make something idiot-proof, nature builds a better idiot. Not necessarily true, but we live in a world where innovators are hampered by the chance of being sued by idiots who just-don't-listen.

      "Fire is hot", "peanuts may contain peanuts", "online play not rated", "cruise control is not auto-pilot", "autopilot is experimental", etc.

      I don't drive a Tesla, but the only message I heard about Tesla's Autopilot was the name. Yes, there are safety warnings in the manual and when you start up the car, but who actually pays attention to that? The same people who read EULAs? There's a reason the product is called Autopilot and not "assist" or "level-2", and the reason is that they want to implicitly convey the idea that they are better than the competitors with mere assist or level-2. The name is not accidental.

      • by olau ( 314197 )

        I don't drive a Tesla either.

        According to this review [motortrend.com], they are far better than the competition.

        As far as I understand, you cannot miss the warning. It's not like an EULA with walls and walls of text.

      • Re:Slippery slope? (Score:4, Informative)

        by Anonymous Coward on Tuesday July 12, 2016 @07:19AM (#52495675)

        Funny thing, Autopilot is what this is.

        You may have the idea that "Autopilot" means the plane flies itself. Nope. Typically, autopilot on a plane means it will fly straight and level until ordered otherwise. The autopilot on a plane will absolutely fly straight into another plane; the human pilot is expected to take care of that sort of thing.

        • Funny thing, Autopilot is what this is.

          You may have the idea that "Autopilot" means the plane flies itself. Nope. Typically, autopilot on a plane means it will fly straight and level until ordered otherwise. The autopilot on a plane will absolutely fly straight into another plane; the human pilot is expected to take care of that sort of thing.

          Technically autopilot implementations on airplanes do exactly what you said and require a measure of continued vigilance on the part of the pilots. However, that is not what the term means in common language. The English idiom of putting something on autopilot means that something will work without any continued vigilance. In a way, this is a brilliant marketing strategy. The term connotes self-driving while denoting strictly not self-driving. A perfect marketing term.
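          To make the "straight and level until ordered otherwise" point concrete, here is a toy hold loop in Python. It is purely illustrative: the names, gains, and structure are invented for this sketch and are not taken from any real aircraft autopilot (or from Tesla's software).

          from dataclasses import dataclass

          @dataclass
          class State:
              heading_deg: float   # current heading, degrees
              altitude_ft: float   # current altitude, feet

          def hold_straight_and_level(state, tgt_heading_deg, tgt_altitude_ft,
                                      k_heading=0.1, k_altitude=0.05):
              """Return (bank_cmd, pitch_cmd) nudging the aircraft back toward the
              commanded heading and altitude. Note what is NOT here: no traffic or
              terrain avoidance. A hold loop like this keeps flying straight at
              another aircraft; watching for that remains the pilot's job."""
              heading_err = (tgt_heading_deg - state.heading_deg + 180) % 360 - 180
              altitude_err = tgt_altitude_ft - state.altitude_ft
              return k_heading * heading_err, k_altitude * altitude_err  # proportional only

          # Example: 2 degrees off course and 200 ft low -> small corrective commands.
          print(hold_straight_and_level(State(92.0, 9800.0), 90.0, 10000.0))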

  • in the options. Plus it is cheaper. Win. Win.
  • by PopeRatzo ( 965947 ) on Monday July 11, 2016 @08:15PM (#52493537) Journal

    Because I like you guys, I'm gonna do you a solid and save you all kinds of tsuris later on. There will not be self-driving cars in any of our lifetimes. Yes, we will have something like super cruise control and driver assist, but no, you will never be able to call for your robot Uber to pick you up and drive you to your part-time job. It's just not going to happen. And finally, the people who know most about "driverless" cars are starting to come clean:

    The most realistic industry projection about the arrival of autonomous driving comes from the company that’s done the most to make it possible. Google, while never explicitly saying so, has long intimated that self-driving cars would be available by the end of the decade.

    In February, though, a Google car caused its first accident; a bus collision with no injuries. A few weeks later, Google made a significant, if little-noted, schedule adjustment. Chris Urmson, the project director, said in a presentation that the fully featured, truly go-anywhere self-driving car that Google has promised might not be available for 30 years, though other much less capable models might arrive sooner.

    Historians of technology know that “in 30 years” often ends up being “never.” Even if that’s not the case here, if you’re expecting a self-driving car, you should also expect a wait. And so you might want to do something to pass the time. Maybe go for a nice drive?

    http://www.nytimes.com/2016/07... [nytimes.com]

    Yes, you read that right. The project director for "self-driving cars" at Google just added 25 more years to his projection on when you're going to see them. And as the writer points out, most of us know that any tech prediction for 30 years down the road always ends in tears. If you go back 30 years, they were predicting tech that never showed up and mostly missed the most important tech advances that did show up.

    Now I don't have a particular interest one way or the other regarding self-driving cars, except this: I don't want to see one dollar in public funds spent to develop this technology or to create infrastructure for a self-driving fleet until we've made actual public transportation affordable and viable, the way it was early to middle last century before Standard Oil and GM conspired to destroy public transportation in the United States (and yes, they were even convicted of doing so in court). So go ahead, Google and Elon and Tim Cook and all the visionaries. Make your self-driving golf carts all you want. Just don't ask for a dollar of taxpayer money, especially not until you start paying your taxes.

    http://www.whale.to/b/street_c... [whale.to]

    http://www.baycrossings.com/Ar... [baycrossings.com]

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by OzPeter ( 195038 )

      There will not be self-driving cars in any of our lifetimes.

      You naysayer, you. It's only going to take near-human-level AI small enough to fit in a car before we can all sit back and take a ride. That's closer than fusion power and flying cars, isn't it?

    • Maybe, but there's a HUGE amount of functionality that lies between "no autonomous driving capability at all" and "able to drive your kids to school, then bring the car home and park it in the garage for you".

      The fact is, we had the technology to make semi-autonomous cars capable of lanekeeping and collision-avoidance more than TWENTY YEARS AGO. The catch is, it would have:

      * doubled or tripled the cost of an average car

      * required the construction of thousands of miles of new freeway lanes for the exclusive

    • by AmiMoJo ( 196126 )

      It really depends what you mean by "self-driving car". I think it's realistic to expect Google's cars to be better than humans on paved roads by the end of the decade. Okay, you can't tell it to drive you literally anywhere, like off-road or perhaps onto a ferry/train, but in terms of being able to take a nap on the way to work I don't think that's an unreasonable expectation.

      What will probably slow it down is the pace of legal and insurance reforms required to support it.

    • by Ogive17 ( 691899 )
      Here's my thought on why I agree that it will be a long time, or never: autonomous vehicles will not be completely effective until 100% of vehicles on the road are autonomous and they communicate with each other using an agreed-upon standard.

      So now the question is: do we eventually roll out cars with this capability and require everyone on the road to have one, or do we not allow people to use the function until all non-conforming cars are retired, or do we build a separate road system just for these cars?
  • Tesla's Autopilot is to cars what systemd is to Linux. Beta tested by the users and still not ready for anything serious. Does it compile? Great, upload it to the mirrors!

  • Lost focus (Score:5, Interesting)

    by Trogre ( 513942 ) on Monday July 11, 2016 @08:21PM (#52493567) Homepage

    Seriously, can't we just have a nice electric car without all this self-driving crap screwing it up?

    • by AK Marc ( 707885 )
      Yeah, if you don't like it, don't turn it on. And certainly don't turn it on, then hop in the back seat for a nap.
    • Yup. Unfortunately Melon Usk lost it when Ford registered focus as its trademark.
  • saved lives (Score:4, Interesting)

    by fluffernutter ( 1411889 ) on Monday July 11, 2016 @08:31PM (#52493617)
    How many lives has Tesla saved now? Is anyone keeping count??
  • Just one fatality (Score:5, Interesting)

    by wildsurf ( 535389 ) on Monday July 11, 2016 @09:07PM (#52493831) Homepage

    two of which resulted in fatalities.

    Sigh. One of the crashes resulted in one fatality. The other two crashes, no fatalities. (And it is not yet known whether Autopilot was engaged at the time of those two incidents.)

    Getting distracted with Autopilot engaged is like removing your seatbelt because you have airbags. You may be able to occasionally get away with it, but it's still an incredibly dumb thing to do. (And the former endangers other drivers, not just yourself.) The silver lining of these incidents is that maybe more drivers will start paying more attention while using AP, though it should have been up to Tesla to properly instill this sense of caution to begin with.

    And side skirts/guards should really be mandated for trailers nationwide. (They're already mandated in California.) They may not physically prevent an underride at high speed, but they don't have to; the radar is much more likely to detect them and trigger collision-avoidance braking. It's only a small patch for a small part of the problem, but better than not patching it at all.

    • two of which resulted in fatalities.

      Getting distracted with Autopilot engaged is like removing your seatbelt because you have airbags.

      Why call it Autopilot? Just call it cruise control, and don't let the driver out of the loop. I have lane change assist in my Porsche. If it does not detect input from the driver after some number of minutes, it will automatically disengage and beep out a warning. Point is, the system doesn't even pretend to be engaged if you're not.
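      A rough sketch of that kind of driver-input watchdog, in Python (hypothetical timings, names, and behaviour for illustration only; this is not Porsche's or Tesla's actual logic):

      import time

      HANDS_OFF_LIMIT_S = 120.0   # assumed: warn and disengage after 2 minutes without input

      class DriverInputWatchdog:
          """Drops out of assist mode if the driver stops providing input."""

          def __init__(self, limit_s=HANDS_OFF_LIMIT_S):
              self.limit_s = limit_s
              self.last_input = time.monotonic()
              self.engaged = True

          def on_driver_input(self):
              # Call whenever steering-wheel torque (or similar) is detected.
              self.last_input = time.monotonic()

          def tick(self):
              # Call periodically from the control loop.
              if self.engaged and time.monotonic() - self.last_input > self.limit_s:
                  print("BEEP: no driver input detected, disengaging assist")
                  self.engaged = False   # hand control back to the driver

      Real systems layer escalating warnings (visual, audible, eventually slowing the car) before a hard disengage; the point is simply that the assist refuses to stay engaged without evidence of a driver in the loop.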

      • Because Tesla's AP is nothing like Porsche's cruise control.
        In fact, Tesla is so far ahead of Porsche that it would be like comparing Porsche's lane assist, safety features and cruise control to an old-style cruise control.
      • Why are people so upset about the name? There is no evidence that these three accidents would not have happened if AP had a different name. The main problem is not the name but that AP is so capable that people have ignored the instructions and warnings and have begun uploading fun videos to YouTube where they drive from the back seat.
    • Good points, but with one issue. AP works pretty decently once a car has gone over the lane. I was impressed when I tried it on a side road 3x. In the first case, my hands were about 1/2" away and, to be honest, I was pretty nervous. It was moving from side to side, which included a curve and just a minor fall-off at the edge (it was only 20-30''). During the curve, other cars were coming and I took back control for a bit and then reverted back to AP. This went on for a couple of miles before hitting the next real i
  • *Two* of which? (Score:5, Informative)

    by Theaetetus ( 590071 ) <<theaetetus.slashdot> <at> <gmail.com>> on Monday July 11, 2016 @09:15PM (#52493869) Homepage Journal

    Tesla hasn't had the best month so far as not one, not two, but a total of three crashes have been reported with the car's Autopilot self-driving system engaged at the time -- two of which resulted in fatalities

    The first one was the guy watching the DVD who went under the truck. Or at least, all of him below the neck did. Fatality!
    The second one was in Pennsylvania, and the driver "survived a rollover crash." [slashdot.org]
    This is the third one, and "the driver said he activated Autopilot mode at the beginning of his trip."

    That's one fatality, Subby. These are your own links and summary. We expect you to read them, even if none of the posters or editors here do.

  • there were only 3 crashes

    not one, not two, but a total of three...

    and not 100!

    not one, not two, not three ... not fifty-two...

  • Anyone who knows how Autopilot works should find the report fishy. One doesn't "activate the car's Autopilot driver assist system at the beginning of the trip", i.e. at zero speed, but when the road is appropriate, like on a highway, at speed. Activating Autopilot at zero speed doesn't work.
    Further, this incident, which remains to be confirmed on several points, didn't cause any fatality. Slashdot doesn't improve its declining aura by participating in what looks like a disinformation campaign.

  • First and foremost, only the Florida crash had a fatality.
    However, it is still thought that Pennsylvania's crash did NOT involve AP, [detroitnews.com] and Montana's is still unverified (though I would expect that one to have had AP on, since it had been running for some time). IOW, at this time, it is 1 crash using AP, and only 1 fatality.

    So this brings up the question: why did an AC submit this story, which has multiple false statements, and why is /. not checking validity? In addition, why has the header NOT been updated wi
  • There has been one reported death while AP was engaged. There have been other Tesla accidents in which the owners have allegedly reported AP was in use, but as of yet these owners have refused to allow Tesla to examine the logs to confirm or deny this allegation. So far, only one of the accidents reported in the recent media is confirmed to have had AP engaged at the time of the crash. Rumours make good clickbait, but once upon a time slashdot was concerned with facts.

  • Wish Elon Musk would concentrate on a few important, difficult things rather than going for every fancy thought that crosses his mind. Self-driving cars are Google's obsession. Leave that to Google. Concentrate on getting an affordable middle-class, ok, ok, upper-middle-class electric car. The rocket is good. The Powerwall is good. The Gigafactory is good. Hyperloop is a stretch even for Elon.

    Focus, man, Focus.

  • I see they've started filming the sequel to Tucker.
