Crime Transportation

Felony Charges Are 1st In a Fatal Crash Involving Autopilot (go.com) 142

X2b5Ysb8 shares a report from ABC News: California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light, slammed into another car and killed two people in 2019. The defendant appears to be the first person to be charged with a felony in the United States for a fatal crash involving a motorist who was using a partially automated driving system. Los Angeles County prosecutors filed the charges in October, but they came to light only last week. The driver, Kevin George Aziz Riad, 27, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.

The misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve notice to drivers who use systems like Autopilot that they cannot rely on them to control vehicles. The criminal charges aren't the first involving an automated driving system, but they are the first to involve a widely used driver technology.

  • Early adopters... (Score:5, Insightful)

    by RussellTheMuscle ( 2783037 ) on Tuesday January 18, 2022 @05:48PM (#62186015)
    can also become legal guinea pigs. At least he'll get a mention in Wikipedia!
    • Comment removed based on user account deletion
      • When I was taught to drive a motor-coach, the simple rule was this: the registered owner is responsible for anything the vehicle does.

        Which is exactly what's happening here, so "the system" worked as it should.
        • by saloomy ( 2817221 ) on Wednesday January 19, 2022 @12:22AM (#62186823)
          A few things to note, though. Yes, the driver was responsible, but he was also negligent. In 2019, Tesla Autopilot was basically three technologies: TACC, Active Lane Keeping, and Navigate on Autopilot (freeways).

          For the uninitiated:
          TACC: Traffic Aware Cruise Control is just like Mercedes-Benz or Lexus radar-guided cruise control. The cars had a radar and a camera with which to gauge the distance between your car and the vehicle ahead, and would lower your cruise control set point to avoid colliding with the car in front of you.
          Active Lane Keeping: just as the name suggests, the car would stay in the lane of travel as long as it sensed the lane markings. If the car couldn't detect a lane, it would alert you to take over and slow down until you did.
          Navigate on Autopilot: at the time, the most advanced feature. On a freeway, the two blue lane lines from Active Lane Keeping would merge into one path the car would take, and the car could merge from one lane of travel to the next if it recognized a lane with a dashed line and no cars in the way of the merge. It would do this if you had a destination set, and could merge into exit lanes or freeway-to-freeway interchange lanes. When you exited the freeway, the car would come to a stop and disengage Navigate on Autopilot, because it did not handle city streets at all.

          It sounds to me like this guy left his car in TACC or Lane Keeping and let it run into an intersection on a freeway exit, something Teslas at the time could not handle. Sometime in 2020 they rolled out a feature to detect traffic lights and stop signs. The car stops even at green lights when TACC or Lane Keeping is engaged, until it detects that it can proceed (by watching a car in front of it proceed), sees a green light, AND has the driver acknowledge it is safe to do so.

          Long story short, the driver put the car in a situation where it was certain to fail, because in 2019 Teslas did not stop for red lights. They do now. Reliably so.
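
          For the curious, reading the above literally, the post-2020 stop-light behavior boils down to a conjunction of checks, with "stop" as the default. A rough Python sketch (every name and structure here is invented for illustration; this is obviously not Tesla's actual code):

            # Hypothetical sketch of the traffic-light behavior described above.
            # Nothing here is Tesla code; the default action is always "stop".
            from dataclasses import dataclass

            @dataclass
            class IntersectionView:
                light_is_green: bool       # vision stack reports a green light
                lead_car_proceeded: bool   # the car ahead pulled through
                driver_confirmed: bool     # driver acknowledged it is safe

            def may_proceed(view: IntersectionView) -> bool:
                # The car slows to a stop for every detected light, even a green
                # one, and only continues when all three checks hold.
                return (view.light_is_green
                        and view.lead_car_proceeded
                        and view.driver_confirmed)

            # In 2019 none of this existed: with TACC or Lane Keeping engaged,
            # the car would carry its set speed straight into the intersection.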
          • by AmiMoJo ( 196126 )

            There is probably some liability for Tesla too. The system is designed to monitor the driver to make sure they are paying attention, but it is trivial to defeat and can't detect things like the driver being asleep with their hand resting on the wheel.

            When Tesla first introduced Autopilot, it only needed to detect your hand on the wheel when entering corners below a certain radius. As accidents and regulatory interest mounted, they first made it so that it needed to detect a hand on the wheel once every
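
            Whatever the exact interval (the figure is cut off above), the check itself is just a timer on steering-wheel torque. A hypothetical Python sketch with made-up numbers, to show why it is so easy to defeat:

              # Invented illustration of a torque-based "attention" monitor;
              # the intervals are placeholders, not Tesla's actual values.
              NAG_INTERVAL_S = 30.0   # made up: torque must be sensed this often
              GRACE_S = 15.0          # made up: warning window before disengaging

              def attention_state(seconds_since_torque: float) -> str:
                  # Classifies "attention" purely from steering torque. The flaw:
                  # a hand (or a weight) resting on the wheel supplies torque, so
                  # a sleeping driver can still read as "attentive".
                  if seconds_since_torque < NAG_INTERVAL_S:
                      return "attentive"
                  if seconds_since_torque < NAG_INTERVAL_S + GRACE_S:
                      return "nag"                # flash the cluster, chime
                  return "disengage_and_slow"     # hand back control, slow down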

            • I am not following your logic.
              Tesla found that its initial settings were not safe enough, so they changed them to make the system safer, before a set of lawsuits and politicians with stock in other self-driving companies made a big fuss about it.

              I do not remember any time when it was set to half an hour, or to 10 minutes; please provide sources for this default behavior.

              However (maybe a bit biased), it seems that the Autopilot feature is actually safer than not [cleantechnica.com]. So I don't think it is a case of Tesla being negligent.

              • by AmiMoJo ( 196126 )

                The changes appear to have been in response to regulatory interest and lawsuits. The fact that they made those changes can be used to argue that the original system was unsafe and they should reasonably have known that before releasing it.

                • by hawk ( 1151 )

                  I am a lawyer but this is not legal advice. Pay my retainer if you want advice.

                  >The fact that they made those changes can be used to argue that
                  >the original system was unsafe and they should reasonably have
                  >known that before releasing it.

                  This is simply not true.

                  The inadmissibility of "subsequent remedial measures" is a longstanding legal principle in Anglo-American law.

                  Admitting such evidence would rather strongly *discourage* learning from experience and making changes.

              • Tesla found that its initial settings were not safe enough, so they changed them to make the system safer, before a set of lawsuits and politicians with stock in other self-driving companies made a big fuss about it.

                False. They decreased the time to shut up whiners. Nothing more. When you have 10 deaths associated with AP over 6 years, that is pretty impressive. NO OTHER CARS are that good, with human or automated driving.

        • If it had worked, it would have deterred the stupid bastard from doing it in the first place. I mean, it was only two people killed, wasn't it?

      • So if I rent a car, the car company is responsible for everything I do with the car, since they are the registered owner? If you lease a car, the bank is the registered owner, so you're in the clear for all the things you mention; is that what you were taught? Someone steals your car and runs over some pedestrians, you believe that you are responsible? I think someone made some stuff up or you didn't pay attention in class. Even Tesla, in their Autopilot fine print, clearly states that the driver is responsible.

        • by bws111 ( 1216812 )

          New York traffic law: "Every owner of a vehicle used or operated in this state shall be liable and responsible for death or injuries to person or property resulting from negligence in the use or operation of such vehicle, in the business of such owner or otherwise, by any person using or operating the same with the permission, express or implied, of such owner."

          The 2005 federal Transportation Equity Act specifically exempts rental and leasing companies from this, but in general if you own the vehicle you are responsible.

          • by jbengt ( 874751 )
            That only covers negligence. But the owner can sue the driver if it was the driver's negligence. It usually doesn't get that far unless the vehicle is uninsured or under-insured.
  • Aside from the name and the extra features, I can't see a difference between the two car features. I'm sure there are past instances of mentally deficient people using cruise control in the city and murdering people with their car through negligence too.
    • Aside from the name and the extra features, I can't see a difference between the two car features.

      Adaptive cruise control [wikipedia.org] will automatically maintain a safe distance between your car and the car in front of it. But you still steer the car.

      Tesla Autopilot does much more than that. Autopilot has lane control, will pass slower cars on a multilane road, and will follow map directions.
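
      To put "maintain a safe distance" in concrete terms: adaptive cruise control is essentially a time-gap rule layered on ordinary cruise control. A rough Python sketch (constants invented for illustration, not any manufacturer's calibration):

        TIME_GAP_S = 2.0   # illustrative desired following gap, in seconds
        GAIN = 0.5         # illustrative proportional gain on the gap error

        def acc_target_speed(set_speed: float, ego_speed: float,
                             lead_distance: float, lead_speed: float) -> float:
            # Speed (m/s) the cruise controller should track: match the lead
            # car, corrected toward the desired gap, never above the driver's
            # set point. Steering remains entirely with the driver.
            desired_gap = TIME_GAP_S * ego_speed      # gap grows with speed
            gap_error = lead_distance - desired_gap   # negative = too close
            return max(0.0, min(set_speed, lead_speed + GAIN * gap_error))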

  • The human is still responsible.
    • by gweihir ( 88907 )

      Indeed. And in particular, when controlling dangerous machinery (as a car clearly is), the human is required to read and understand the fine print. There is a reason you need to get licensed in order to be allowed to control a car.

    • But the question is whether Tesla has a valid reason to think all humans are capable enough to take over safely if there is a problem. Have they done studies? Common sense would dictate that some humans get distracted when doing nothing in a car for long periods of time; what has Tesla done to determine this is not the case?
  • I don't get from the article why this rises to a felony level offense. Was the driver merely inattentive or is there more there? The author goes into plenty of background involving *other* "abuses" of autopilot, but those don't directly speak to what this driver did or didn't do.

    • by tlhIngan ( 30335 )

      I don't get from the article why this rises to a felony level offense. Was the driver merely inattentive or is there more there? The author goes into plenty of background involving *other* "abuses" of autopilot, but those don't directly speak to what this driver did or didn't do.

      Running a red light and causing an accident that kills someone might be the felony-level offense. Not paying attention to the road, causing you to run the red light to the point of getting in an accident that kills someone, bumps it up.

    • by gweihir ( 88907 )

      Probably some "criminally negligent" path here. In the end, the car did run a red light and killed people while it clearly was the responsibility of the driver to prevent that. He did not. And it was reasonably clear that something like that could happen. At least to somebody without mindless trust in technology. Also the instructions said the driver needs to be ready to take back control at any time.

    • by hawk ( 1151 )

      The historic intent requirement for murder is "reckless disregard for human life", while for manslaughter it is criminal negligence.

      Criminal negligence is more than regular negligence, but not all the way to actual awareness of the risk imposed.

      Operating a driving assistance system that requires driver attention and deliberately defeating the system that monitors that attention would probably reach at least criminal negligence, if not reckless disregard.

      • Unless I'm missing something from the article, this wasn't charged as criminally negligent manslaughter; this was charged as the higher-level manslaughter, where one intended the act but not the outcome. Most states have this distinction, although I will admit I have not looked at California.

        • by hawk ( 1151 )

          "criminally negligent" isn't a crime. Rather, it's the required mental state for a small number number of crimes, particularly manslaughter.

          Common Law recognized murder, voluntary manslaughter, and involuntary manslaughter.

          Murder required a reckless disregard for human life (roughly, awareness of the risk), voluntary manslaughter was a reduced form of murder with culpability reduced by provocation (the classic case being finding one's spouse in bed with another), and involuntary manslaughter required only criminal negligence.

  • It's impossible for a human to do that long term, reliably. Unless you're autistic or something. It's harder and more stressful for a human to sit and be vigilant in case the automation fails in some way than it is to just drive the car. So it isn't any wonder that human nature just ramps down the stress and the (stupid) person stops paying attention, eventually becoming negligent. Unless the technology is perfect, this will happen over and over again. Either perfect the technology or make the human drive.

    • Comment removed based on user account deletion
    • by King_TJ ( 85913 )

      I don't agree.... I mean, sure - a LOT of people out there are irresponsible, and others get overly confident in technology like Autopilot after driving with it enabled for long enough. But my experience using it on highways for road trips is that it makes it a *little* easier than "just driving the car yourself". In those scenarios, you're often driving many dozens of miles without even passing another vehicle, so the "vigilant watching" is only minimally stressful. Meanwhile, you're saving yourself that constant effort.

      • It won't stress you because you'll be using your phone for most of it.

        Trying to stay attentive for miles of empty road is the problem. Manual steering is unforgiving; manual steering with primitive drift correction is uncomfortable enough that you'll want to keep it from kicking in. But miles of comfortable automated lane keeping, with a stopped Honda Civic at the end that the car won't react to: that's where there's a problem.

    • by gweihir ( 88907 )

      I tend to agree. Monitoring a car constantly is probably quite a bit more stressful than actually driving it. Still, that only means people should not use the "self-driving" features in the first place. You start the car, things are your responsibility, no excuses. Tesla may share some blame for marketing to regular people a product only suitable for experts, but that one is tricky. There are quite a few high-end cars, for example, that regular people cannot drive safely. Still, no blame on the manufacturers.

  • source code or you must acquit!

  • I'm confused why the dateline said "Detroit", then went on about a California case. After the article there's a note that says the AP reporter is in L.A. with a researcher in New York.
    Is "Detroit" supposed to be shorthand for "car stuff"? Does someone from Detroit want to make sure a negative article about Telsa gets coverage?
  • Tesla never had fully autonomous driving and probably will not get there for quite a while yet. While their ads and feature names imply differently, if you look at the actual docs, they tell you that you need to constantly monitor what the car is doing. And if you do not, and the car kills somebody, that is indeed manslaughter.

  • Technology is on trial. Driver-assist technologies are enabling distracted driving. What good is technology that’s just going to put me behind bars? Why buy it in the first place? Why use it? Why pay for it? That’s how this is going to play out going forward.

    Driving through a red light? Stupid human tricks kill. People die. Error-prone wetware, distracted drivers, and attractive nuisances such as cellphones, in-car entertainment, and technology that a driver is also in charge of while a car is moving.
