Transportation Government United States Technology

Auto, Tech Industries Urge Congress To Pass Self-Driving Legislation (axios.com) 72

John Bozzella, president and CEO of Global Automakers (a trade association and lobby group of automobile manufacturers), said at an Axios event Thursday that it's "critically important" that Congress pass federal legislation on autonomous vehicles. A year ago, the House approved the Self Drive Act, but it has yet to be passed by the Senate. Axios adds: This delay is set against a growing fear in Washington, Silicon Valley and the auto industry that the U.S. will fall dangerously behind in autonomous vehicle standards and policies while China and Europe leap ahead. "My fear is we fall behind with the rest of the world," said Congressman Robert Latta (R-Ohio), chairman of the Digital Commerce and Consumer Protection subcommittee. As breakthroughs are happening on the mechanical, computer and engineering levels with regard to autonomous vehicles, "time is running out" on moving policy forward, he added.
  • by Anonymous Coward on Thursday September 13, 2018 @06:12PM (#57309940)

    All it took was Uber killing someone for them to go straight to Congress to limit their liability.

    • Or do we need some CEO to sit in a small-town jail till they can work out all of the NDA / EULA BS to not be in contempt of court?

    • One does get a bit nervous when confronted with capitalists begging to be regulated. Perhaps it's time to check our wallets and count the silverware.

    • by rtb61 ( 674572 )

      From the article "It's unrealistic to expect there won't be accidents involving autonomous vehicles during testing" ie shut the fuck up lab rats, dying is a necessary part of future profits, as long as you do the dying and they get the profits, good ole privatise the profits and socialise the losses. From our perceptive, the number of accidents acceptable on public roads during the development stage, zero. Post development, well, as long as the executives are executed when their programming kills people, I

  • by HornWumpus ( 783565 ) on Thursday September 13, 2018 @06:16PM (#57309978)

    Give it to them. Make them all develop ECUs using FAA 'commercial air' software standards!

    That's not what they meant? Too bad for them, it's what they need.

    • by Anonymous Coward

      You obviously have no idea how much more complex driving is than an actual plane autopilot if you so freely equate them by slogan.

      • I do though.

        I want current ECUs fixed BEFORE we let the bozos start shipping insanely buggy 'self-driving AI' alphas.

        If some other nation wants to make their roads the alpha test site, let them.

        • I am a conservative; I hate excessive legislation. BUT I AGREE with you. There is too much software already installed on vehicles that I do not think has been properly tested. That is a problem that crosses all of IT. And yes, I am in IT. Just saying: self-driving cars with bad software equals death. I actually would like to see this through to its conclusion. BUT I want it done right. And this time, the government needs to step in. Just like they did with air bags, the ability to open the trunk with a 50 cen
      • So you're suggesting even more stringent standards than those of the FAA? Sounds good to me.
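
        For a sense of what "commercial air" software standards demand in practice, here is a rough sketch. DO-178C Level A requires MC/DC (modified condition/decision coverage): tests must show that every condition independently affects the outcome of each decision. The function and its conditions below are invented purely for illustration.

            /* Illustrative only: an invented ECU interlock, shown with the
               MC/DC test set that DO-178C Level A would require for it. */
            #include <stdbool.h>

            /* Hypothetical rule: drive the actuator only if the ignition is on,
               the sensor reading is valid, and no fault has been latched. */
            bool actuator_enabled(bool ignition_on, bool sensor_valid, bool fault_latched)
            {
                return ignition_on && sensor_valid && !fault_latched;
            }

            /* A minimal MC/DC test set for that decision (4 cases for 3 conditions):

                   ignition_on  sensor_valid  fault_latched  ->  result
                        1             1              0            1   baseline: enabled
                        0             1              0            0   ignition_on alone flips it
                        1             0              0            0   sensor_valid alone flips it
                        1             1              1            0   fault_latched alone flips it

               Plain statement or branch coverage would accept far weaker test sets. */
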
    • by m00sh ( 2538182 )

      Give it to them. Make them all develop ECUs using FAA 'commercial air' software standards!

      That's not what they meant? Too bad for them, it's what they need.

      There are plenty of ISO standards for ECUs. ISO 26262 [wikipedia.org]

      • If current generation 'brake by wire' passes, the standard isn't close to good enough.

        • by m00sh ( 2538182 )

          If current generation 'brake by wire' passes, the standard isn't close to good enough.

          Are you talking about Toyota's "sudden acceleration" problems from 10 years ago?

      • by ThosLives ( 686517 ) on Thursday September 13, 2018 @08:07PM (#57310664) Journal

        26262 is only functional safety, though. It only covers "if some electronic part breaks, can the vehicle be made safe?" and "have the best processes to eliminate systematic errors been followed?"

        Full autonomous driving has all those pitfalls of random hardware failure and systematic design errors, plus it has SOTIF (Safety of the Intended Function) concerns. Basically: are things safe (enough) when no parts are broken and you have zero software bugs?

        SOTIF is really hard, and we don't have time-tested processes for it. Consider this: 26262 is based on around 50 years of aerospace and other industrial automation experience. We don't have that for SOTIF.

        And yes, those who say ADAS level 5 is harder than an aviation autopilot are correct: autopilot is essentially route following and very limited decision making in a highly controlled environment (autoland, for instance, happens at a controlled airport with ILS...); it is not the decision making and situational awareness in an uncontrolled environment that ADAS level 5 implies.
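
        To make the functional-safety half of that concrete, here is a minimal sketch of the kind of mechanism 26262 is aimed at: detecting that a redundant brake-pedal sensor has failed and degrading to a safe state. The sensor names, tolerance, and safe-state hook are all invented for illustration; SOTIF is about the cases this code never sees, where both channels are healthy, there are no bugs, and the intended behavior is still unsafe for the situation.

            /* Minimal sketch, not production code: a dual-channel plausibility
               check with a debounced fall-back to a safe state, i.e. the
               random-hardware-failure territory that ISO 26262 covers.
               All names and numbers here are invented. */
            #include <stdlib.h>

            #define PLAUSIBILITY_TOLERANCE 5   /* max disagreement between channels, in % */
            #define FAULT_DEBOUNCE_LIMIT   3   /* consecutive bad reads before reacting */

            /* Hypothetical hardware-access and fall-back hooks. */
            extern int  read_pedal_sensor_a(void);   /* 0..100 % pedal travel   */
            extern int  read_pedal_sensor_b(void);   /* redundant channel       */
            extern void enter_safe_state(void);      /* e.g. hydraulic fallback */

            static int fault_counter = 0;

            int get_brake_demand(void)
            {
                int a = read_pedal_sensor_a();
                int b = read_pedal_sensor_b();

                if (abs(a - b) > PLAUSIBILITY_TOLERANCE) {
                    /* Channels disagree: a broken sensor, wiring or ADC fault. */
                    if (++fault_counter >= FAULT_DEBOUNCE_LIMIT) {
                        enter_safe_state();
                    }
                    return (a > b) ? a : b;   /* until confirmed, take the higher demand */
                }

                fault_counter = 0;            /* channels agree; clear the debounce */
                return (a + b) / 2;
            }
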

      • by tsqr ( 808554 )

        That's nice. Can you point to the FAR that requires compliance for certification?

    • This is a good start, but these standards are developed for commercial air travel where pilots are assumed to be present to accommodate a system failure. Also, airplanes operate with much larger distances between them when compared to automobiles on a road. These standards just aren't good enough for self-driving cars.

    • Give it to them. Make them all develop ECUs using FAA 'commercial air' software standards!

      This definitely should happen.

  • We all know thousands of American toddlers, kids, and pets will die as a result of this.

    And every single family will sue.

  • by geekmux ( 1040042 ) on Thursday September 13, 2018 @06:27PM (#57310078)

    Greed says Fuck Safety. It's only important that our technology is first to market. Whether or not rushing to market will ultimately kill people does not matter. Greed will also regurgitate annual traffic death statistics as a justification to push forward as quickly as possible with this technology, security and integrity be damned.

    Oh well. It's not like we haven't seen infrastructure tech millions rely on get rushed to market with little or no concern for safety or security (cough, IoT, cough).

    Let's hope there won't be another Takata-grade airbag recall in our autonomous future...might be the only thing that saves your ass when the inevitable happens.

    • It's pretty simple.

      Don't buy a car without a physical throttle cable and a physical connection between brake pedal and master cylinder. Also a real transmission.

      At this point, it will be almost 20 years old. You will have to fix it/have it fixed.

  • Other than the blanket legal liability they want, why is this an imperative? The states can determine whether or not to allow autonomous vehicles to operate and under what restrictions.

    • It's not an issue for California. But in the northeast where states are small, what happens if you live in New Jersey and commute to New York every morning and then visit family in Rhode Island on the weekend (passing through Pennsylvania and Connecticut on your way)?

  • by nickmalthus ( 972450 ) on Thursday September 13, 2018 @06:30PM (#57310096)
    I am sure Silicon Valley would love nothing more than to enshrine their technologies in law and reap the royalties. Even better, ban all private ownership of cars [slashdot.org] except for the fleets they own and operate. I suppose they expect some type of return on their investment from all of the money they have spent on lobbyists. Regulations are for you, not for them. [theregreview.org]
  • It's interesting to see a Republican congressman accepting science and technology;
    being left behind on healthcare and education by the same "rivals" seems to be OK, though.

  • by BrendaEM ( 871664 ) on Thursday September 13, 2018 @07:04PM (#57310332) Homepage
    We don't need driverless cars.
    • You're right, of course. What we need are reforms of driver education, training, and testing. Bring back Driver Ed/Driver Training in high schools. Stricter DMV skills testing of current drivers, more often too, to weed out dangerous, incompetent, and angry drivers.
    • "We don't need driverless cars."

      You and I don't, perhaps. But some of the elderly, infirm, or handicapped do. And autonomous vehicles are potentially far safer than human-controlled vehicles. The problem looks to be how to debug the vehicles without killing or maiming half the population of the planet.

  • My only concern at this point is that I don't suffer permanent injuries, or die, in all the accidents I'll inevitably get into with so-called 'self-driving cars', so I can sue the living daylights out of the manufacturers and retire wealthy.
    • by zlives ( 2009072 )

      You can't sue if it was a govt mandate... two things:
      1. Oh, those poor manufacturers; they didn't even want to implement the tech, which was obviously not ready, as shown in test after test.
      2. The EULA says the driver is still responsible in case of a crash.

      • I think you're missing my sarcasm. I don't think so-called 'self-driving cars' are anywhere NEAR ready for general use, and in fact I don't think they'll ever be, not until we have full-on general AI, at least equivalent to a human brain. At present they can't 'think' at all; they have no capacity to do so, because we have no idea how a human brain does that, and we won't anytime soon either, since we don't have the instrumentality to determine that. SDCs will be a disaster. I feel bad for all the people who will d
  • It begins soon!

  • by sphealey ( 2855 ) on Thursday September 13, 2018 @07:42PM (#57310530)

    - - - - - - [I]t's "critically important" that Congress pass federal legislation on autonomous vehicles. - - - - -

    Let me guess what is "critically important" in this legislation:
    1. Elimination of all liability on the part of the automakers for accidents involving self-driving cars
    1a. Federal preemption of local and state criminal charges against automakers for accidents involving self-driving cars, including fatalities
    2. Huge dollar subsidies to manufacturers of self-driving cars
    3. Re-orientation of federal infrastructure spending toward self-driving cars
    [ I would add 3a. at the expense of pedestrians and human-centered development, but I'd just be repeating 3 ]

    Anything I missed?

    • by mentil ( 1748130 )

      I actually skimmed through the bill; it's short, at 38 pages [congress.gov].
      It does nothing about liability or handing out money to companies working on self-driving tech. However, you're right that it preempts state/local laws regarding self-driving cars. Mostly it establishes committees and tasks various bodies with reviewing/researching/planning. It also log-rolls in a couple of unrelated auto regulations mandating 'improved' headlights, and a warning indicator to remind you if someone is in the back seat when you turn the engine off.

  • by WillAffleckUW ( 858324 ) on Thursday September 13, 2018 @07:42PM (#57310534) Homepage Journal

    What?

    Too soon?

  • by Chas ( 5144 )

    Sorry.

    Liability of the automaker should not be eliminated. This is a buggy product, even today. And it's still insanely limited.
    So these automakers SHOULD share liability if their software fucks up and kills/injures someone.

    This should NOT be subsidized. This is a buggy product as-is. And simply throwing money at it isn't going to debug it any faster.

    The infrastructure for autonomous driving should be looked at when the product actually WORKS PROPERLY, and it finally makes sense.
    Doing so right now would

    • Liability of the automaker should not be eliminated. This is a buggy product, even today. And it's still insanely limited.
      So these automakers SHOULD share liability if their software fucks up and kills/injures someone.

      They should own liability in that case. They should be getting an insurance policy that covers the vehicle in self-driving mode, for ever and ever or until it is scrapped, amen.

  • An effort to pass legislation like this does not mean that automated cars will hit the road immediately. Silicon Valley knows that the slowest part of any technology to develop is the regulatory environment around it.

  • However, I refuse to share the road with them. The main problem is that an unequal share of driving responsibility exists when humans share roads with machines whose manufacturers enjoy the protection of corporate citizenship. Who is found at fault when accidents inevitably occur? If the autonomous vehicle is found at fault, who goes to jail? The CEO? I doubt that. It's the same as the New Jersey Turnpike: they have a separate set of lanes reserved for commercial vehicles so passenger cars can travel safely. Sa
  • Arizona banned Uber from continuing to test self-driving cars.
    Seems like they're basically shooting for a law that will prevent such things in the future.
    Under laws like the ones they propose, individual states would no longer be able to prevent or require special permission for SDCs.

    "(b) PREEMPTION. (1) HIGHLY AUTOMATED VEHICLES.—No State or political subdivision of a State may maintain, enforce, prescribe, or continue in effect any law or regulation regarding the design, construction, or performan
