A Rogue Robot Is Blamed For a Human Colleague's Gruesome Death (qz.com) 407

A new lawsuit has emerged claiming a robot is responsible for killing a human colleague, reports Quartz. It all started in July 2015, when Wanda Holbrook, "a maintenance technician performing routine duties on an assembly line" at an auto-parts maker in Ionia, Michigan, called Ventra Ionia Main, "was 'trapped by robotic machinery' and crushed to death." From the report: On March 7, her husband, William Holbrook, filed a wrongful death complaint (pdf) in Michigan federal court, naming five North American robotics companies involved in engineering and integrating the machines and parts used at the plant: Prodomax, Flex-N-Gate, FANUC, Nachi, and Lincoln Electric. Holbrook's job involved keeping robots in working order. She routinely inspected and adjusted processes on the assembly line at Ventra, which makes bumpers and trailer hitches. One day, Holbrook was performing her regular duties when a machine acted very irregularly, according to the lawsuit reported in Courthouse News. Holbrook was in the plant's six-cell "100 section" when a robot unexpectedly activated, taking her by surprise. The cells are separated by safety doors and the robot should not have been able to move. But it somehow reached Holbrook, and was intent on loading a trailer-hitch assembly part right where she stood over a similar part in another cell. The machine loaded the hardware onto Holbrook's head. She was unable to escape, and her skull was crushed. Co-workers who eventually noticed that something seemed amiss found Holbrook dead. William Holbrook seeks an unspecified amount of damages, arguing that before her gruesome death, his wife "suffered tremendous fright, shock and conscious pain and suffering." He also names three of the defendants -- FANUC, Nachi, and Lincoln Electric -- in two additional claims of product liability and breach of implied warranty. 
He argues that the robots, tools, controllers, and associated parts were not properly designed, manufactured or tested, and not fit for use. "The robot from section 130 should have never entered section 140, and should have never attempted to load a hitch assembly within a fixture that was already loaded with a hitch assembly. A failure of one or more of defendants' safety systems or devices had taken place, causing Wanda's death," the lawsuit alleges.
  • And so it begins... (Score:3, Interesting)

    by Oswald McWeany ( 2428506 ) on Tuesday March 14, 2017 @08:06AM (#54035777)

    And so it begins...

    • Yeah, what could go wrong with dumb robots everywhere, and pesky people getting in the way?

    • begins it so does /me repeats in the story //snarky mcsnarkface
    • by TheRaven64 ( 641858 ) on Tuesday March 14, 2017 @08:36AM (#54035991) Journal
      Begins? This sounds exactly like the sort of issue from the start of the industrial revolution, when people were routinely mauled by machinery with inadequate safety standards. About 200 years too late for 'and so it begins'.
      • by Jawnn ( 445279 )

        Begins? This sounds exactly like the sort of issue from the start of the industrial revolution, when people were routinely mauled by machinery with inadequate safety standards. About 200 years too late for 'and so it begins'.

        No shit. "Man killed by industrial machinery when a safety feature failed." That may be news, but not outside the local community.

    • by syn3rg ( 530741 )
      Someone get Elijah Baley on the case....
    • by grumbel ( 592662 )

      It already began back in 1979 [wikipedia.org], humans getting killed by robots is nothing new.

  • by sizzzzlerz ( 714878 ) on Tuesday March 14, 2017 @08:06AM (#54035779)

    You had the switch on "kill" rather than "assemble".

  • Industrial accident (Score:5, Informative)

    by kav2k ( 1545689 ) on Tuesday March 14, 2017 @08:16AM (#54035847)

    A failure of one or more of defendants’ safety systems or devices had taken place, causing Wanda’s death.

    That's it. That's all this lawsuit is about: faulty failsafes on industrial equipment that led to an accident. Probably with merit.

    But sure, call it "rogue robots" and "killing"...

    • by ScentCone ( 795499 ) on Tuesday March 14, 2017 @08:18AM (#54035859)
      Yup. "By a robot" will be the new "with a computer."
    • by Baron_Yam ( 643147 ) on Tuesday March 14, 2017 @08:27AM (#54035907)

      My experience with industrial accidents is that it's almost certainly human error. I've seen someone deliberately disable the safety systems because they were inconvenient, then get mutilated doing something stupid the safeties would have prevented them from doing.

      Personally, I've operated machinery on manual override when it should have been on automatic, the machine blaring warnings at me the whole time which just didn't register because I heard them so often at work. Luckily, the passive safety systems (the big steel protective cage I was in) kept me from harm.

      With robots, failures are more likely to stop the system than to start it up. To accidentally start something when it shouldn't be started usually takes human interference.
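      The fail-safe convention described above can be sketched in code. This is a hypothetical illustration, not any real controller's API: the permissive signals are modeled as normally-closed circuits, so any failure (broken wire, tripped e-stop, crashed watchdog) reads as "not safe" and stops motion rather than starting it.

```python
def motion_permitted(gate_circuit_closed: bool,
                     estop_circuit_closed: bool,
                     watchdog_ok: bool) -> bool:
    """Fail-safe permissive: every input must actively assert 'safe'.

    Any single failure opens a circuit (reads False), so the default
    outcome of a fault is 'stop', never 'start'.
    """
    return gate_circuit_closed and estop_circuit_closed and watchdog_ok

# All signals healthy: motion allowed.
assert motion_permitted(True, True, True) is True
# A single broken wire (circuit opens) halts the machine:
assert motion_permitted(False, True, True) is False
# A crashed watchdog also halts it:
assert motion_permitted(True, True, False) is False
```

      Accidentally *starting* such a system requires actively forcing all permissives true at once, which is why unexpected start-up usually points to human interference.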

      • by thegarbz ( 1787294 ) on Tuesday March 14, 2017 @12:29PM (#54037931)

        My experience with industrial accidents is that it's almost certainly human error.

        My experience with human error is that the only way to avoid future human error is to find the underlying cause which permitted that error.

        I've seen someone deliberately disable the safety systems because they were inconvenient, then get mutilated doing something stupid the safeties would have prevented them from doing.

        Sheer stupidity exists. The only way to eliminate it is by removing the stupid people. I remember a palletising machine at a biscuit factory where one employee asked the other to lock him into the cage and start the machine, because a faulty sensor was misplacing every 8th box. When a manager walked in, he hit the e-stop. Shortly after, security walked in and escorted both the person in the cage and the person who locked him in off site.

        When they said "no one told me," management made a huge example of them and made sure that everyone at the plant knew the story of the two idiots who got themselves instantly fired.

        It is one area that is really difficult to manage.

    • by Nidi62 ( 1525137 ) on Tuesday March 14, 2017 @08:39AM (#54036011)
      Looks like the factory has both a history of accidents (2 previous deaths) and owner/name changes. That could indicate a culture of disregard for safety. At the same time, however, if the robots routinely move from section to section in the normal course of operation and (one would assume) the whole line is shut down while she is working on the one section, then it seems to me that it wasn't properly locked out. If you have to stop an assembly line to work on one part of it, you should probably be locking out every portion of that line.
      • by fahrbot-bot ( 874524 ) on Tuesday March 14, 2017 @09:34AM (#54036389)

        Looks like the factory has both a history of accidents (2 previous deaths) and owner/name changes. That could indicate a culture of disregard for safety.

        Perhaps even welcome. From The President Changed. So Has Small Businesses’ Confidence [nytimes.com]

        The owner of an automotive parts assembler [electroprime.com] gave thanks that he would not be receiving visits from pesky environmental and workplace overseers.

        The president [kellerlogistics.com] of a trucking company spoke of a “tremendous dark cloud” lifting when he realized he would no longer be feeling the burden of rules and regulations imposed by the Obama administration.

        “My gut just feels better,” said Bob Fleisher, president of a local car dealership. [franklinparklincoln.com] “With Obama, you felt it was personal — like he just didn’t want you to make money. Now we have a guy who is cutting regulations and taxes.”

        Thankfully, those pesky environmental and safety rules and regulations protecting people and getting in the way of profits over disposable employees will be going away...

    • Re: (Score:3, Insightful)

      The other thing to note is that a good solution to the robot safety problem is to simply add more automation. Mixing humans and automation requires huge effort and cost at the interface. Even if it costs a lot more than the marginal cost of labour to eliminate the last pieces of manual work on a production line, the potential savings across the wider system could make it worth doing. This is likely to mean that even those prepared to work well below minimum wage will not be able to get jobs that can be automated.

      • She was in there to repair the robots. We haven't yet developed robots that can diagnose and repair problems in other robots. That work will likely remain with humans until AI is sufficient to accomplish these tasks, and I don't see that happening anytime soon.

      • What about repairs, and the push to keep the line moving as much as it can while someone is working to repair the automation?

        Abolish health and safety laws? And when the rich boss says "do some unsafe thing or you're gone," some may just say "fuck it" and beat the shit out of that boss, just so they can get free room, board, and a doctor in prison!

  • Orders (Score:5, Funny)

    by ardmhacha ( 192482 ) on Tuesday March 14, 2017 @08:19AM (#54035869)

    The robot was just following orders.

  • Terrible (Score:5, Informative)

    by argStyopa ( 232550 ) on Tuesday March 14, 2017 @08:25AM (#54035893) Journal

    It sounds absolutely terrible, but one of the primary things you learn when doing heavy machinery maintenance is lockout/tagout, which renders all related machinery completely inoperable while servicing. It doesn't seem that this was done here.

    To be clear, if the company's maintenance policies prevented her from properly locking out what she was working on, then they certainly do have a suit.

    • by mhkohne ( 3854 )

      Honestly, the way the article is worded, it sounds like the 'safety doors' were supposed to lock out the other robots, rather than say a breaker being flipped. I'd love to know how those doors are supposed to work, I'd also love to know whether what she was doing was supposed to be done with the robots powered or not (not everything can be done with them powered down).

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        The issue has very little to do with the lockout in the robot cell (well... see below).

        The problem is that you have two adjacent cells that can interact. If Cell A is in lockout mode then the robot from Cell B should not be allowed to reach into Cell A, but it may well still be allowed to work in its own cell.

        So the possible faults here are:
        1. The user failed to use the correct lockout (climbing over the door, or having another human help reset the door; it's supposed to be impossible from inside the cell)
        2. The lockout failed
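        The interlock rule described above can be sketched in code. This is a purely hypothetical illustration (the `Cell`/`Robot` names are invented, not any real robot controller's API): a robot may keep working in its own cell, but any motion targeting a locked-out cell must be refused.

```python
class Cell:
    """A work cell that a maintenance worker can lock out."""
    def __init__(self, name: str):
        self.name = name
        self.locked_out = False  # True once a worker has applied a lock


class Robot:
    """A robot arm assigned to a home cell but able to reach neighbours."""
    def __init__(self, home_cell: Cell):
        self.home_cell = home_cell

    def may_execute(self, motion_target: Cell) -> bool:
        # A motion into any locked-out cell is always refused,
        # even though the robot's own cell may still be live.
        return not motion_target.locked_out


cell_a = Cell("A")
cell_b = Cell("B")
robot_b = Robot(home_cell=cell_b)

cell_a.locked_out = True                      # worker applies lockout to Cell A
assert robot_b.may_execute(cell_b) is True    # own cell: still permitted
assert robot_b.may_execute(cell_a) is False   # reach into Cell A: refused
```

        In a real cell this check is implemented in certified safety hardware (safety PLCs and hardwired interlocks), not application code; the lawsuit's allegation is essentially that this check failed or was absent.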

    • Re:Terrible (Score:5, Insightful)

      by wagnerrp ( 1305589 ) on Tuesday March 14, 2017 @08:42AM (#54036041)
      The machine she was working on was not the one that crushed her. This sounds more like bad industrial design, that allowed one robotic arm to reach into another work area. Either they should have been separated further, or they should have been energized through overlapping lockouts.
    • Re:Terrible (Score:5, Insightful)

      by gweihir ( 88907 ) on Tuesday March 14, 2017 @08:49AM (#54036089)

      Indeed, and that is a completely standard approach. If the safety equipment was faulty, not present, or there was pressure to not use it, then there is a very good case. If the equipment was there, working, but not used, then there is no case at all. Machinery is always dangerous and you must never bypass safety procedures, or suffer the consequences.

      Many people are stupid though. Refer, for example, to all the photos on the Internet where you see somebody operate a circular bench saw without the protection bar. Just stumble once while the thing is running....

      If the safety equipment was faulty, not present, or there was pressure to not use it

        Well, the workplace can say "someone did not lock out the full area, so we are not at fault," even if it is not well known that to kill power you need to do A + B + C to fully turn off work zones 130 and 140.

        • by gweihir ( 88907 )

          Indeed. Safety equipment that is difficult to use (except when there is no other possibility, e.g. for a "moon suit") is faulty by definition. When there is no alternative to complicated safety equipment, it has to be assured that everybody is trained and capable of using it correctly, and this has to be tested regularly through training exercises.

        • by green1 ( 322787 )

          The laws on worker safety in most jurisdictions are pretty well thought out in that regard. The only way the employer would avoid responsibility is if they can prove that the worker was properly trained in the proper procedures, and that the corporate culture encouraged them to use them.
          If the employer can't prove the worker was properly trained, or is known not to enforce their safety procedures, then the employer will be held responsible.

          All that said, it seems like the lawsuit is trying to blame the manufacturers.

      • Refer, for example, to all the photos on the Internet where you see somebody operate a circular bench saw without the protection bar. Just stumble once while the thing is running...

        Frankly, every circular bench saw I've ever seen has had a protection shield that probably wouldn't protect you if you used it on the ground and then fell on it. I converted my bench saw into a table saw, so it would be a lot harder to fall on. Which is good, because it doesn't have a guard either.

  • by Archtech ( 159117 ) on Tuesday March 14, 2017 @08:35AM (#54035973)

    The term "human colleague" immediately reveals that the writer has no idea of what a "robot" is. The most important thing always to keep in mind is that a "robot" is a machine - or, more likely nowadays, a collection of machines. It is a tool, even if that tool is capable of a limited set of autonomous actions. The accidental death described in TFA is a perfect illustration of this vital principle. Maybe there should be signs ten feet tall prominently displayed on all walls in workplaces that use robots: "A ROBOT IS *NOT* A 'COLLEAGUE'!"

    Mind you, this confusion has been inherent since the word was first coined. "The word 'robot' was first used to denote a fictional humanoid in a 1920 play R.U.R. by the Czech writer Karel Capek, but it was Karel's brother Josef Capek who was the word's true inventor". [Wikipedia] The word is derived from the Slavic language root meaning "work" or "worker", and strongly suggests that a robot is to some extent interchangeable with human workers. Of course, that is absolutely not the case.

    Isaac Asimov confronted these issues head-on when he began writing science fiction stories about robots. His "Three Laws of Robotics", which essentially forbid any robot to harm a human being, are treated as indispensable in his stories. But Asimov blandly ignored the obvious fact that there is no known way to implement such laws, which incorporate high-level abstract notions and moral principles. Until robots become at least as intelligent and complex as human nervous systems, such commands cannot be implemented. And if they ever do, we will immediately face even more tremendous problems.

    • Humans anthropomorphize machines all the time. Maybe we should blame Hanna-Barbera for making "The Jetsons".
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      But Asimov blandly ignored the obvious fact that there is no known way to implement such laws, which incorporate high-level abstract notions and moral principles. Until robots become at least as intelligent and complex as human nervous systems, such commands cannot be implemented. And if they ever do, we will immediately face even more tremendous problems.

      You haven't actually read Asimov's Robot stories, have you? If you had, you'd know that the implementation of the three laws of robotics was entirely irrelevant to the points he was trying to make.

      • You haven't actually read Asimov's Robot stories, have you? If you had, you'd know that the implementation of the three laws of robotics was entirely irrelevant to the points he was trying to make.

        I had read all of them - some several times - by 1965. When were you born?

    • Agreed that a Robot is no more a colleague than a screwdriver.

      I think you're wrong about Asimov, though. It's obvious that to write about theoretical concerns of future technology, the author must proceed without knowing how to actually implement the technology, but may be able to say that it's theoretically possible. There is no shortage of good, predictive science fiction written when we had no idea how to achieve the technology portrayed. For example, Clarke's orbital satellites were steam-powered. Steam is indeed an efficient way to harness solar power if you have a good way to radiate the waste heat, but we ended up using photovoltaic. But Clarke was on solid ground regarding the theoretical possibility of such things.

    • But Asimov blandly ignored the obvious fact that there is no known way to implement such laws, which incorporate high-level abstract notions and moral principles.

      He didn't ignore that, he simply assumed that it would become possible in the future -- and extensively explored the ways in which it could still go wrong, showing that even with that nonexistent technology, the seemingly foolproof Laws of Robotics were anything but foolproof.

  • "The cells are separated by safety doors and the robot should not have been able to move. But it somehow reached Holbrook, and was intent on loading a trailer-hitch assembly part right where she stood over a similar part in another cell".

    From a design/programming point of view, the key words are: "...the robot should not have been able to move. But it somehow reached Holbrook..."

    "Should not". Hmmmmmmmmmmmmmmmmm.

    Sounds as though someone made a mistake designing the system. Which is easily done. Restoring a d

    • Self-driving cars need avionics-level code reviews and QA.

      If not, just wait for one to take poor map data and crash.

    • by green1 ( 322787 )

      The problem is that some idiot relied on "should" when they should instead have been looking at "can"

      When designing safety procedures you NEVER look at what a system should do in a given set of circumstances, you always look at what the system is capable of doing under any circumstances, and act accordingly.

      If that robot sometimes goes in to that cell, but shouldn't decide to right now, you don't take that as good enough, you think about what you need to do to stop it from doing so no matter what. So that c
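      The "can vs. should" principle above can be sketched as a simple reachability check. This is an illustrative toy (the reach figures and cell names are invented, not from the case): the safety review enumerates every cell the arm *can* physically reach while energized, regardless of what the program *should* command, and all of them must be locked out together.

```python
# Assumed arm reach envelope, in metres (hypothetical figure).
ROBOT_REACH_M = 2.5

# Cell boundaries as (name, distance of nearest point from robot base, metres).
cells = [("130", 0.0), ("140", 1.8), ("150", 4.0)]

def physically_reachable(reach_m, cells):
    """Return every cell whose nearest point lies inside the reach envelope.

    Each of these cells must be treated as hazardous whenever the robot is
    energized, no matter what the program 'should' do right now.
    """
    return [name for name, dist in cells if dist <= reach_m]

hazard_zone = physically_reachable(ROBOT_REACH_M, cells)
assert hazard_zone == ["130", "140"]  # both cells must be locked out together
```

      Designing the lockout procedure from the *capability* list rather than the program logic is exactly the distinction the comment draws.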

  • by Eloking ( 877834 ) on Tuesday March 14, 2017 @08:42AM (#54036039)

    "Robot" engineer here. And when I say "robot", I really mean "industrial robot", not the kind that looks like a human.

    It's 2015 all over again, when another "robot" killed a Volkswagen worker and people were all "the Matrix has begun".

    First, let me tell you the scary part: "The robot did exactly what it was programmed to do."

    Second, let me tell you the encouraging part: "The robot did exactly what it was programmed to do."

    It's always the same thing: an "industrial robot" kills or hurts someone, and we see a headline about a robot revolution coming to kill us all, Terminator-style. Those robots are just basic programs controlling a bunch of servomotors, nothing like an "AI rogue humanoid robot with a shotgun". But there's one thing common to every one of these stories: a safety violation.

    In my mind, industrial robots are still the most dangerous pieces of hardware you'll ever work with, period. And that's why there's a shit ton of safety measures for them. Yeah, gears are dangerous and could tear off your finger, but you instinctively know that as long as you don't put your finger close to them, they won't bite you. That's not the case with robots.

    Back to the Volkswagen case: the worker didn't respect the safety procedure. The robots are connected to a safety gate that "must" be open when there's a worker inside the cell. You enter the cell, you put your lock on the gate to deactivate everything dangerous inside it. But from what I understand, those workers wanted to work fast, took a "shortcut" while testing their equipment, and decided to close the gate while a worker was inside. One of the systems then activated the robot, which started its welding procedure with the worker right between both parts: https://www.youtube.com/watch?... [youtube.com] (look between 0:05 and 0:30; everything else in this video is shit).

    I work constantly with this sort of system, and you'd be amazed how many close calls I've seen so far. The thing is, people are completely clueless about robots. (Hell, one time I was presenting a robotic cell with two KUKA robotic arms to some potential customers, and one of them asked me if she could "see" the body of the robot. She thought there was a huge robot under the floor controlling the two arms.)

    Long story short: respect the freaking safety procedures.

    • That video is HILARIOUS!
    • The story makes me wonder a little if there was a mistake in the safety lockouts or the general software, given that the machine that killed her was never supposed to work in the area she was in. However, I agree most problems come from not following safety procedures. Saying someone has worked safely on certain machines for years says nothing about how safely they work with them. Someone can cross a busy road outside the crosswalk many times before getting run over.

      • by green1 ( 322787 )

        If the machine is NEVER supposed to work in that area, then it shouldn't physically be able to get there.
        If the machine wasn't supposed to work in that area AT THAT TIME, then it should have either been powered down, or physically blocked from getting there.

        This is safety 101.

    • by RobinH ( 124750 )
      Yes, there are a number of things that could have gone wrong here, but it's not evident that she bypassed or ignored any safety protocol. There's a chance that the system integrator designed and implemented it wrong, and that the safety inspection missed it, though since it's Michigan I'd suspect it's a case of nobody doing a safety inspection (government has to cut regulation to be friendly to businesses after all). So sure, she might have gone in and had someone close the door behind her and reset it, o
    • by Kjella ( 173770 )

      In my mind, industrial robots are still the most dangerous pieces of hardware you'll ever work with, period. And that's why there's a shit ton of safety measures for them. Yeah, gears are dangerous and could tear off your finger, but you instinctively know that as long as you don't put your finger close to them, they won't bite you. That's not the case with robots. Back to the Volkswagen case: the worker didn't respect the safety procedure. The robots are connected to a safety gate that "must" be open when there's a worker inside the cell. You enter the cell, you put your lock on the gate to deactivate everything dangerous inside it. But from what I understand, those workers wanted to work fast, took a "shortcut" while testing their equipment, and decided to close the gate while a worker was inside.

      Somehow I don't find robots in a safety cage that aren't supposed to be turned on with humans inside particularly scary. Getting killed by that is like being killed by a car sliding off the jack and crushing you; if you'd just bothered to secure it properly before you crawled under, it'd be completely harmless. I bet more people have died from GPS giving faulty directions than from industrial robots, much less from faulty brakes, defective medical equipment, and such things that could quite easily kill people by simply not

  • At least if you remove the pervasive stupidity of today's press reporting. There is nothing special here. A piece of machinery was programmed wrongly, and there was no independent safety-equipment to stop it or it was not used. This is essentially not different from other machinery-related deaths at all.

  • by jfdavis668 ( 1414919 ) on Tuesday March 14, 2017 @08:59AM (#54036153)
    Someone set this thing to "evil".
  • It seems we have our first Runaway [wikipedia.org]. Better check to make sure its circuits haven't been modified.

  • 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

  • by wardrich86 ( 4092007 ) on Tuesday March 14, 2017 @11:45AM (#54037475)
    "The machine loaded the hardware onto Holbrook's head."

    I'm not sure what literary device is being used here, but that word choice, and the entire blurb leading up to it, makes it sound like the robot became sentient and was on a mission. The last line (the bit I quoted) was equal parts gruesome, horrific, and fantastic. I would read a book by whoever wrote that bit.
