AI Government The Military

Debating a Ban On Autonomous Weapons (thebulletin.org) 228

Lasrick writes: A pretty informative debate on banning autonomous weapons has just closed at the Bulletin of the Atomic Scientists. The debate looks at an open letter, published in July 2015, in which researchers in artificial intelligence and robotics (endorsed by high-profile individuals such as Stephen Hawking) called for 'a ban on offensive autonomous weapons beyond meaningful human control.' The letter echoes arguments made since 2013 by the Campaign to Stop Killer Robots, which views autonomous weapons as 'a fundamental challenge to the protection of civilians and to international human rights and humanitarian law.'

But support for a ban is not unanimous. Some researchers argue that autonomous weapons would commit fewer battlefield atrocities than human beings, and that their development might even be considered morally imperative. The authors in this debate focus on two questions: Would deployed autonomous weapons promote or detract from civilian safety, and is an outright ban the proper response to the development of autonomous weapons?


Comments:
  • Turing Evolved

    by GrpA

    Covered this topic exceptionally well, including at what point an autonomous weapon could be trusted...

    And the answer is when it becomes human.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      When it becomes human, it can't be trusted anymore.

    • Re:Turing Evolved (Score:4, Insightful)

      by ShanghaiBill ( 739463 ) on Friday February 12, 2016 @01:51AM (#51492767)

      And the answer is when it becomes human.

      Humans killed 400+ civilians at My Lai, and 200+ civilians at No Gun Ri. Both massacres were the result of rage and fear. Robots don't feel those emotions, and have committed no massacres on that scale. I trust robots more than I trust humans.

      Restrictions on nuclear warheads, ships, etc. make sense because they can be verified. Restrictions on software have no means of verification, so any ban on autonomous robots is wishful thinking.

      • Re: Turing Evolved (Score:4, Insightful)

        by ferret4 ( 459105 ) on Friday February 12, 2016 @04:15AM (#51493055)
        Robots also do not feel mercy, and they never question orders - no matter how deranged.
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          The majority of humans will follow orders no matter how deranged, even when they know it's wrong.

          Seeing as we have pretty good experimental proof that emotion will not override orders most of the time, the benefit of positive emotions like empathy is functionally nil. However, negative emotional states WILL significantly alter a person's actions.

          "I won't do that sir, it's wrong" has saved very few people. "The enemy isn't human and should be exterminated" has gotten entire races exterminated. Given the op

        • Robots also do not feel mercy, and they never question orders - no matter how deranged.

          You know that we have a name for this in humans. We call them psychopaths.

          • by tnk1 ( 899206 )

            Well, psychopaths have their own motivations. Those motivations may include following a deranged order.

            Or they may simply be motivated to kill their officers in their sleep.

            No, most of the order followers aren't psychopaths, they're just detached emotionally from what they are doing. They have created a justification for it, and since no one is calling them on their BS, and worse, everyone else is reinforcing that justification, they feel like they are not responsible. They're quite sane, however.

          • In certain circumstances, the right psychopath is exactly who you want.

      • Re:Turing Evolved (Score:5, Insightful)

        by N1AK ( 864906 ) on Friday February 12, 2016 @04:49AM (#51493127) Homepage

        Robots don't feel those emotions, and have committed no massacres on that scale. I trust robots more than I trust humans.

        Pretty shit logical reasoning given that robots have never been in a position where they are capable of committing a massacre on that scale. Kim Jong Un has never ordered the use of nuclear weapons so maybe we should give him control of US nukes.

        The biggest issue with robot soldiers isn't that they'll make "evil decisions"; it's that when a country can engage in warfare without risking its civilians' lives, it's likely to get involved in more conflicts. That isn't to say that there are no valid concerns about how robots will be programmed, and ordered, to behave in the field as well.

        • by tnk1 ( 899206 )

          More to the point, it is GIGO. If we program the robots to follow our Doomsday order, they *will* execute it. A human, even a hard ass, might reconsider at the brink of nuclear war. The robots and computers would not hesitate for a millisecond in executing the bombastic, genocidal orders they were programmed with.

          The problem with humans is not that we're savages, but that we'll talk like savages, and when someone else reacts to that or our own people act on it, we realize that savagery isn't really what

        • by sudon't ( 580652 )

          Pretty shit logical reasoning given that robots have never been in a position where they are capable of committing a massacre on that scale.

          Not surprising when you consider that robots (of the type people are imagining) only exist in science fiction. I think some people get a little too caught up in the movies they watch.

          • We're not talking about humanoids, but autonomous weapons. There are naval mines that will sit there until they detect an enemy submarine, then fire a torpedo at it.

      • Restrictions on nuclear warheads, ships, etc. make sense because they can be verified. Restrictions on software have no means of verification, so any ban on autonomous robots is wishful thinking.

        Justifiable restrictions make sense, because putting rules into law means that those tasked with enforcing them are then allowed to take action - it clears away a hurdle. Bans are often like that - their intent is reactive, not proactive. It may not be possible to stop malicious software and hardware being developed or deployed, but at some point there may well be a reckoning, and then it becomes possible to determine whether a banned technology has been used, and the penalty can be adjusted accordingly.


        • Justifiable restrictions make sense, because putting rules into law means that those tasked with enforcing them are then allowed to take action - it clears away a hurdle.

          Such laws also give corrupt institutions and governments a justification for bullying and oppressing others. And when it comes to international treaties, they make citizens of sovereign nations subject to the whims of international institutions that they have no democratic control over.

          The cost of enacting laws is a high one, much higher

        • The other important point to make is that when nations sign up to a treaty that bans something, then they will be very reluctant to ignore the ban (openly, at least), and of course, if they don't sign up to it, that tells us something as well. These things can have significant repercussions for the reputation of those countries.

          So if the U.S. signed a treaty that banned all guns from citizens, it wouldn't be enforceable, as the U.S. Constitution's 2nd Amendment would trump the treaty. Many other countries may not have that issue, but a treaty can only go so far, and a country can only go so far before it risks revolt by its citizens and exits the treaty anyway.

      • Robots don't feel those emotions, and have committed no massacres on that scale. I trust robots more than I trust humans.

        Do you trust a gun? Do you trust a bomb? Of course not, because the concept is meaningless: neither will cause harm without instructions from a human. Both can magnify the amount of harm that a human can do. Autonomous weapons, of which landmines are the simplest possible case, expand both the amount of harm a person can do and the time over which they can do it.

        During the cold war, there were at least two incidents where humans refused to follow legitimate orders to launch nuclear weapons - in eit

        • You haven't really made a compelling case one way or another.

          A sufficiently intelligent AI system would reject a nebulously valid nuclear launch order. But you've also given exact examples of sufficiently intelligent humans still doing terrible things.

      • by EvilSS ( 557649 )

        I trust robots more than I trust humans.

        Humans are the ones who will give those robots their rules of engagement. Humans will also be the ones to create the AI software that interprets those rules. How much do you trust those humans?

        You mention My Lai. How much easier would it be for a massacre like that to be covered up if no human soldiers were involved? Who would step in to stop it and report it? (It was a human helicopter crew who stepped in, saved a number of civilians, and reported the atrocity.) Don't try to tell me a robot wouldn't do th

      • by Nemyst ( 1383049 )
        The thing you're forgetting is that at some point in the chain, there's a human calling the shots, and a robot not having emotions goes both ways: yes, it definitely means that they will keep their cool in all situations and never kill people unexpectedly (assuming no bugs of that magnitude), but on the other hand it also means they will never refuse orders humans would consider reprehensible. The general orders all killbots to gas the entire populace? They'll do it.
      • by sudon't ( 580652 )

        Humans killed 400+ civilians at My Lai, and 200+ civilians at No Gun Ri. Both massacres were the result of rage and fear. Robots don't feel those emotions, and have committed no massacres on that scale. I trust robots more than I trust humans.

        A lot of people put their trust in nonexistent beings.

        As for me, I want a bunch of those Sentry Guns [youtube.com] on my lawn.

    • This has been thoroughly covered by great thinkers of the past: Doomsday Device [youtube.com]

  • War is all about winning at virtually all cost. In today's world that will increasingly mean autonomous weaponry. Aircraft in particular have surpassed the ability of a human pilot to control them. The biggest limiting factor in warplane development is the pilot.

    • Self Defense (Score:3, Interesting)

      Arguably while war is all about winning, it's not at "all cost". We (ignoring GOP presidential candidates) will not bomb an urban civilian center these days in order to kill a few hostile bad actors. Arguably the availability of precision munitions changed the moral balance from "bomb a city into submission" into the modern sensibility of "only bomb specific targets of military importance."

      Even though war is a terrible and bloody affair, we as societies have constantly been moving towards more humane and less deadly conflict.

      • Re:Self Defense (Score:5, Insightful)

        by FlyHelicopters ( 1540845 ) on Friday February 12, 2016 @04:42AM (#51493119)

        Arguably while war is all about winning, it's not at "all cost".

        It isn't, because we've been winning for a long time... that changes when you're losing... Look at the Japanese in 1944-45, with the Divine Wind, and tell me they weren't willing to pay any price.

        It took nuclear weapons to get them to see reason.

        Even though war is a terrible and bloody affair, we as societies have constantly been moving towards more humane and less deadly conflict.

        From your safe place behind a safe computer, you can say that.

        Go ask the people in Syria if they feel like they are taking part in a "more humane and less deadly conflict".

        You're kidding yourself if you think that is war. War is hell, and you don't win by "only kinda sorta almost fighting..." You win by so completely crushing your enemy that he puts up the white flag and says "you win, sorry for all that, let us know what the new rules are"

        Anything less and it never really ends.

        • by jafiwam ( 310805 )

          No matter your feels on the subject, the GP is factually correct.

          Nuclear weapons being used in war marked the turning point where deaths from conflict fell considerably (and continue to fall).

          If you think otherwise, you are ignorant of history and getting all spun up in emotion like a child.

            The timeframe involved since nuclear weapons isn't that long, and frankly I suspect you need to do some math; you might find your visions of peace and deathless conflict are colored by a lack of death in the West, not by total deaths.

            How many deaths in the Korean War? Vietnam? Iraq? OK, put aside wars the US fought; how about the India/Pakistan wars, Israel/everyone else, USSR/Afghanistan?

            While it is true that we haven't had a world war since WWII, that doesn't mean that we haven't had tens of millions of deaths

        • "It's good that war is so terrible, otherwise we should grow too fond of it" - Robert E Lee

          If war were like Starcraft, I would join the military.
        • From your safe place behind a safe computer, you can say that.

          Go ask the people in Syria if they feel like they are taking part in a "more humane and less deadly conflict".

          That's like saying "Global Warming isn't true because it snowed today!" The Syrian conflict is a horrific, awful anomaly. Even then, it's pretty restrained by historical standards and more than offset by the otherwise tranquil world scene at present. It should also be noted that much of what Assad is doing doesn't involve guided munitions, because he just flat out doesn't have them. And his opposition has no air force at all.

          Compare that to 100 years ago and a similar conflict probably would have resulted

          • https://en.wikipedia.org/wiki/... [wikipedia.org]

            The past 30 years honestly doesn't look all that peaceful to me.

            Frankly, if you think that major war is gone, well, I invite you to travel back to 1912, when many people said the same thing. Europe was ablaze in prosperity and growth, there had been no war for 2 generations, and the leaders of all the major nations were related and tied together in various ways.

            What could go wrong?

      • by dargaud ( 518470 )

        Arguably the availability of precision munitions changed the moral balance from "bomb a city into submission" into the modern sensibility of "only bomb specific targets of military importance."

        And I'm beginning to seriously wonder if that works. Drop a bomb on a building containing a few 'terrorists' or similar. OK, they're dead. But their sons and nephews and cousins in the house next door are now pissed off and want vengeance, the neighbors are pissed off and scared and want random bombs to stop falling, etc. So you end up with a lot more people pissed off at you than you started with.

        I mean, just look at Iraq and Afghanistan: are they at peace since precision missiles started dropping on the

      • by swb ( 14022 )

        I would argue that many of the most successful military campaigns have involved total warfare, which includes targeting civilian populations. The strategic argument is that it demoralizes your opponent, disrupts his economic system and supply of materiel, and damages his political power and control.

        The trend towards less aggressive use of total warfare seems to be mostly a byproduct of media exposure of the military theater to non-combatants, resulting in negative public opinion and diplomatic pressure.


        • by amiga3D ( 567632 )

          We lack the will to do the necessary evil things that must be done to win. If William Tecumseh Sherman were fighting in Afghanistan, the Taliban would be done. He crushed the core Southern states in the Civil War by waging war to the bone. He looted, raped and burned a path through Georgia that is still remembered over 150 years later. That level of destruction or more is what is required to subjugate a population. I remember a tale about a Roman officer back in antiquity that had a problem with supply convoys

          • by swb ( 14022 )

            From what I remember reading, Sherman's March to the Sea had general orders to destroy Southern economic output but not to wantonly harm the civilian population or things necessary to keep them alive, although even if true, it's an open question what level of discipline was maintained over the campaign at the unit level.

            The Romans largely set the gold standard for total warfare, often annihilating their opponents' armies completely, burning their cities to the ground, looting everything of value and enslaving

  • by penguinoid ( 724646 ) on Friday February 12, 2016 @01:16AM (#51492675) Homepage Journal

    When autonomous weapons are outlawed, only outlaws will have unstoppable armies of soulless killing machines.

    • by CBravo ( 35450 )
      If everyone has autonomous weapons, every insane person will have one.
    • by AmiMoJo ( 196126 )

      When nuclear weapons are outlawed, only outlaws will have nuclear weapons.

      It seems to have worked out okay, because it turns out building such things and delivering them to targets is rather difficult and mostly controllable. Plus, most countries don't even want them anyway, and signed an agreement not to develop them.

      • The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. ... And as you go forth today remember always your duty is clear: To build and maintain those robots.

  • by fragMasterFlash ( 989911 ) on Friday February 12, 2016 @01:17AM (#51492679)
    -OR-

    how /. forgot about the nasty downside of an autonomous doomsday device.
  • by Krishnoid ( 984597 ) on Friday February 12, 2016 @01:28AM (#51492709) Journal

    But support for a ban is not unanimous. Some researchers argue that autonomous weapons would commit fewer battlefield atrocities than human beings—and that their development might even be considered morally imperative.

    In particular, Dr. Miles Dyson and his associates, Drs. Skyler Natalya and Keel Lbot, Ph.D.

  • Personally, I'd rather see robots kill each other as opposed to humans slaying each other. Not only does it make sense for the militaries involved (robots can't desert, they will never be afraid of gunfire, and you don't need to tell their family that they died in the war), but the civilians would prefer it too (you don't have to risk your life being shot, you won't have to abandon your home, etc.). Autonomous machines are an advantage for everyone involved, and would be a much more humane way to solve wars.

    • by AHuxley ( 892839 )
      Re "Autonomous machines are an advantage for everyone involved, and would be a much more humane way to solve wars."
      The US plan will be for area denial, the free-fire zones https://en.wikipedia.org/wiki/... [wikipedia.org] dreams of the US in Vietnam or the British in South Africa during the Boer Wars.
      Anything that moves and an AI has a pattern for in its database will be engaged within a large area without hesitation.
      It's a very old idea, and the US mil still seems to think air power or area denial alone will magically win
    • Personally, I'd rather see robots kill each other as opposed to humans slaying each other. Not only does it make sense for the militaries involved (robots can't desert, they will never be afraid of gunfire, and you don't need to tell their family that they died in the war), but the civilians would prefer it too (you don't have to risk your life being shot, you won't have to abandon your home, etc.). Autonomous machines are an advantage for everyone involved, and would be a much more humane way to solve wars.

      Yep, I'd advise a re-screening of RoboCop 2014 for you...

  • by Kjella ( 173770 ) on Friday February 12, 2016 @01:53AM (#51492773) Homepage

    What's the difference between a search-and-rescue bot and a kill bot? The function is going to be pretty much identical right up to the point the target is located; just duct-tape a gun to point in the same direction as the camera and wire the "person located" signal to pull the trigger. It's one thing to ban ABC weapons because they're very specific technologies, but this is way too generic to work. And it's not like the military is going to avoid developing it for intelligence gathering and decision support systems; even if you keep a human in the loop, it's literally going to be one flip of a switch to full automatic, where the computer implements its own recommendations.

    The primary reason to keep soldiers in the loop today is because you're trying to fight a "good war" and avoid antagonizing the civilians, so you want manual confirmation of each target. If you take the gloves off and say "if you're found outside after curfew we'll shoot to kill" and live with the collateral, you could automate much more. And don't get up on the high horse: when the US nuked Hiroshima and Nagasaki, they knew there'd be about 100-200k civilian casualties. In a real war nobody's going to give a fuck if the robots are just 99% or 95% right; if it can save our troops and civilians and end the war, for sure we're going to let them fight for us.
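
    A minimal sketch of that "flip of the switch" in Python (all class and function names here are invented for illustration, not any real system): the detection pipeline is identical either way, and full autonomy is one boolean away.

      # Hypothetical sketch: stub classes stand in for real hardware.
      class Sensor:
          def frames(self):
              yield "frame with person"              # pretend imagery

          def detect_person(self, frame):
              # The same code a search-and-rescue bot would run.
              return "person@(12,34)" if "person" in frame else None

      class Effector:
          def engage(self, target):
              print(f"engaging {target}")            # rescue bot: mark for pickup instead

      def ask_operator(target):
          # Human-in-the-loop confirmation (a console prompt here).
          return input(f"Engage {target}? [y/N] ").strip().lower() == "y"

      def run(sensor, effector, human_in_loop=True):
          for frame in sensor.frames():
              target = sensor.detect_person(frame)
              # The "flip of the switch": pass human_in_loop=False and the
              # same pipeline acts on its own recommendation.
              if target and (not human_in_loop or ask_operator(target)):
                  effector.engage(target)

      run(Sensor(), Effector(), human_in_loop=True)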

  • Uh huh... (Score:5, Interesting)

    by Pseudonym ( 62607 ) on Friday February 12, 2016 @02:28AM (#51492847)

    Some researchers argue that autonomous weapons would commit fewer battlefield atrocities than human beings [...]

    No, it's just that if an autonomous weapon does it, it would be more difficult to call it an "atrocity". If a dozen villagers are killed because of a minefield that some idiot decided should go near where they live, the only reason you can't call that a "massacre" is that there was no human making the targeting decision.

    In the 1920s, there were some who argued that aerial bombing would be more humane because they could be far more precise than field artillery, hitting only the target that you want to hit. Look how well that worked out.

    • In the 1920s, there were some who argued that aerial bombing would be more humane because they could be far more precise than field artillery, hitting only the target that you want to hit.

      Wow, those people didn't understand the state-of-the-art air force technology of the time... During WW2, Germans would say, "If you want to be safe from bombers, find the thing they are trying to bomb and stand on it."

      • According to one study [wikipedia.org], only 5% of British bombers setting out bombed within 5 miles of their intended target. Yeah, it was probably about as safe standing right on the target as anywhere else.

        I'd imagine the US Army Air Corps had somewhat higher accuracy rates, bombing during daylight and using the Norden bombsight, although I'd guess it was still rather dismal by modern standards.

        • That depends on the period. Early British strategic bombing couldn't reliably find a city, a raid on a particular city being reported sometimes as scattered bombing in western Germany. They improved a lot over the course of the war.

          US strategic bombers, bombing identifiable targets in the daytime with good visibility, could be extremely accurate. They very often didn't get good visibility, weather over Europe being what it was. They exaggerated the precision achieved for morale purposes.

    • Re:Uh huh... (Score:5, Insightful)

      by AmiMoJo ( 196126 ) on Friday February 12, 2016 @05:57AM (#51493275) Homepage Journal

      It's worse than that: autonomous weapons seem to encourage bad behaviour. Look at the record of drones in Afghanistan and Pakistan. It seems that when you can kill people from thousands of miles away by pressing a button and having a robot do the dirty work, a lot of innocent people get killed. Operators actually call the victims "bug splat", like something you get on your windshield.

      The more remote you get from the act of killing, the easier it becomes to commit atrocities.

      • Re: (Score:2, Insightful)

        by BitZtream ( 692029 )

        Look at the record of drones in Afghanistan and Pakistan.

        You mean the ones that performed more strategically and killed fewer collateral targets than any wars before them ... EVER?

        Operators actually call the victims "bug splat"

        Yeah, it's called disconnecting, and they are trained to do so. It's the only way a moral and just person can spend their days killing people. You don't want the guys who can do it all day long and call it what it is doing it; those people are dangerous and enjoy murder, so they will do things that shouldn't be done.

        On the other hand, most of the people doing these things are geeks just

        • by AmiMoJo ( 196126 )

          War? Remind me again when the US declared war on... Err... Remind me which country too.

          It's not a war, it's an anti-terrorism operation. An extreme form of policing. Comparisons to wars make no sense.

          Soldiers should feel bad when they kill civilians. It's a war crime to not take steps to avoid civilian casualties. Luckily for them the US doesn't allow its personnel to be prosecuted for war crimes, and no one can force them to.

    • In the 1920s, there were some who argued that aerial bombing would be more humane because they could be far more precise than field artillery, hitting only the target that you want to hit. Look how well that worked out.

      Pretty well. We have laser and image recognition guidance systems that can take out an individual pickup truck, or go through the window of a house and destroy basically only that single house ...

      I'd say they were right.

  • Landmines (Score:4, Insightful)

    by MrKaos ( 858439 ) on Friday February 12, 2016 @02:31AM (#51492849) Journal

    Landmines kill little kids without asking. Do we want more things killing automatically?

    • by LWATCDR ( 28044 )

      Autonomous weapons have been around for about 70 years, probably starting with the German T-4 torpedo, which they could fire in the general direction of a convoy and it would seek out a ship to sink. If you count landmines, it is at least 100 years.
      This is just a chance to make news and get attention.

    • by rhazz ( 2853871 )
      Landmines kill indiscriminately - that is exactly the problem with landmines. An autonomous weapon in this context would be something which can analyze a situation to some extent and potentially decide not to kill. The comparison adds nothing to the debate other than a knee-jerk emotional response.
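
      To make that distinction concrete, a hedged sketch in Python (both trigger functions and all their fields are hypothetical): a mine's trigger is an unconditional pressure switch, while an autonomous weapon at least evaluates a decision function that can return "hold fire".

        # Hypothetical contrast between a landmine and an autonomous weapon.
        def landmine_trigger(pressure_kg: float) -> bool:
            # Indiscriminate: anything heavy enough detonates it.
            return pressure_kg > 5.0

        def autonomous_trigger(track: dict) -> bool:
            # Can, in principle, decide NOT to kill; the quality of these
            # predicates is exactly what the debate is about.
            if not track.get("armed") or track.get("appears_civilian"):
                return False
            return track.get("confidence", 0.0) > 0.95

        print(landmine_trigger(30.0))   # True for a child and a soldier alike
        print(autonomous_trigger({"armed": True, "appears_civilian": False,
                                  "confidence": 0.97}))   # True
        print(autonomous_trigger({"armed": False}))       # False: holds fire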
  • by Hartree ( 191324 ) on Friday February 12, 2016 @03:22AM (#51492939)

    Many restricted military technologies are fairly easy to detect. Nuclear weapons require a massive industrial input that has well known signatures, for example.

    A robot is different. It can be something that's dual use. One day it's a regular robot: look at the software, it's all civilian, with a nice strong optics mount on it.

    Change the software, swap some of the camera gear for a small machine gun, and use the rest for aiming: now it's a killbot. This is just one example. Another obvious one is putting an autonomous drone software package into the flight computers of an airplane that can also be manned. This game can go on and on with just about any weapons system you can think of.

    It doesn't take industrial facilities that are different from usual ones to make them. If you can make versatile robots for civilian use and separately make weapons, you just have to put them together at the last minute. They don't have any particular signatures the way chemical weapons and their precursors do. Most nations are already making ordnance, so who's to say whether a human is going to be in the loop to fire it or if it's triggered by an AI?

    If people want to cheat on this, it'll be pretty easy to do so.

    So far, the landmine bans don't seem to have slowed down the planting of them a bit in various wars. We have to have demining teams not just for cleaning up old wars, but for the very ones that are going on now.

    I don't expect this to have a much greater effect.
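
    The dual-use point above can be put in code form; a minimal Python sketch (every name is hypothetical), in which the "civilian" software is complete and innocent on its own and weaponization is a last-minute swap of the mount:

      # Hypothetical dual-use sketch: the civilian robot ships with only
      # track_and_point(); nothing in its software reads as a weapon.
      def locate_target(image):
          return (120, 45)                  # stand-in for real vision code

      def track_and_point(image, mount):
          x, y = locate_target(image)
          mount.aim(x, y)                   # optics mount... or gun mount

      class CameraMount:
          def aim(self, x, y):
              print(f"camera -> ({x}, {y})")

      class GunMount:
          def aim(self, x, y):
              print(f"gun -> ({x}, {y}); fire")   # the last-minute addition

      track_and_point("frame", CameraMount())     # passes any software audit
      track_and_point("frame", GunMount())        # same software, now a killbot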

    • A robot is different. It can be something that's dual use.

      A bending unit that ... must kill all humans, perhaps?

  • by Firethorn ( 177587 ) on Friday February 12, 2016 @04:00AM (#51493023) Homepage Journal

    That's something of a question, isn't it? What does "meaningful human control" translate to? Does it mean that a human has to okay each weapon discharge? Does it mean that we aren't supposed to release a swarm of von Neumann kill-bots with 'destroy everything' as a goal? What if we release them in an area with orders to kill any humans with weapons that don't have a valid IFF signal?

    In addition, I've seen UN weapon ban treaties sometimes used as a 'we don't have them, so you shouldn't either' tool. Who's closest to these sorts of weapons? The USA. Who's NOT going to agree with any treaty limiting the effective use of these weapons? The USA. Rendering the ban useless.
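
    The IFF rule from the first paragraph, written out as a hedged sketch (the codes and fields are hypothetical): the only "meaningful human control" left is the set of parameters someone chose in advance.

      # Hypothetical engagement rule: kill any armed human without valid IFF.
      VALID_IFF_CODES = {"ALPHA-7", "BRAVO-2"}    # set by humans, beforehand

      def weapons_free(contact: dict) -> bool:
          if not contact.get("human") or not contact.get("armed"):
              return False
          return contact.get("iff") not in VALID_IFF_CODES

      print(weapons_free({"human": True, "armed": True, "iff": "ALPHA-7"}))  # False
      print(weapons_free({"human": True, "armed": True, "iff": None}))       # True
      # Note: an armed civilian with no transponder also returns True.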

  • ED-209 (Score:5, Insightful)

    by Coisiche ( 2000870 ) on Friday February 12, 2016 @04:54AM (#51493141)

    The original Robocop movie correctly predicted how these things might turn out. Infallible is not the adjective to apply.

    Still, at least it shot a company exec during a demo, so not a total failure.

  • "Some researchers argue that autonomous weapons would commit fewer battlefield atrocities than human beings"

    They would commit only as many atrocities as their masters giving the orders, e.g. the generals holding command, would allow them. Therefore it would still be a HUMAN being declaring the ROE: 5% civilian casualties allowed, 20%, 100%. The number would be set by a human. At least in the case of humans we can have other humans balking at atrocities and rebelling against orders or getting taken to t
  • I think we all know not to build Terminators.
  • by jenningsthecat ( 1525947 ) on Friday February 12, 2016 @07:57AM (#51493563)

    What a shame that in this relatively late stage of human development we should still be debating the moral pluses and minuses of applying technology to kill each other. Talking about "offensive autonomous weapons beyond meaningful human control" and "autonomous weapons would commit fewer battlefield atrocities than human beings" - shouldn't we be raising the level of discourse? A lot?

  • This 'autonomous weapons systems' debate is under attack, a hostile takeover by radical factions of the Artificial Intelligence research community. When I first glimpsed Hawking's phrase "offensive autonomous weapons beyond meaningful human control" I immediately thought, he's talking about land mines, right? - but no... it appears they had some hideously complicated Futurist Thing in mind. Okay... perhaps we're just talking about AI because it is fun to talk about it and it takes our minds away from other

  • I will give up my killbot when you pry the controller from my cold, dead hands! Because the Constitution!

  • The US does not have a population large enough to fight a traditional war against nations like China who can arm many millions of soldiers. We would be in the position of having to go to nuclear, germ or gas warfare in a land war with China and some other nations as well. The very nature of that kind of war would mean the death of many millions of noncombatants. It would also mean the slaughter of vast numbers of American troops and a devastation of our economy. But autonomous and drone
  • by gurps_npc ( 621217 ) on Friday February 12, 2016 @09:51AM (#51494091) Homepage
    Look, do you know how hard it is to convince a soldier to commit an atrocity? They don't just start doing it. It takes years and years of training and exposure to the horrors of war before soldiers are willing to do those things.

    But robots could be turned on and ordered to do atrocities that no human would agree to do.

    If a rogue general wanted to convince US soldiers to kill everyone in the White House, or to kill everyone in Congress, he would find it very difficult. It would take almost nothing for a rogue General to convince a squadron of drones to do the exact same thing.

    • There is no putting the genie back in the bottle. For all of human history nothing stopped the masses from overthrowing tyranny except the restraint and convictions of the people. But fast forward 250 years, where a mega corporation could have fully automated production of these drones, robots, and autonomous systems - the entire remaining 99.9999% of people on Earth could be wiped out with no recourse. I'm fairly sure we are seeing the final steps in the removal of any possible threat the people at large
  • does not approve of this message!
  • As mentioned elsewhere in this story and thread:

    Landmines are autonomous, durable, and persistent. Very dangerous. And easily forgotten by their employers.

    Claymore mines ditto, though less appreciated for their durability, and found so much more quickly.

    Technology has produced much more interesting autonomous weapons, mobile, with greater range, but if you're all worked up because they are being controlled by operators half a world away who high-five each other when they obliterate a wedding party, well,

  • In The Day the Earth Stood Still [wikipedia.org], Gort is a member of an autonomous robot police force built by aliens, programmed to preserve peace in the universe by destroying any aggressor. Thus forcing fickle and emotional sentient beings to behave.
  • ...unless you're also going to halt *all* AI development because any automated weapon without an AI can be controlled by any sufficiently advanced AI.

    For instance, all the UAVs could be controlled by an AI, thereby taking a non-AI weapon and making it an autonomous weapon. As long as you have remotely controlled weaponry and AI development - however disconnected they may be - you have the potential for an autonomous weapon that could be outside of human control.

    That said, a claymore is a very simple a
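
    A sketch of that point (hypothetical names throughout): a remote-control channel has no way to tell whether its commands come from a human console or from a software policy.

      # Hypothetical: the UAV's uplink can't tell a human from a policy loop.
      class UavLink:
          def steer(self, heading):
              print(f"heading {heading}")
          def fire(self):
              print("weapon released")

      def human_console(link):
          link.steer(90)                    # a person moved a stick

      def autopilot_policy(link, threat_detected):
          link.steer(90)                    # software computed the same input
          if threat_detected:
              link.fire()                   # the non-AI weapon is now autonomous

      link = UavLink()
      human_console(link)
      autopilot_policy(link, threat_detected=True)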
  • Dear Umbrella Corp Customer,

    We rolled out an automatic firmware update last night that seems to have a few bugs. If your AutoGunner 3000 killed your entire village, please follow these steps to restore it to factory mode:

    1) Using a paper clip, press the small reset button (located on the front of your AutoGunner 3000's turret, directly below the .308 barrel).
    2) Continue depressing the button for approximately 5 seconds. If the turret begins to move
