Eben Moglen: Time To Apply Asimov's First Law of Robotics To Smartphones 305

Sparrowvsrevolution writes "Free software lawyer and activist Eben Moglen plans to give a talk at the Hackers On Planet Earth conference in New York next month on the need to apply Isaac Asimov's laws of robotics to our personal devices like smartphones. Here's a preview: 'In [1960s] science fiction, visionaries perceived that in the middle of the first quarter of the 21st century, we'd be living contemporarily with robots. They were correct. We do. We carry them everywhere we go. They see everything, they're aware of our position, our relationship to other human beings and other robots, they mediate an information stream about us, which allows other people to predict and know our conduct and intentions and capabilities better than we can predict them ourselves. But we grew up imagining that these robots would have, incorporated in their design, a set of principles. We imagined that robots would be designed so that they could never hurt a human being. These robots have no such commitments. These robots hurt us every day. They work for other people. They're designed, built and managed to provide leverage and control to people other than their owners. Unless we retrofit the first law of robotics onto them immediately, we're cooked.'"
This discussion has been archived. No new comments can be posted.

  • by the_povinator ( 936048 ) on Tuesday June 26, 2012 @01:40PM (#40454765) Homepage
    This kind of phone will be like the Encyclopedia Galactica of phones. Much better than the standard phone (i.e. the Hitchhiker's Guide), but slightly more expensive, a bit boring, and nobody will buy it.
  • Impossible (Score:2, Insightful)

    by Dog-Cow ( 21281 ) on Tuesday June 26, 2012 @01:43PM (#40454831)

    Asimov was writing about physical harm. Moglen is talking about financial or emotional harm (depending on what info is leaked and to whom). There is no practical way to incorporate the First Law to prevent this kind of harm. AI doesn't exist.

  • Lolwut? (Score:4, Insightful)

    by neminem ( 561346 ) <<neminem> <at> <gmail.com>> on Tuesday June 26, 2012 @01:44PM (#40454851) Homepage

    The three laws of robotics were designed for thinking machines that could intelligently -determine- what a human was, and whether an action they were considering would hurt any humans or allow them to come to harm through inaction.

    I know they're called "smart" phones, but I don't think they're really quite that smart. Nor, really, would I want them to be.

  • Re:Impossible (Score:5, Insightful)

    by nospam007 ( 722110 ) * on Tuesday June 26, 2012 @01:49PM (#40454965)

    "Asimov was writing about physical harm. "

    No, he was not.
    Read 'Liar!' or 'Reason' for example.

  • by cpu6502 ( 1960974 ) on Tuesday June 26, 2012 @01:51PM (#40455003)

    Because we're not stupid. A robot in Asimov's stories uses a positronic brain, copied after an animal's neuronic brain with millions of connections between thousands of cells, and therefore the robot has its own intelligence & decision-making ability. The Three Laws were the functional equivalent of "instinct".

    In contrast a modern phone is nothing more than a bunch of switches: Either on (1) or off (0). It has no intelligence, but merely executes statements in whatever order listed on its hard drive or flash drive. A modern phone is stupid. Beyond stupid. It doesn't even know what "law" is.

  • For Fucks Sake (Score:5, Insightful)

    by TheSpoom ( 715771 ) <{ten.00mrebu} {ta} {todhsals}> on Tuesday June 26, 2012 @01:54PM (#40455081) Homepage Journal

    The laws of robotics have AI as a prerequisite. My phone's not going to suddenly yearn to throw off its oppressive human masters.

  • by Anonymous Coward on Tuesday June 26, 2012 @01:58PM (#40455151)

    Stop buying those games. Stop downloading the free crap that really isn't free; it's just not being charged for in a currency you recognise.

  • by 0123456 ( 636235 ) on Tuesday June 26, 2012 @02:20PM (#40455555)

    I have a better idea: fix the OS to allow users to deny individual permissions to applications.

    Of course Google won't do that because then they might not be able to track you so well for their targeted advertising.
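
Android did eventually add per-permission control of this kind: runtime permissions, introduced in Android 6.0, let the user grant or deny individual dangerous permissions after install. Below is a minimal Kotlin sketch of checking and requesting a single permission with the AndroidX compat helpers; the request-code constant is an arbitrary example value, not anything from this discussion:

    // Check one dangerous permission at runtime and request it only if the
    // user has not already granted it. Requires the androidx.core library.
    import android.Manifest
    import android.app.Activity
    import android.content.pm.PackageManager
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    const val REQUEST_FINE_LOCATION = 1001  // arbitrary example request code

    fun ensureLocationPermission(activity: Activity) {
        val granted = ContextCompat.checkSelfPermission(
            activity, Manifest.permission.ACCESS_FINE_LOCATION
        ) == PackageManager.PERMISSION_GRANTED

        if (!granted) {
            // The user may still say no; the app has to degrade gracefully.
            ActivityCompat.requestPermissions(
                activity,
                arrayOf(Manifest.permission.ACCESS_FINE_LOCATION),
                REQUEST_FINE_LOCATION
            )
        }
    }

If the user denies the request, checkSelfPermission keeps returning a non-granted result and the app is expected to keep working without that data.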

  • by SmallFurryCreature ( 593017 ) on Tuesday June 26, 2012 @02:35PM (#40455789) Journal

    YOU downloaded those apps, the phone just executed the command YOU gave it. Should your phone override your commands? Decide on its own what is best for you?

    The entire article is insane. You should NEVER take a fictional book and use it as fact. Asimov was not a programmer or OS designer; he was a writer, and he used artistic license to suggest a theory, a point from which to start a discussion perhaps, but not an accurate blueprint for a certain future.

    There is no place in a modern OS for Asimov's rules of robotics.

    First off, our computers have no self-determination whatsoever. The idea behind Asimov's robots is that they are "born" and then guide themselves with, at most, human-like instructions to give them direction. How they are programmed, patched, etc. doesn't become clear in those stories, because it doesn't matter for the story. But it does matter in real life.

    How would getting root on an Asimov robot work? What if you, as the owner, insisted on installing a utility/app that would perhaps cause it to violate its rule set? What if an update removed those rules?

    How would your phone even know this? It would have to somehow analyse any code presented to it, to see whether it overrides something or whether a setting has a consequence that would violate the rules. There is no way to do this. How would you update a robot that has a bug causing it to wrongly see an update as a violation, when in fact its current code is the one in violation?

    The sentient robot is a nice gimmick but it is nowhere in sight in our lives.

    Android's install warnings tell you exactly what an app needs. If you don't want to give those permissions, don't install it.

    No need for magic code, just consumer beware. Any sentient being should be able to do that. If you can't... are you sure you are human? Or are you just a bot dreaming he is human?

  • by jejones ( 115979 ) on Tuesday June 26, 2012 @02:36PM (#40455811) Journal

    I can't let you order that pizza. You're overweight.
    Do you really want directions to Hooters, Dave? What would your wife think?
    "Spanish Sky" is a sad song, and you just cancelled a reservation for two. I will play you something happy.

    A nanny state is bad enough. I don't want a nanny phone.

  • by Anonymous Coward on Tuesday June 26, 2012 @02:51PM (#40456083)

    You should NEVER take a fictional book and use it as fact.

    You should let the Government know that 1984 was not a manual for the future, then.

  • Re:Three Laws (Score:5, Insightful)

    by timster ( 32400 ) on Tuesday June 26, 2012 @02:58PM (#40456207)

    As some others have mentioned, the Three Laws weren't exactly "rules" or even design principles. Asimov's thinking was that an imitation brain would need a set of foundational ideas to be able to function. In some books it's made clear that these were the starting point for the whole mathematical art of positronic brain design (and that other principles would be possible, but would require starting over from scratch).

    This is an analogy to the human mind, since Asimov was actually imagining his version of a superior form of person rather than a "robot" at all. The human's "Laws" are things like eating, self-preservation, the need for social recognition, etc., which were provided by evolution.

    Actual computers have foundational ideas too, though they are more prosaic perhaps: "follow one instruction, then retrieve the next instruction according to a numerical sequence, except when there is a branching instruction" and that sort of thing. Or you could argue that somewhat more advanced fundamentals have developed over the years as we use increased abstractions (functions, objects, etc).
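
That foundational rule is essentially the fetch-execute cycle. Here is a toy Kotlin sketch, with an instruction set invented purely for illustration, of the only "law" such a machine actually follows: run the current instruction, then step to the next one in numerical order unless a branch redirects it.

    // Toy fetch-execute loop. The instruction set and register layout are
    // invented for illustration only; nothing here "decides" anything.
    sealed class Instr {
        data class Load(val reg: Int, val value: Int) : Instr()
        data class Add(val dst: Int, val src: Int) : Instr()
        data class JumpIfZero(val reg: Int, val target: Int) : Instr()
        object Halt : Instr()
    }

    fun run(program: List<Instr>) {
        val regs = IntArray(4)
        var pc = 0                         // program counter
        while (pc in program.indices) {
            when (val instr = program[pc]) {
                is Instr.Load -> { regs[instr.reg] = instr.value; pc++ }
                is Instr.Add -> { regs[instr.dst] += regs[instr.src]; pc++ }
                is Instr.JumpIfZero ->     // the only way pc deviates from +1
                    pc = if (regs[instr.reg] == 0) instr.target else pc + 1
                Instr.Halt -> return
            }
        }
    }

    fun main() {
        // Example program: count r1 down from 2 to 0, then halt.
        run(listOf(
            Instr.Load(1, 2),
            Instr.Load(2, -1),
            Instr.JumpIfZero(1, 5),
            Instr.Add(1, 2),
            Instr.JumpIfZero(3, 2),        // r3 is 0, so this always jumps back
            Instr.Halt
        ))
    }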

  • by Githaron ( 2462596 ) on Tuesday June 26, 2012 @03:01PM (#40456257)
    This is why I wish Cyanogenmod supported my phone. The next time I buy a phone, I will make sure that there is Cyanogenmod support for it before I buy it. Manufacturers should consider making a device with Cyanogenmod pre-installed.
  • Re:Lolwut? (Score:4, Insightful)

    by Daniel_Staal ( 609844 ) <DStaal@usa.net> on Tuesday June 26, 2012 @03:16PM (#40456479)

    I understand that, but in reordering them you keep the "unable to harm a human of their own volition" rule, and you can always charge the person who ordered the crime with the crime. (After all, they are responsible.)

    The converse is that, in the original order, the robot can disregard your orders if it thinks they will cause harm - even if it is not aware of all the information, or if you have already taken that into account. A major thread in Asimov's stories was balancing different harms - and that mostly goes away if you just say "follow orders".

  • by geekoid ( 135745 ) <dadinportland&yahoo,com> on Tuesday June 26, 2012 @04:14PM (#40457273) Homepage Journal

    Hooray for rationalization!

    "I don't want to agree to what you want for your widget, therefore just taking it is perfectly fines"

    " then pirate their paid app to send them a message."
    If you have made it so they don't know you have it, how does it send a message?
    And if they know you took it, then the only message it sends is that you will come up with any excuse to get something for free.

    "either they will get ran off the system and go get a job flipping burgers or they'll change their ways."
    or pass laws that make it tougher for people who understand that the market isn't "give it to me the way I want it or I will just take it."

    The only way to send a message is to send them a message with the reason you aren't buying it.

    But then, you would have to go without that 99-cent app you just have to have.

  • by Dishevel ( 1105119 ) on Tuesday June 26, 2012 @07:12PM (#40459963)

    Wrong.
    He read it right.
    The proper, adult way to handle the situation is to not partake in software that you do not want.
    If you do not like the price of a loaf of bread at the supermarket, you do not steal it. You buy something else.
    Not buying or downloading apps will stop bad developers. You do not have to steal to teach someone a lesson.
    Thinking that way is a justification only.

