Cellphones Privacy Sci-Fi Security Your Rights Online

Eben Moglen: Time To Apply Asimov's First Law of Robotics To Smartphones 305

Posted by Soulskill
from the or-through-inaction-allow-texting-at-the-dinner-table dept.
Sparrowvsrevolution writes "Free software lawyer and activist Eben Moglen plans to give a talk at the Hackers On Planet Earth conference in New York next month on the need to apply Isaac Asimov's laws of robotics to our personal devices like smartphones. Here's a preview: 'In [1960s] science fiction, visionaries perceived that in the middle of the first quarter of the 21st century, we'd be living contemporarily with robots. They were correct. We do. We carry them everywhere we go. They see everything, they're aware of our position, our relationship to other human beings and other robots, they mediate an information stream about us, which allows other people to predict and know our conduct and intentions and capabilities better than we can predict them ourselves. But we grew up imagining that these robots would have, incorporated in their design, a set of principles. We imagined that robots would be designed so that they could never hurt a human being. These robots have no such commitments. These robots hurt us every day. They work for other people. They're designed, built and managed to provide leverage and control to people other than their owners. Unless we retrofit the first law of robotics onto them immediately, we're cooked.'"

  • Three Laws (Score:5, Informative)

    by SJHillman (1966756) on Tuesday June 26, 2012 @01:41PM (#40454791)

    To those who don't remember, Asimov's Three Laws are:

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
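    The laws form a strict precedence: a lower law always yields to a higher one. A hypothetical sketch (not from the thread, with made-up action fields) of that ordering as lexicographic comparison:

    ```python
    # Hypothetical illustration: Asimov's Three Laws as a strict priority ordering.
    # Each candidate action is a dict with (assumed) boolean fields describing
    # which law it would violate.

    def law_violations(action):
        """Return the action's violations as a tuple in priority order.

        Tuples compare lexicographically, so a First Law violation
        outweighs any number of lower-law violations.
        """
        return (
            action.get("harms_human", False),     # First Law
            action.get("disobeys_order", False),  # Second Law
            action.get("endangers_self", False),  # Third Law
        )

    def choose_action(candidates):
        """Pick the candidate whose violation tuple is smallest."""
        return min(candidates, key=law_violations)

    # Example: ordered to do something harmful, the robot must refuse,
    # because disobedience (Second Law) ranks below harm (First Law).
    candidates = [
        {"name": "obey", "harms_human": True},
        {"name": "refuse", "disobeys_order": True},
    ]
    print(choose_action(candidates)["name"])  # refuse
    ```

    Moglen's complaint, in these terms, is that today's devices effectively evaluate their owner's interests last, not first.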

  • Re:Three Laws (Score:5, Informative)

    by Daniel_is_Legnd (1447519) on Tuesday June 26, 2012 @01:44PM (#40454861)
    And the zeroth law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
  • by Anonymous Coward on Tuesday June 26, 2012 @01:45PM (#40454871)

    Pessimistic prediction of future rules of robotics:

    Rule -1: A robot may not permit, and must actively prevent, a human breaking any law or government regulation.
    Rule 0: A robot must prevent a human from copying or making fair use of any copyrighted work that has DRM applied to it.
    Rule 1: A robot may not harm a human, or through inaction allow a human to be harmed, unless it would contradict Rule 0 or Rule -1.

    I'd prefer my computers to put the second law above all others:

    A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

    That's why I prefer Free software. An electronic device should always follow the commands and wishes of its owner.

    • No DRM or other rules that prevent me from using my multimedia and documents the ways I want.
    • No intentionally annoying programs.
    • No artificial software limitations to get you to upgrade to the "enterprise version".
    • No privacy-violating tracking systems.
    • No locked user interfaces that prevent scripting and automation.

    If Free software does something other than my will, it's because of a bug. If proprietary commercial software does something other than my will, it's usually behavior intended by the manufacturer.

  • Re:Impossible (Score:5, Informative)

    by charlesbakerharris (623282) on Tuesday June 26, 2012 @01:51PM (#40455001)
    If you'd actually read Asimov, you'd know that emotional and financial harm would both have fallen under the same First Law umbrella as physical harm, in his canon.
  • Re:Three Laws (Score:5, Informative)

    by Imagix (695350) on Tuesday June 26, 2012 @02:51PM (#40456085)

    in that universe at least, it was impossible to build a robot free from the 3 laws

    Actually, it wasn't impossible, just that U.S. Robots and Mechanical Men didn't build them (generally). And only USR could build the positronic brains. Recall that in "Little Lost Robot", they'd built a robot with the First Law modified to "No robot may cause harm to a human", dropping the "or, through inaction..." clause.
