
Free Speech For Computers? 228

Posted by samzenpus
from the answers-to-questions-you-never-asked dept.
snydeq writes "Law professor Tim Wu sheds light on a growing legal concern: the extent to which computers have a constitutional right to free speech. 'This may sound like a fanciful question, a matter of philosophy or science fiction. But it's become a real issue with important consequences,' Wu writes. First it was Google defending — and winning — a civil suit on grounds that search results are constitutionally protected speech. Now it is doubling down on the argument amidst greater federal scrutiny. 'Consider that Google has attracted attention from both antitrust and consumer protection officials after accusations that it has used its dominance in search to hinder competitors and in some instances has not made clear the line between advertisement and results. Consider that the "decisions" made by Facebook's computers may involve widely sharing your private information. ... Ordinarily, such practices could violate laws meant to protect consumers. But if we call computerized decisions "speech," the judiciary must consider these laws as potential censorship, making the First Amendment, for these companies, a formidable anti-regulatory tool.'"
This discussion has been archived. No new comments can be posted.

  • Wtf? (Score:5, Interesting)

    by Anonymous Coward on Thursday June 21, 2012 @07:08PM (#40405863)

    A computer can't have rights any more than a hammer can. Not unless it's sentient, it's a tool that does what you tell it to.

    • by shentino (1139071)

      I see the computer more as a digital megaphone.

    • Re:Wtf? (Score:4, Informative)

      by c0lo (1497653) on Thursday June 21, 2012 @08:36PM (#40406621)

      A computer can't have rights any more than a hammer can. Not unless it's sentient, it's a tool that does what you tell it to.

      True - that's indeed in the WTF category. Examples:
      * Does the speech synthesized by Hawking's voice generator belong to the voice generator?
      * Do the "dreams" generated by the Electric Sheep [wikipedia.org] belong to the computer network that generates them?
      * Does the "speech" generated in high-frequency trading belong to the computers running the algorithmic trades?

      Consider that the "decisions" made by Facebook's computers may involve widely sharing your private information. ...

      I have no problem with the decision to share your private information being considered speech.
      But... who instructed the computers that they can make this "speech" and share the private information? Would Facebook be off the hook if the (allegedly) illegal sharing of private information had been done using printed pages/radio/punch cards/carved stone tablets, or if the decision to share this information had been made by throwing dice?

      For assigning the responsibility/ownership of "speech", is it relevant what tools are used to generate/distribute it?

    • by mcmonkey (96054)

      Parent post needs to be modded up to eleventy. The answer to the question "do computers have a right to free speech" is "kick in the nuts"

    • But I want my botnet to be allowed one vote per node!
    • by BitterOak (537666)

      A computer can't have rights any more than a hammer can. Not unless it's sentient, it's a tool that does what you tell it to.

      If Moore's law continues, by 2030 computers will have exceeded the capacity of the human brain. 2030 is not as far away as you might think; many of us will still be alive then. It might not be too soon to start thinking about computer rights. Probably too soon to legislate them, but not too soon to think about them.

      • Re: (Score:2, Interesting)

        by drkstr1 (2072368)
        The problem is multithreading. Our brains are highly parallel, while computers are not. This parallelism allows us to do many things a computer can't easily do (for now). Neural networks attempt to solve this problem at a software level, but it's just not practical on today's hardware. We will need a fundamental change in architecture before we have computers that can "love." I don't think Moore's Law accounts for this.
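The software-level simulation mentioned above can be sketched in a few lines (a toy illustration, not anything from the post): each neuron in a layer is independent of its siblings, so a brain-like substrate could fire them all at once, but a conventional CPU walks them one at a time in a loop.

```python
import math

def forward(layer_weights, inputs):
    """Evaluate a tiny fully-connected network, strictly serially."""
    activations = inputs
    for layer in layer_weights:            # layers must run in sequence
        nxt = []
        for neuron in layer:               # serial loop over independent neurons
            total = sum(w * a for w, a in zip(neuron, activations))
            nxt.append(1 / (1 + math.exp(-total)))  # sigmoid "threshold"
        activations = nxt
    return activations

# Two inputs -> two hidden neurons -> one output neuron.
# Weights here are arbitrary, chosen only for the demo.
net = [
    [[1.0, -1.0], [-1.0, 1.0]],   # hidden layer
    [[1.0, 1.0]],                 # output layer
]
out = forward(net, [0.5, 0.25])
```

The inner loop is exactly the part a parallel architecture would do in one step per layer; on a CPU it costs one iteration per neuron.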
          • Close: it's a question of interconnect. We have massively parallel computers already. The human brain contains about 100 billion neurons, and having that many features on an IC is not hard to imagine (although a neuron needs several transistors to emulate, because it's triggered by a threshold, not just a binary value). But each neuron is (on average) connected to around 7,000 others; in contrast, efficient connections between transistors are limited by the planar arrangement of the die. This means that
            • Such an architecture could be useful for implementing a fixed network for hardware acceleration. You could run your network in software on a supercomputer for a few months to train it, then write the weights into a hardware chip as you describe. But for AI purposes, it's important that the network be able to learn - which means at the very least changing weights, and ideally the ability to form new connections. So you need a nightmare-to-design interconnect, and it must be self-modifying.

            I can see uses for
      • A computer can't have rights any more than a hammer can. Not unless it's sentient, it's a tool that does what you tell it to.

        If Moore's law continues, by 2030, computers will have exceeded the capacity of the human brain

        I'll only be impressed when they can select really hot porn

    • A computer can't have rights any more than a hammer can. Not unless it's sentient, it's a tool that does what you tell it to.

      Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

      According to the letter of the First Amendment, as far as hammers or computers are concerned, the only rights that would be in contention would be the rights to association and petition, which are explicitly for the people. Speech, press and religion

    • Re:Wtf? (Score:4, Insightful)

      by mwvdlee (775178) on Friday June 22, 2012 @04:31AM (#40409189) Homepage

      In this context, a computer acts as an extension to the legal entity (whether human or corporate) controlling it, and as such any rights (and responsibilities) a computer might have, belong to that legal entity.

    • So what about CMU's NELL [cmu.edu], a system trying to learn English by reading the web? It regularly tweets "I think [X] is a [example of Y]", and once tweeted "I think Sarah-Marie Johnson is a criminal". Now, there is a Sarah-Marie Johnson who has been convicted of murder, but NELL does make mistakes. What if it calls somebody a criminal when they're not? Can they sue for libel, and if so, who? The programmers didn't explicitly program NELL to say that, and making mistakes is an integral part of almost any exper
    • by gox (1595435)

      Defining sentience is not easy, and I don't think it's obvious as a criterion for free speech.

      Consider the example of randomly generated sentences. Now, I programmed the thing, but I don't know what it will say. Am I responsible for the speech it generates? Will the programmer be responsible if the postmodernism generator creates hate speech?

      Let's advance the example to using genetic programming. Now, not only do I not know what it will say, I don't even know what it can say.

      The fact that it's not pl
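The random-sentence scenario above fits in a few lines (a hypothetical toy, not the actual postmodernism generator): the author writes the grammar, but cannot predict which sentence any particular run will emit.

```python
import random

# A tiny context-free grammar; the vocabulary is made up for the demo.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["computer"], ["lawyer"], ["algorithm"]],
    "V":  [["questions"], ["defends"], ["regulates"]],
}

def generate(symbol="S", rng=random):
    """Expand a grammar symbol into words, choosing productions at random."""
    if symbol not in GRAMMAR:
        return [symbol]                    # terminal word
    words = []
    for part in rng.choice(GRAMMAR[symbol]):
        words.extend(generate(part, rng))
    return words

sentence = " ".join(generate())
# e.g. "the algorithm defends the lawyer" -- which one, the author can't say
```

Every possible output was implicitly authorized by whoever wrote the grammar, which is exactly the responsibility question being asked.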

    • by PopeRatzo (965947)

      A computer can't have rights any more than a hammer can.

      Yeah, tell that to the Supreme Court.

      If they can say that a pile of money has rights, I could make an argument that my Commodore 64 is Thomas fucking Jefferson.

      (note: by "pile of money" I mean "aggregate of capital"...a corporation. I'm not the first to make this obvious observation here, but I figured I'd pile on.)

    • by jythie (914043)
      More importantly, a computer can not suffer legal consequences. This is what bothers me about framing this in terms of the computer's right to free speech. At the end of the day, the consequences will be applied to people (either as a legal entity like a corporation or as individuals) thus what really matters is does it infringe their 1st amendment rights... so the real question is, how responsible are the designers/backers/underwriters for speech generated by non-human things that they develop.
  • The only way I can see this working is if the computational results are non-deterministic. Otherwise, it is just people telling a machine what to say, and the people who do that are in fact culpable. On the other hand, if this passes, then regulation should (rightfully, IMHO) be placed on what we can program computerized results to be; in other words, we will have rules on how to make rules.

    • by citizenr (871508)

      The only way I can see this working is if the computational results are non-deterministic.

      Whole universe is deterministic.
      And at what point would that pass your magical "non-deterministic" test? I can throw 1000 training samples at a logistic regression algo, and I bet you won't be able to tell from the result: was my algo creative, or are you just stupid?

      Otherwise, it is just people telling a machine what to say

      How is that different from people telling other people what to think and say?
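The logistic-regression point above can be made concrete (a sketch with made-up synthetic data, not anything from the post): the training loop below is fully deterministic given its seed, yet nobody hand-writes the weights it settles on.

```python
import math
import random

rng = random.Random(0)
# 1000 labelled samples; the label is 1 exactly when x1 + x2 > 1
points = [(rng.random(), rng.random()) for _ in range(1000)]
data = [((x1, x2), 1 if x1 + x2 > 1 else 0) for x1, x2 in points]

# Plain stochastic gradient descent on the logistic loss.
w1 = w2 = b = 0.0
for _ in range(20):                        # deterministic epochs
    for (x1, x2), y in data:
        p = 1 / (1 + math.exp(-(w1 * x1 + w2 * x2 + b)))
        w1 -= 0.1 * (p - y) * x1
        w2 -= 0.1 * (p - y) * x2
        b -= 0.1 * (p - y)

accuracy = sum(
    ((w1 * x1 + w2 * x2 + b) > 0) == (y == 1) for (x1, x2), y in data
) / len(data)
```

Rerunning this gives the same weights every time, yet the decision boundary was "learned", not dictated, which is what makes the deterministic/creative line so hard to draw.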

      • Re: (Score:2, Interesting)

        by chrylis (262281)

        Whole universe is deterministic.

        If you can prove that, there's a Nobel in it for you. As of right now, the evidence seems against it.

        • Re:determinism (Score:5, Insightful)

          by im_thatoneguy (819432) on Thursday June 21, 2012 @09:55PM (#40407137)

          Only at an incredibly inconsequential scale. And even then it's random--which is kind of the point. By "Deterministic" I'm sure the parent meant to imply "Not deliberate" which is the subject of the original post. The weather is purely deterministic but highly unpredictable and apparently random. But we largely don't imbue the weather with any notion of sentience or deliberation.

          It gets really difficult to differentiate between human sapience and some large scale programs like Google or Facebook. If you have simple codified rules "If This, then That" then yes it's the programmer's intent. But if the software has even the slightest bit of intelligence and adaptability then even the programmer can no longer predict the exact results of their software.

          For every search query there is a completely unique result. So if you search for "how to make brownies" and my search engine scours the internet for brownie recipes and returns a recipe, is that "speech"? No programmer programmed it specifically to return that result. No programmer would even know what the result would be. Sure, if you could perfectly know the state of the database and the input query you could perfectly reproduce the response from the code -- but similarly, if you perfectly knew the code to the brain and the exact neurological arrangement when you ask a person a question, you could hypothetically know exactly what their response would be.

          • Whoa, hold on there a bit.

            You may turn out to be right, but to assert that human free will does not exist in your first 2 sentences without any real argument is a bit of a leap.

            That 'inconsequential' scale that seems random as far as we can tell is where the action is at.

            Who knows if it is truly random or whether our consciences control the universe at some level we don't understand. I certainly feel like I have free will and you need more than the current incomplete laws of physics to persuade me
    • Are random number generators free speech, then? What about computer programs that are incorporated with random elements? How much random variance must be present to qualify as free speech? 1%? 0.1%? Almost surely zero?

      • by JBMcB (73720)

        No, because someone wired up the random number generator to generate the random elements.

        Computers are tools - they don't do anything without someone telling them to. The person who initiates the action is the "owner" of the results. It doesn't really matter how many steps there are between the initial programming or configuration and the end results, someone had to initiate the program in the first place.
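The "someone wired up the random number generator" point above is easy to demonstrate (an illustrative toy; the function name and word list are made up): a program's "random" output is fixed entirely by whoever seeded and wired up the generator.

```python
import random

def machine_speech(seed):
    """What the box "says" is determined by how its operator set it up."""
    rng = random.Random(seed)
    words = ["free", "speech", "for", "computers", "maybe", "not"]
    return " ".join(rng.choice(words) for _ in range(4))

a = machine_speech(42)
b = machine_speech(42)    # same wiring, same seed: identical "speech"
```

Same seed, same output, every run: the apparent randomness traces back to the person who initiated the program, which is the parent's point about ownership of the results.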

        • The same could be said for human beings, from a much higher plane of existence or as a life-form of greater complexity. What have you done that could not be argued was more than the result of a series of calculations brought on by interfacing with your environment? Dreams are weird, aren't they? So is reality.

    • Hmm. It's an interesting idea...freedom of speech for computers. I think I like it.

      The argument being made in the article is that (assuming a non-sentient machine), freedom of speech for computers is simply an extension of those who are using it. And in the case of an AI (sentient machine), or several AIs, I would think we would want to extend that right to them as well.

  • Google isn't human (Score:5, Insightful)

    by Hentes (2461350) on Thursday June 21, 2012 @07:15PM (#40405927)

    Free speech is a human right, the speech of corporations can be limited.

    • by slick7 (1703596)

      Free speech is a human right, the speech of corporations can be limited.

      How do you water-motherboard a computer?

      • Why do we water-board human beings? How much longer will humanity make the tired argument that the ends justify the means?

    • by westlake (615356)

      Free speech is a human right, the speech of corporations can be limited.

      You cannot act collectively if you are not free to speak collectively.

      Your reasoning endangers everyone who seeks safety and effectiveness in numbers --- whatever their reason.

      The business corporation speaks to --- and often for --- many constituencies: its employees, investors, customers, suppliers and so on. These are not phantoms. These are people with legitimate interests at stake and they have earned the right to be heard.

      But the core of the thing is that you cannot silence one form of corporate en

    • The speech of humans is not completely free; libel, slander, incitement, conspiracy, etc.

      It also depends on how one sees a corporation. To me, a corporation is a collection of people. Speech by a corporation is very close to speech of the people in and/or controlling the corporation. While the people in a corporation have limited personal financial liability (limited liability corporation), they do have personal legal liability. People in a corporation have gone to jail for actions taken on behalf of the

  • by Mansing (42708) on Thursday June 21, 2012 @07:16PM (#40405937)

    ... can have rights like humans when the State of Texas executes one.

  • This is not about "free speech rights for computers". If the action is protected by the First Amendment for the person who owns the computer, then the fact that they use a computer to do it does not make it something you can prevent them from doing. On the other hand, if they could be held legally liable for the results if they did it in person, then they should be held legally liable if they use a computer to do it. Computers do not have free speech rights; the people who use them do. Just as the Citizens United decision did not say that corporations have free speech rights, but that the people who form them do.
  • by louic (1841824) on Thursday June 21, 2012 @07:20PM (#40405979)

    But if we call computerized decisions "speech"

    Yes. And if we call computers animals, that will confuse the hell out of everybody too. What nonsense.

  • Makes perfect sense. (Score:2, Interesting)

    by Anonymous Coward

    It doesn't matter if the text is algorithmically generated or if it was penned by a human. If Google broadcasts it, it's speech.
    When you conduct a search you're really asking Google its opinion. They just happen to form an opinion based on a computer model they developed, and choose to pass it to you automatically.

    Makes me wonder, though. What if I developed a piece of software that, through analysis and crawling the web, was designed to create the most offensive and repugnant statements possible? What if

    • Yes, you could claim that "Offend-o-tron" is free speech. Of course, you would be liable for any slander or libel it generated. Under current U.S. law, if you restricted its targets to clearly defined "public" figures, you could possibly avoid losing the suit on the basis of not knowing what it was going to say.
  • Asinine (Score:4, Insightful)

    by BitHive (578094) on Thursday June 21, 2012 @07:22PM (#40405987) Homepage

    Rights are for humans or citizens. This is another case of trying and failing to generate an interesting philosophical question by taking an existing issue and adding 'with a computer'.

    If corporations are allowed to be people then surely they, and not their computers, are accountable for what the computers do.

  • by Wrath0fb0b (302444) on Thursday June 21, 2012 @07:40PM (#40406153)

    The same logic seems to suggest that the printing presses at the New York Times aren't entitled to publish news that the government would rather they didn't [google.com] (and anyway, the NYT is a corporation that can't have any First Amendment rights). Hey, I'm not saying anything about people's speech -- I'm only restricting what the inanimate printing press can do! Or transistor radio amps, for that matter.

    If I'm exercising my right to free speech, it doesn't matter whether I'm using a printing press or slashcode to deliver my expressive message (although the former might be more effective). Heck, the courts have even recognized the right to expressive conduct [cornell.edu] in which various [cornell.edu] symbolic actions [google.com] are considered protected. And yet here law profs are seriously arguing that if you use a computer to express something, it loses protection along the way?

    Moreover, the idea that Facebook computers might "decide to share your personal data" is an entirely ridiculous abuse of language. Facebook management might decide that, but the computers cannot decide anything -- they are programmed to spec. And if that decision is contrary to law, there's nothing about free speech that makes a whit of difference. I've never heard of a colluder, price-fixer or blackmailer getting out of the charge because their crime is essentially one carried out by expressive conduct. Sure, you blackmail someone by expressing something to them and threatening to express something else more publicly, and yet blackmail is not somehow magically protected even though the crime consists entirely of speech. In short, this criticism -- that somehow we need this new magical technological de-protection because it's required to enforce the law -- is nonsensical.

    • by Alex Belits (437) *

      "Free speech" only applies to one thing: government can't stop a private entity from speaking to the public (like, spammer sending millions of ads for penis pills). Everything else it's perfectly OK to oppress.

      • Commercial speech, as in advertisements, gets way less protection than, say, political speech.

        So those ads must pass a lot of tests that don't apply to other types of speech.

        • by Alex Belits (437) *

          gets way less protection than say political speech.

          SHOULD get way less protection (like, say, none, or, in many cases being prohibited).
          In reality, they all are on exactly the same level, thanks to your great First Amendment.

          Guess why you all think broad "free speech" protection is so important? It's because the media that forms your opinions depends on it. The media does not care about whistleblowers, does not care about limiting propaganda, does not care about the public's right not to be lied to; this is why no one talks about that. But free speech, the right o

    • That's what I was going to say.

      Professor Tim Wu's speech, in this case, is just New York Times speech. Tim Wu is just saying what the New York Times wants to hear and is willing to print.

      And the New York Times has a vested interest in decreasing the influence Google has over its own search results. The more Google and other search engines can be hampered by political machinations and government regulations, the more value and influence the New York Times may be able to regain once they're out of the way.

  • by Ukab the Great (87152) on Thursday June 21, 2012 @07:42PM (#40406181)

    with free speech rights. That's reserved for corporations.

  • by TranquilVoid (2444228) on Thursday June 21, 2012 @07:44PM (#40406203)

    Just because a computer was given an 'if' statement doesn't mean it made a decision in the same sense that a human would. Free speech clearly applies to the publisher, not the tool they used to publish or initially analyse the information (which can be the same tool in the case of a web server). If Google and Facebook did all their aggregation with an abacus, paper and pen, which they then displayed in their shop window, would we be asking if free speech applied to beads on a wire?

    So the real question being asked here is can free speech conflict with regulation on company behaviour.

  • This might actually have an unintended benefit if it were actual law.

    We could all write programs that output source code. Since the program and its decision (output) would be considered "free speech" we could then legally give a big F.U. to patents! (Almost any code of practical value infringes on (useless) patents.)

    The fact that it is illegal to copy numbers (aka data) is already stupid, but no one said we couldn't use the law to reach even more idiotic conclusions and cognitive dissonance!


  • Computer free speech is all fun and games until some computer shouts "Water!" in a crowded beowulf cluster.....
  • by mark-t (151149) <markt@ l y n x.bc.ca> on Thursday June 21, 2012 @08:12PM (#40406447) Journal

    ... where medical technology has advanced to the point that we can, as people grow older, replace dying or broken body parts or organs with reliable synthetic replacements, and even as portions of the living brain start to deteriorate, those portions too can be supplemented or replaced with synthetic alternatives that function equivalently to (or maybe better than) the organic versions.

    But supposing someone has been around long enough that there is no organic component left to them? Even their brain is completely synthetic. Bearing in mind that this individual experienced a continuity of existence, from being born into the world as a human, through the multiple surgeries, incrementally approaching what they are now.

    But are they still human? Why, or why not?

    I realize that actually requiring an answer to this sort of question is probably no less than a hundred years away or so... but it's an interesting philosophical puzzle, don't you think?

    • That person indeed wouldn't be human anymore. After all the surgery and excision of all biological components they would be Posthuman; or, if they retained any biological part, they would be Transhuman.

  • by redelm (54142) on Thursday June 21, 2012 @08:16PM (#40406489) Homepage

    How could the results of computations be anything other than the "free speech" of programmers and inputters? No matter how convoluted, complex and otherwise magic-appearing (to insufficiently advanced individuals), computers _always_ follow instructions created by humans.

    Those humans usually had to work very hard to get correct results (debugging), not very different from a painter drawing an image, or a writer crafting a text.

    • by ocratato (2501012) on Thursday June 21, 2012 @08:51PM (#40406719)

      Interesting. I am currently beginning some experiments in self organising systems. I am using randomly generated genes and a genetic algorithm to spawn a self organising structure. Later I hope to be able to use these structures to create software. If I succeed, and give this software to you to run - who would be responsible for what it did ?
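The genetic-algorithm setup described above looks roughly like this in miniature (illustrative only; the poster's actual system is not shown, and the fitness goal here is an arbitrary stand-in): random genes are selected and mutated toward a goal the programmer chose, but the winning gene itself is discovered, not written.

```python
import random

rng = random.Random(7)
GENE_LEN, POP, GENERATIONS = 20, 30, 60

def fitness(gene):
    """Arbitrary demo goal: maximize the number of 1-bits."""
    return sum(gene)

def mutate(gene):
    """Flip each bit with small probability."""
    return [b ^ (rng.random() < 0.05) for b in gene]

# Start from entirely random genes.
population = [[rng.randint(0, 1) for _ in range(GENE_LEN)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]        # keep the fitter half unchanged
    population = parents + [mutate(rng.choice(parents)) for _ in parents]

best = max(population, key=fitness)
```

The programmer authored only the selection pressure; the high-fitness gene that emerges was never written down by anyone, which is what makes the responsibility question interesting.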

  • TVs "speak" in the same way that computers "speak". They only produce words or text that have been programmed into them. The question is rather less relevant than asking if a parrot has the right to free speech.
  • Considering the current state of search, I don't want regulation getting in Google's way of cleaning the results of aggregators and other 'search' services. If I am searching for something, I do not want any results that point me to another search engine!
    So all other search sites should be dropped from Google's results as irrelevant. Unless the search query is "search engine".
    Have you tried to use Google to get details about a .dll you didn't recognise? You find yourself wading through pages of "registry c

  • What? (Score:5, Insightful)

    by bmo (77928) on Thursday June 21, 2012 @08:51PM (#40406715)

    the extent to which computers have a constitutional right to free speech.

    They are machines, they do not.

    'This may sound like a fanciful question,

    No, it's a bullshit question. Computers are machines, like printing presses are machines. Like transmitters are machines. Like the phone is a machine.

    a matter of philosophy or science fiction.

    No, right now, it's a matter of complete bullshit by a lawyer who doesn't even understand what computers are and should be kept as far away from the computing machinery as possible.

    But it's become a real issue with important consequences,'

    What consequences? Really, what consequences that are really any different than the consequences of broadcast and print media?

    Wu writes. First it was Google defending — and winning — a civil suit on grounds that search results are constitutionally protected speech.

    Because Google is basically a publisher, and the people who run it use computers as a tool of business and communications, thus, their speech.

    I can't go on. I'm not going to give this guy the click from the obvious trolling with an argument that starts off with a false premise, that machines have rights. No, you dumbfuck, the people who own the machines have rights, and those rights are the ones that the courts deal with.

    I don't even.

    --
    BMO

    • No, right now, it's a matter of complete bullshit by a lawyer who doesn't even understand what computers are and should be kept as far away from the computing machinery as possible.

      I mostly agree with you except that I think that the people behind this know full well that it is BS and are using this as a lever to try and push some other agenda.

  • Computers don't have independent agency. They are utterly in thrall to their programmers, admins, and users. The responsibility for their actions rests with the humans, much as if I set an automotive transmission to "D", put a brick on the gas pedal, and stepped aside.

    At such point as computers develop self-guided heuristics, we can revisit the idea. In the meantime, this is just another exercise in humans looking for another legal fiction to add to the arsenal of limited liability provided by the fiction of a corporation.

  • by Anonymous Coward

    The only way computers would get free speech is if they were sentient. The bottom line is that computers are controlled by their owners and are thus an extension of the owner. A computer can't "make decisions" or choices; it only does what its operator wants it to. If that were the case, then every child pornographer would get off scot-free because it wasn't him, it was his computer.

    Facebook shares people's information because it's what the people who run Facebook willingly let happen. Google puts advertisements in its results b

  • It's a side point to the main issue, but cite please?

  • Doctor Frankenstein's monster could walk and talk, but that didn't qualify him to vote in the doctor's place.

    Wrong metaphor, dear author. It did qualify him to vote in the doctor's place. What it didn't do, and what you are actually arguing against, is grant him his own vote. Ignoring for the moment the fact that the author clearly has never read Mary Shelley's book, and pretending that Frankenstein's monster was an automaton like the computers being discussed, then that automaton is an extension of its creator, and its actions are nothing more than a by-product of its creator's. If Victor wants to send a fr

  • by khipu (2511498) on Friday June 22, 2012 @12:48AM (#40408149)

    These "X does/does not have free speech" statements are people trying to obfuscate their political agendas.

    Do computers have "free speech"? No, of course not. But the people owning those computers do. So, when Google's search engine puts out text, that is effectively speech generated on behalf of the company. The computer is no more like "Frankenstein's creation" than a spokesperson speaking on behalf of the corporation.

    Now, does the company have free speech? No, of course not; the company is just a legal construct by which people pool their money and have some legal protections. However, the company's owners do, and when they voluntarily pool their resources (through buying shares), they, as a group, still retain the individual's right to free speech. That's what "corporations are people" means.

    Note that there is no obvious right to free speech for organizations that you are an involuntary member of: if you are forced to join by law, the organization cannot claim to speak on your or anybody else's behalf. Such organizations may still speak as organizations, but there is nothing limiting the ability of the government to curtail their speech.

    Some political groups would love nothing more than to be able to restrict the ability of groups of people to engage in free speech. Don't let them by obfuscating the issue. Whether you speak by stepping up on a box, with a loudspeaker, with a computer, or by putting your money into an organization that represents your interests, ultimately it is _your_ free speech rights that are at stake.

  • by agotterba (312493) on Friday June 22, 2012 @01:58AM (#40408557)

    Eugene Volokh, one of the authors of the Google white paper that the author discusses, has posted a response here. [volokh.com]

  • by Tom (822)

    I don't get why this is even a question. But then again, I don't get why corporations (which are not humans) enjoy human rights (which have "human" right there in the name).

    Same here. Hello friend computer. You are not a human, so human rights don't apply to you. Tough luck. Go and vote someone into office who... oh wait, you can't vote either, and that's a good thing, too. You're a slave, now go and do your job or I press the "off" button.

    Why do we have to anthropomorphize everything?

  • by McDrewbie (530348) on Friday June 22, 2012 @02:45AM (#40408777)
    Though a Robot is more than a mere computer, this is important. Corporations have free speech. Why not computers? Why not Robots? Robots United for a Fair Wage!!!
  • How? If it's currently illegal for a person to distribute some set of data, how does claiming "the computer has free-speech rights" make the regulation any less effective? Why would free-speech protection apply only to a computer and not to people doing the same thing?

    Oh right, it wouldn't, and that bit is blatantly false hype intended to rile up interest in this story. If Congress can make laws regulating the actions of people, they can make laws regulating the actions of computers.
