In 2004, when Google users discovered that the top search result for the word "Jew" was the anti-semitic site Jew Watch, Google ran a disclaimer in the space usually reserved for ads, explaining that their results only reflected the reality of link counts on the Web, and that they did not endorse any Web sites which appeared at the top of their listings. Now the disclaimer has been dusted off again, as the top result on Google Images for "Michelle Obama" is a picture of a monkey's face with Michelle's hairdo. (Ironically, it looks as if the original image would have fallen out of the rankings, if it hadn't been for a follow-up blog post about the controversy, which itself now comes up as the first result.)
I first heard about the controversy from Dennis Prager's column, in which he takes a New York Times columnist to task because the columnist complained about "racially offensive images of the first couple" that come up in Google searches. Prager was unable to find any examples by Googling "first couple" or "Michelle and Barack Obama pictures," so he concluded that the NYT columnist "wildly exaggerated, if not made up" his claims. I tried Google Image searches for "first couple," "Barack Obama," and some other terms, and I couldn't find anything controversial either. However, it only took 10 seconds to enter "first couple google images controversy" on the regular Google Web search and find multiple blog posts explaining what all the fuss was about. Back to Google 101 for Dennis.
Many of the blog posts refer to Google's disclaimer about not tampering with search results. Those on one side are urging Google to make an exception and "fix" the results, while others sagely observe that Google just reflects reality, it doesn't create it.
All of this punditry starts from a premise that's wrong. Google has actually removed pages from their search results — not because the pages were illegal or because the webmasters were search engine spamming, but because of the pages' "offensive" content. In the "Chester's Guide" incident, a city councilman in Chester, England discovered that one of the search results for "chester guide" was a satirical page titled "Chester's guide to picking up little girls." Although the page was obviously just someone's idea of sick humor, the councilman (who admitted that he hadn't looked at the page, saying that the title told him everything he needed to know) urged Google to remove it from their index. Google at first refused, but later manually blacklisted the page to prevent it from appearing in their search results.
Whether you think this was the right decision probably depends on what you think the purpose of Google is. If Google's purpose is to return the most useful results, then it made sense to remove the link, as Danny Sullivan of Search Engine Watch argued at the time, since it almost certainly was not a useful result for people searching for "chester guide." On the other hand, if the primary purpose of Google is to reflect the reality of what pages on the Web feature certain words most prominently (combined with all the other factors that Google weighs, of course), then the results shouldn't be altered.
But more people should at least realize that it happened. The Google disclaimer doesn't precisely say that they never blacklist pages or modify search results ("Google reserves the right to address such requests individually"), but it seems to give most people the impression that that's the case. According to that crudest of Googling techniques for which novice searchers are so frequently lampooned, there appear to be about 400 times as many stories on the Web about the Google "Jew Watch" controversy (where Google stood their ground) as there are stories about the "Chester's Guide" incident (where Google caved).
And Google-number-three Matt Cutts posted on his blog back in March explaining why Google does not remove "offensive" pages from search results; over a hundred comments followed, debating the pros and cons of the position, but none of them mentioned the Chester incident or any other case where Google actually had removed pages except as a result of a court order. One isolated comment from "Anonymous" said:
This is not quite true. I know of at least one web site that was de-listed for containing illegal content and/or promoting illegal activity.
which may or may not have been a reference to the Chester Guide incident. And that was it.
Is this a lot of hay to be making over something that happened years ago? Well, for one thing, I doubt if it happened just once. Consider that the Chester Guide incident involved a public declaration of outrage by a city council, and a public statement from Google, and still hardly anyone knows that it ever happened. If other incidents occurred without those high-profile elements, it would be even harder to discover them now. We'll probably never know how many such incidents took place, unless someone sues Google (maybe the owner of a blacklisted website, or maybe the victim of a RipOffReport hatchet job wondering why that site hadn't been blacklisted long ago), subpoenas Google for a list of cases where pages were de-indexed, and publishes the list if it's not sealed by a court order.
But whether it was one time or a handful, consider that political candidates like Arnold Schwarzenegger and Al Franken got asked during their campaigns about things they did 20 years earlier, and it's fair to ask a candidate about their past, because it's the same person standing in front of you now. Why did you do that? Have you stopped? Why?
And in the big scheme of things, Google is probably more powerful than a single US senator or the governor of California. So, can't we ask? What are their real rules about page removal? Have those rules changed since the Chester's Guide controversy? Can they even tell us what their rules are, or do they consider it a trade secret?
It is well known, of course, that Google censors some results in their search engines branded for different markets like China and even in liberal democracies like Germany. But nobody would call that a slippery slope towards censorship in the US version of Google, because the censorship in the Chinese and German versions is done at the behest of the governments there. On the other hand, Google does admit that they will de-index pages which include credit card numbers or social security numbers (which are all too easy to find on the Web). This might not seem like a controversial position, but even this act of voluntary self-censorship may be dipping their toe in the water further than it seems. Most people do consider their credit card information more private than their home address. But surely there are people like J.D. Salinger who care less about the privacy of their credit card number (which is easily changeable) than their home address (which isn't). If someone finds Salinger's address and posts it on the Web, should Salinger be able to demand that Google de-index the page? Why should Google cater to the majority who want to keep their credit card number secret, but not to the minority who care more about keeping their address secret? Another commenter on Matt Cutts's blog post asked:
"hi. I have a question. My mom 'googled' herself and it shows some of her medical problems. She wants/needs these pages removed from search engines."
Again, why shouldn't that be considered at least as private as a credit card number?
And finally, even Google's decision to display an "offensive results" disclaimer, for some results but not for others, raises the same "Where do you draw the line?" questions as the issue of page removal. The Michelle Obama monkey picture gets a disclaimer. But search for 'george w bush' and the first row includes a photoshopped (I think!) image of Bush flipping off the press. Does that warrant a disclaimer as well? (Maybe that's considered less unfair because, even though the picture is fake, it does depict something that actually happened.) The first image result for "bristol palin" is a photo of her engaged in underage drinking — a real photo, but probably unfair to call it the single most relevant photo of her on the Web.
So while Google might consider credit cards and social security numbers and search engine spam to be on one side of a "bright line," with everything else served up without alteration, I think the line is blurrier than that, for at least three reasons: (a) credit cards and SSNs are less private than some other things that Google serves up anyway; (b) Google has unambiguously removed some content that fell outside that bright line, as in the Chester's Guide incident; and (c) they make other "slippery slope" judgment calls about search results all the time (as in the question of when to show the disclaimer). So I hope that Google someday comes out with a more complete answer to the question. What is their real policy on what they will remove? The Chester's Guide incident — would they do that sort of thing if the same situation came up today, or have their rules changed? If they want to go really deep, is there a general set of principles from which their rules follow — explaining why, for example, they treat credit card numbers as more private than sensitive medical information? (Google did not respond to my request for comment, either through official channels or the unofficial back channels of friends who work there.)
I hope Google gives an answer some day. Even just to say, "It's a classified internal policy and that's all we're going to tell you." But once and for all, the answer is not "Google doesn't remove content just because it's 'offensive' or 'harmful.'"
Meanwhile, a modest suggestion about the disclaimer displayed above the search results: Put it where people will actually see it, on a separate line below the ads but above the search results. Right now the link to the disclaimer is displayed as one of three ads across the top, and people don't look at the ads. But hey, people do buy ads, so if you push the disclaimer down a bit to where people will read it, you also free up space for 50% more ad revenue!