
Why Google's Wi-Fi Payload Collection Was Inadvertent 267

Reader Lauren Weinstein found a blog post that gives a good, fairly technical explanation of why Google's collection of Wi-Fi payload data was incidental, and why it's easy to collect Wi-Fi payload data accidentally in the course of mapping Wi-Fi access points. "Although some people are suspicious of their explanation, Google is almost certainly telling the truth when it claims it was an accident. The technology for Wi-Fi scanning means it's easy to inadvertently capture too much information, and be unaware of it. ... It's really easy to protect your data: simply turn on WPA. This completely stops Google (or anybody else) from spying on your private data. ... Laws against this won't stop the bad guys (hackers). They will only unfairly punish good guys (like Google) whenever they make a mistake. ... [A]nybody who has experience in Wi-Fi mapping would believe Google. Data packets help Google find more access points and triangulate them, yet the payload of the packets does nothing useful for Google because they are only fragments."
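The summary's mapping claim can be illustrated with a toy sketch (plain Python with made-up frame records; no real capture library or 802.11 parsing is involved): every overheard frame is another sighting of an access point, while the payload bytes contribute nothing to locating it.

```python
# Toy illustration of the mapping claim (hypothetical frame records).
# Any overheard frame -- beacon or data -- is another sighting of an
# access point, useful for triangulation; a data frame's payload is
# just a useless fragment for that purpose.

frames = [
    {"kind": "beacon", "bssid": "aa:bb:cc:dd:ee:01", "rssi": -40, "payload": b""},
    {"kind": "data",   "bssid": "aa:bb:cc:dd:ee:01", "rssi": -55, "payload": b"...frag"},
    # AP 02 never beaconed while the car drove past, but a client's
    # data frame still reveals its existence and signal strength:
    {"kind": "data",   "bssid": "aa:bb:cc:dd:ee:02", "rssi": -70, "payload": b"GET /..."},
]

# Collect signal-strength sightings per BSSID; payloads are never consulted.
sightings = {}
for f in frames:
    sightings.setdefault(f["bssid"], []).append(f["rssi"])

print(sorted(sightings))  # -> ['aa:bb:cc:dd:ee:01', 'aa:bb:cc:dd:ee:02']
```

Note that the sketch also shows where the accident lives: if the code above logged the raw `frames` to disk instead of the `sightings` it derives, the payload bytes would be stored too, even though nothing ever reads them.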
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:So? (Score:3, Interesting)

    by erroneus ( 253617 ) on Saturday June 19, 2010 @02:43PM (#32626764) Homepage

    So far, nothing explains why they stored the data. Recording names of access points? Okay. Recording locations of access points? Mmmmaybe. Recording data retrieved from unsecured access points? No. How can that data be used for any honest purpose? And let's be clear about this: collecting and storing data is an act directed by software, which was written by a person or persons acting under direction, ostensibly by specification. Find those specifications and those directors, and you will come closer to finding the truth, as well as those responsible.

  • Re:So? (Score:2, Interesting)

    by MoHaG ( 1002926 ) on Saturday June 19, 2010 @03:07PM (#32626930) Homepage

    They accidentally recorded parts of publicly broadcast data....

    It is not much different from using a phone to record a conversation in a busy environment and being blamed for accidentally recording parts of other people's conversations as you walked past...

  • Re:The good guys? (Score:5, Interesting)

    by mellon ( 7048 ) on Saturday June 19, 2010 @03:21PM (#32627030) Homepage

    Whether or not they are the good guys, laws that attempt to contravene physics are a bad idea. If the packets had been encrypted, it wouldn't have mattered that Google captured them--without the key, they're just noise. You could pass a law saying that capturing packets broadcast without encryption is illegal, or you could pass a law saying that if you want your packets to be private, you should encrypt them, and if you don't encrypt them, you have no expectation of privacy. Which of these two laws do you honestly think makes the most sense?

    Normally wiretapping involves a deliberate act of bypassing some kind of lock, if only the lock on the box that contains the wires. Here there was no lock, and the packets were hitting the antenna without any special effort on Google's part, and Google did have a legitimate purpose in putting up the antenna and listening for packets. Yes, they got more packets than their legitimate purpose required. Maybe they did so deliberately, although I can't see any reason why that would have been useful to them. But making it illegal is a really expensive way to solve the problem, and it doesn't solve the fundamental problem, which is that people are sending their personal information over the network in the clear.

  • Re:So? (Score:1, Interesting)

    by postbigbang ( 761081 ) on Saturday June 19, 2010 @03:22PM (#32627040)

    No. It was at best willful sloth.

    Any geek with stripes can strip the payloads after identifying association-attempt results and their locus.

    Just gulping the data, which is what they did-- perhaps terabytes of it-- isn't excusable.

    There was once a TV show called F Troop. In the opener, they stripped all of the buttons and rank from two soldiers, an officer and an enlisted man, if memory serves. Google should by now have held a similar ceremony for their software QA director and their lead systems engineer. Just WTF were they thinking? Let's have a merry little war drive with some of that open sauce software stuff? Egads. Accidental my ass.
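The payload-stripping this poster has in mind could be a small filter applied at capture time. A minimal sketch, assuming a hypothetical dict-based frame record (not Google's actual pipeline):

```python
# Sketch of stripping payloads before storage (hypothetical record
# layout, not Google's actual pipeline): keep only the header fields
# that AP mapping needs, and drop the user data on the floor.

def strip_payload(frame: dict) -> dict:
    """Return a record containing only mapping-relevant header fields."""
    return {k: frame[k] for k in ("bssid", "ssid", "rssi") if k in frame}

captured = {"bssid": "aa:bb:cc:dd:ee:01", "ssid": "HomeNet",
            "rssi": -48, "payload": b"POST /login..."}

record = strip_payload(captured)
assert "payload" not in record   # nothing sensitive reaches disk
print(record)
```

Run at capture time, before anything is written to disk, a filter like this would have made the whole controversy impossible, which is the poster's point.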

  • by drew30319 ( 828970 ) on Saturday June 19, 2010 @03:35PM (#32627130) Homepage Journal

    Inadvertent or not, Google broke laws in some countries. Accidentally breaking the law doesn't eliminate responsibility or culpability - even if people shouldn't have left their WiFi unsecured. If I accidentally run over someone with my car because I wasn't paying attention to what I was doing, it doesn't absolve me of the liability - even if that old lady had it coming, er, was jaywalking.

    Not necessarily. If a law in a country is based on strict liability then you are probably correct because strict liability does not require a "guilty state of mind." For example, statutory rape in the U.S. is generally a strict liability crime (e.g. it wouldn't necessarily help Adam if he truly believed that Eve was of legal age if in reality she's a minor because state of mind isn't a factor for strict liability crimes).

    However, strict liability isn't the only level of culpability; in the U.S. the other levels are negligently, recklessly, knowingly, and purposefully. To use your driving example: if somebody were driving negligently (shown by not paying attention) and hit an old lady who is jaywalking, it is a very different matter than if he were driving recklessly (shown by steering with his feet) or purposefully (shown by keeping a tally on his website of how many old ladies he has run over). If the jaywalking old lady is killed, this distinction may mean the difference between manslaughter and murder.

    To apply these culpability levels to the issue at hand it will be necessary to look to the statutes themselves; if the statute defines "illegal data collection" as being an act that is done purposefully, then negligence may not rise to that level. If it is determined that an error in Google's code is the reason behind the data collection and that the presence of the error in the code is due to negligence on the part of Google then it's entirely possible that no law was broken.

  • by jrhawk42 ( 1028964 ) on Saturday June 19, 2010 @03:36PM (#32627132)
    Basically, Google probably could have swept this under the rug, and most companies would have. Google, on the other hand, came out as the only source. There were no accusations or indications that this information would leak, yet Google freely informed the public that this was an accident, and took responsibility. Maybe there was some underlying motive, maybe there's information we don't have, but with all the info that's out right now it seems Google acted as a good Samaritan.
  • Re:So? (Score:1, Interesting)

    by MokuMokuRyoushi ( 1701196 ) on Saturday June 19, 2010 @03:49PM (#32627222) Journal

    it very likely was accidental... It would be real easy... More than likely... Sloppy engineering maybe...

    ...and then "certainly not malicious". It's been fairly obvious that there are no clear facts in this case. Just like the quote from the summary, "Google is almost certainly telling the truth"... Almost this, probably that, maybe those. To say that it is or isn't malicious is to go out on a limb with an Opinion Safety Harness. The only clear fact is that this is a very shady, inadequately explained and planned event. Whether or not the saved packets were to be used maliciously is up in the air.

  • Re:Well duh (Score:4, Interesting)

    by LordLimecat ( 1103839 ) on Saturday June 19, 2010 @04:29PM (#32627442)

    Its not that Google are any better than anyone else

    I would argue otherwise: whether for PR reasons, technical reasons, or others, most of Google's offerings are open in some way or other-- Gmail, for example, seems to be the only major email provider that does not restrict auto-forwarding, client access, contact export, or anything else. Yahoo, MS, and AOL all have some form of lock-in.

    So forgive me if I tend to cut them rather more slack than MS or AOL; the best thing about google is that if they ever become the Super Boogeyman, I can just pick up my data and leave.

  • by slimjim8094 ( 941042 ) on Saturday June 19, 2010 @04:30PM (#32627452)

    People go to greater lengths than Google did to receive TV broadcasts, such as from outside the usual service area. It's a whole hobby - see http://en.wikipedia.org/wiki/TV_and_FM_DX [wikipedia.org]

    This is a case of people who purchased a product to send and receive information to and from all computers in a particular radius, and who are then upset when Google finds itself inside that radius and receives the information it's being sent. That's not exactly 'great lengths'.

  • Re:So? (Score:3, Interesting)

    by postbigbang ( 761081 ) on Saturday June 19, 2010 @05:27PM (#32627836)

    You may find your mistake early, after gigabytes worth of data. Then you fix it before it becomes TB or PB of data. Right?

    We're all allowed mistakes. But a mistake of this size from the uber-geeks of Google isn't a mistake. It's negligence..... not quite of BP's size, but just as shamelessly stupid.

  • by Tom ( 822 ) on Saturday June 19, 2010 @05:59PM (#32628078) Homepage Journal

    IT'S A BROADCAST

    Unlike radio, it is an addressed broadcast. See, every packet has a destination written on it. That makes the argument a little more interesting. It is more like a postcard - yes, you can read it (no encryption), but it has an address. And the law considers postcards to be covered by telecommunications privacy regulations.
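The "addressed" point can be made concrete with a small sketch, using a hypothetical frame record (a real 802.11 header carries several MAC addresses, but the idea is the same):

```python
# Sketch of the postcard argument (hypothetical frame records): every
# frame names a destination. A beacon is addressed to everyone; a data
# frame is addressed to one specific station, even when unencrypted.

BROADCAST = "ff:ff:ff:ff:ff:ff"

def is_addressed_to(frame: dict, my_mac: str) -> bool:
    """True if the frame is meant for us (unicast to us, or broadcast)."""
    dst = frame["dst"].lower()
    return dst == my_mac.lower() or dst == BROADCAST

beacon = {"dst": BROADCAST,           "body": b"..."}         # for anyone
data   = {"dst": "aa:bb:cc:00:11:22", "body": b"GET /mail"}   # for one station

print(is_addressed_to(beacon, "de:ad:be:ef:00:01"))  # True: anyone may read it
print(is_addressed_to(data,   "de:ad:be:ef:00:01"))  # False: addressed elsewhere
```

The radio of course receives both frames either way; the destination address is a label, not a lock, which is exactly why the postcard comparison is apt.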

  • Re:FR0$T P&$$ (Score:5, Interesting)

    by Antidamage ( 1506489 ) on Saturday June 19, 2010 @06:04PM (#32628108) Homepage

    You make an excellent point.

    For my part, I'd like to point out that if Google wanted to read your email, they wouldn't bother collecting wifi data. They'd just read yer fucking email.

  • by Kenoli ( 934612 ) on Saturday June 19, 2010 @06:54PM (#32628430)

    This is /. and I was required to use a car analogy. I could have just as easily said "If I find an iPhone prototype and use the personal information in it to accidentally steal someone's identity, it doesn't absolve me of the liability - even if that old lady had it coming, er, left her iPhone behind in that bar."

    Nonsense. Maybe you should come up with an analogy that doesn't involve anything being damaged, destroyed, killed, or harmed in any way, and with the action being invisible to the supposed victim.

  • Re:So? (Score:3, Interesting)

    by Score Whore ( 32328 ) on Saturday June 19, 2010 @07:07PM (#32628512)

    Regardless of whether it's accidental, or difficult as the OP suggests, the reality is that both of those are merely excuses and rationalizations for externalizing the bad effects of behavior while privatizing the profits. Try translating those excuses to another industry and see how satisfying they are as an answer. Consider medicine: there are undeniable benefits to modern therapies. However, because it's hard to get right, we don't just accept any random treatment. Before companies unleash their new products upon the public, we require that they take the time to ensure, as much as possible, that they are safe and don't have unintended effects. You may suggest that Google isn't a medical company, that its products and services won't kill anyone or cause them to grow a third eyeball, and that it therefore doesn't have the same obligations. OK, then how about banking? Credit reporting? Private investigators? Mining companies?

    Entirely outside any other arguments, I find it hilariously ironic that Google -- the company staffed entirely by PhDs, by the most brilliant minds in the industry, by saints who'll do nothing wrong -- always comes back to "look we have this awesome idea with splendid (but vague and non-specific) benefits beyond making us incredibly wealthy, however there are significant downsides for the rest of you and those downsides are hard to avoid." Which makes me think that maybe they aren't so smart, which means that maybe their idea isn't so great. Isn't the point of being smart that you can do things that are hard? QED.

  • Re:So? (Score:3, Interesting)

    by causality ( 777677 ) on Saturday June 19, 2010 @07:31PM (#32628654)

    The thing most people forget to ask -- but which was asked in this article -- is something you conveniently forgot to mention. Here it is:

    What possible use could google have for this data? What would be their motive here?

    As the article says, there's almost no personal data in the emails. Even if there is, there's so little of it that what useful purpose could it serve? You'd have a hard time correlating it to any one person, or even finding out what it is. There's going to be so little data here, and it'll be so fragmented, that turning it into anything useful would be impossible.

    On the other hand, why would google risk collecting this data when they knew what was going to happen if it got out? The risk vs. reward here just doesn't make sense. They're going to risk their reputation on... what? Collecting a few fragments of unencrypted wifi traffic that probably contains so little information and could very well be generated by a bot running on your machine.

    I'm not going to believe google did this on purpose until someone can give me a motive that doesn't sound like something from a UFO convention.

    What if this were a calculated marketing maneuver designed to test the waters and find out how much people really care about privacy and the possible hard-to-justify violation thereof? This is, after all, a company that would make far less money if everyone had excellent online privacy. How much people are willing to protect that privacy and how much outrage they express at real or perceived violations of it could be very important data to a company like Google.

    This is data that would be difficult for Google to obtain from their usual channels. Just like in politics, it has to become an "issue" and then the reaction can be assessed. A privacy matter that collects little or no directly sensitive information (thus protecting Google from potential liability) that still raises the issue and gets people talking about it would be perfect for this purpose. That's exactly what happened here.

    The more successful a company, the more resources it possesses, the more talent it has hired, the more difficult it becomes to believe that they'd make trivial mistakes that most Slashdotters, acting alone with an infinitesimal fraction of the same resources, would have easily avoided. Good long-term strategy looks a lot like things just happening to work out a certain way as a product of chance. It's possible someone at Google could have made the incredibly trivial mistake that caused this chain of events. What's unlikely is that among all of the managers, designers, and programmers involved in this project, not one person noticed such a mistake.

  • by debatem1 ( 1087307 ) on Saturday June 19, 2010 @07:55PM (#32628762)
    I do not understand this argument. How is your data private if it's sitting out in the open air? That's like saying that just because I was yelling in public, you have no right to hear what I was saying unless I was yelling *at you*.
  • by icebraining ( 1313345 ) on Saturday June 19, 2010 @11:05PM (#32629632) Homepage

    What's passive mean in this case anyways? They were not actively targeting a single entity? Not exactly true. They were explicitly targeting everyone. I would argue that Google was, in fact, actively recording all of the broadcasts.

    No. Passive scanning/sniffing means they were only receiving packets, not sending. An example is trying to get a hidden SSID: you can either passively wait for a computer to connect to the AP to capture the SSID, or you can actively send "disconnect packets" to force clients to reconnect.

    Your analogy is incorrect. Most of the consumers, the unwashed masses, have an extraordinarily small understanding of how technology works and its true effect on their lives. If you asked 1000 people on the street if they thought their Linksys/Netgear/whatever wireless router they bought from Best Buy communicating with their laptops in their house was analogous to them walking around naked outside of their house, I am pretty certain you are going to get 0/1000 answering yes.

    Criminalizing stuff because people are too lazy to learn how to use their own equipment seems dangerous to me.
    In my opinion, that's the AP manufacturers' job. For example, here in Portugal it's now very rare to spot an open personal AP. Why? Because most people buy them from the ISP, which sells them with WPA enabled by default and a randomly generated password printed in the manual (also, the SSIDs seem to have a random component, to "kill" rainbow-table attacks).

    I get the "large power hence higher standards" argument, but even so, in this particular case it shouldn't be illegal. And the Finns seem to agree with me, since they made it legal to use networks that allow anonymous logins.
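The passive-versus-active distinction described above can be modeled as a toy timeline (purely illustrative; nothing is transmitted and no real frames are parsed). A hidden SSID is absent from beacons but appears in association requests when a client (re)connects:

```python
# Toy timeline of hidden-SSID recovery (illustrative only). A passive
# sniffer must wait for a client to reconnect naturally; an active
# sniffer injects a deauth, forcing an immediate reassociation.

def air_traffic(deauth_sent: bool):
    """Yield the frames a sniffer would observe, in order."""
    yield {"kind": "beacon", "ssid": None}  # SSID hidden in beacons
    if deauth_sent:
        # active: our injected deauth makes the client reassociate now
        yield {"kind": "assoc-req", "ssid": "SecretNet"}
    else:
        # passive: sit through many more beacons before a natural reconnect
        for _ in range(100):
            yield {"kind": "beacon", "ssid": None}
        yield {"kind": "assoc-req", "ssid": "SecretNet"}

def frames_until_ssid(deauth_sent: bool) -> int:
    """Count frames observed before the SSID is revealed."""
    for n, frame in enumerate(air_traffic(deauth_sent), start=1):
        if frame["ssid"] is not None:
            return n

print(frames_until_ssid(True), frames_until_ssid(False))  # prints: 2 102
```

The point of the distinction for this story: Google's cars were on the passive side of this model, only ever listening, which is why "actively targeting" is the wrong description even though everyone in range was overheard.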

  • by c6gunner ( 950153 ) on Saturday June 19, 2010 @11:48PM (#32629794) Homepage

    It may well be that one day I paid with my c/c and you noted first two digits. Indeed nothing you can do with them. Next day I again paid with my c/c and you noted next two digits. Now it makes four. Next day ... [repeat until the logical end.] This is how you can get my entire c/c record. Any single observation is useless; but when combined they are very much useful.

    Yep, which would require a concerted effort to gather the required data, not just a single drive-by capture of a small portion of your CC number. If I came back enough times, then yes, I could get the info, but why would I bother? If I were interested in your CC, I'd just copy down the whole damn thing the first time.

    Anyway, if google wanted access to the data you were sending back-and-forth between your computer and router, it'd be pretty pointless for them to go grab a few dozen packets every couple weeks since the data is unlikely to be related. It would be like me coming over to your house every few weeks, writing down 2 numbers from a random document that you have lying around, and hoping to eventually construct a CC number from the jumble I've gathered. The CC analogy is a fun one, but doesn't really reflect the situation.

    The society instead decided to prohibit all intercepts since they have hardly any social advantages to begin with.

    If that were true, I could go to jail every time Windows picks up a new access point.

    Besides, there is an easy way to have an unlisted phone number.

    There is an easy way to encrypt your packets.

  • by tftp ( 111690 ) on Sunday June 20, 2010 @12:00AM (#32629852) Homepage

    Is something stored if it is never accessed?

    Imagine that you had some inconvenient photos, and if those photos are "accessed" your political career will end. Someone stole the photos, but they called to assure you that those photos will never be accessed. Would that be as good as if you had personally destroyed all the media those photos were on?

    If just the potential to access it is enough then we're all guilty because we all have the "potential" to access the open Wifi networks in the first place.

    I can't imagine a sane situation where a potential to commit a crime is the same as the crime itself. However one guy was recently arrested [politicalforum.com] (illegally) and his lawful property "held" for a crime that other people thought he might be considering committing in the future.

  • Re:Well duh (Score:3, Interesting)

    by khchung ( 462899 ) on Sunday June 20, 2010 @12:12AM (#32629896) Journal

    Just see it this way - it's sometimes easier to log every information available when collecting the data and then filter out the interesting parts later. Especially when it's in the prototype state. And suddenly a prototype goes into production just because it works good enough.

    Yeah, right. Why not use this to justify the Sony rootkit too: "It's easier to just root the PC when preventing unauthorized action being done to the CD. And suddenly a prototype goes into production just because it works good enough."

    Do you buy that?

    No, the truth is people are defending Google not because it makes sense, but because they want to believe Google is the good guy. This is no different from Creationists clinging to their beliefs in the face of opposing evidence; it's only a matter of degree.

  • Re:Well duh (Score:3, Interesting)

    by Pharmboy ( 216950 ) on Sunday June 20, 2010 @01:26PM (#32633168) Journal

    the truth is people are defending Google not because it makes sense, but because they want to believe Google is the good guy.

    Truer words were never spoken. We need good guys, and will invent them if necessary. All of our historic "legends" were likely nothing like the myths that surrounded them, and some were outright asshats. In popular culture (Star Trek specifically), I love how Zephram Cochrane [wikipedia.org] was actually just trying to get rich when he came up with the warp drive; there was no "higher calling" to it. Even art gets it.

    There are no good guys when it comes to capitalism. Don't get me wrong, it's the only system for me, but what you have are "bad guys", "evil guys", and "guys that usually play by the rules", and that is about as good as it gets. It is in our nature. The real "good guys" never truly succeed, partially because success isn't worth the price they would have to pay: A willingness to be ruthless when it is required.

    In short, Google is simply the lesser of all the available evils. Perhaps their motto should be "Do less evil".

