
Can Translucency Save Privacy In the Cloud? 86

Posted by timothy
from the big-ol'-cumulus-clouds-maybe dept.
MikeatWired writes "Jon Udell writes that when it was recently discovered that some iPhone apps were uploading users' contacts to the cloud, one proposed remedy was to modify iOS to require explicit user approval. But in one typical scenario that's not a choice a user should have to make. A social service that uses contacts to find which of a new user's friends are already members doesn't need cleartext email addresses. If I upload hashes of my contacts, and you upload hashes of yours, the service can match hashes without knowing the email addresses from which they're derived. In the post Hashing for privacy in social apps, Matt Gemmell shows how it can be done." (Read more, below.)
"Why wasn't it? Not for nefarious reasons, Gemmell says, but rather because developers simply weren't aware of the option to use hashes as a proxy for email addresses. A translucent solution encrypts the sensitive data so that it is hidden even from the operator of the service, while enabling the two parties (parents, babysitters) to rendezvous. How many applications can benefit from translucency? We won't know until we start looking. The translucent approach doesn't lie along the path of least resistance, though. It takes creative thinking and hard work to craft applications that don't unnecessarily require users to disclose, or services to store, personal data. But if you can solve a problem in a translucent way, you should. We can all live without more of those headlines and apologies."
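The matching scheme the summary describes can be sketched in a few lines. A hedged example, with made-up addresses, and SHA-256 standing in for whatever digest a real service would choose:

```python
# Sketch of hash-based contact matching: both sides upload digests of
# normalized email addresses, and the service intersects the digests
# without ever seeing a cleartext address. All names here are illustrative.
import hashlib

def hash_contact(email: str) -> str:
    """Return a hex digest of a normalized (trimmed, lowercased) address."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

my_contacts = ["alice@example.com", "Bob@Example.com"]
your_contacts = ["bob@example.com", "carol@example.com"]

mine = {hash_contact(e) for e in my_contacts}
yours = {hash_contact(e) for e in your_contacts}

# The service sees only digests, yet can still report the overlap.
shared = mine & yours
print(len(shared))  # 1 -- both spellings of Bob's address normalize to one digest
```

Note that normalization matters: without it, trivially different spellings of the same address would hash to different values and the match would be missed.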
  • Hash (Score:4, Funny)

    by busyqth (2566075) on Saturday March 24, 2012 @03:13AM (#39459411)
    All my contacts upload their hash regularly.
    Well... mostly on the weekends.
  • by Jens Egon (947467) on Saturday March 24, 2012 @03:19AM (#39459431)

    Hashing is more difficult than not hashing.

    Customers are not going to stay away just because your security is atrocious.

    So only legislation (or serious liability) is left to get this off the ground.

    • by MoonFog (586818) on Saturday March 24, 2012 @03:30AM (#39459459)
      Actually, I find that people are starting to care a lot more these days. All the scaremongering around Facebook has meant that people take notice and think about what they do online. A bad security record gets more attention in the media as well, so to me it's not so clear cut anymore: people do care, and you can't get away with everything.
      • by Anonymous Coward

        All the scaremongering around Facebook has meant that people take notice and think about what they do online.

        I think I'm the only one in my family who doesn't have a FB account. When I go down the laundry list of reasons why I don't have one, they shrug their shoulders. When someone posts something (like this twit who posted her friends' birth dates and names), there was an initial outcry from a couple of people, but then it died down. The twit said that her page was private and later deleted the information. Everyone concerned thought it was fixed.

        Media attention?

        I don't see that much reaction against the privacy violations.

    • by SuperKendall (25149) on Saturday March 24, 2012 @03:35AM (#39459477)

      So only legislation (or serious liability) is left to get this off the ground.

      You would really rely on legislatures to get the wording of such a law correct and not impede what we can do with mobile devices?

      Apple is already changing the system to require user permission when accessing contacts. One of the main apps at fault, Path, has already switched voluntarily to using hashes.

      So why go to the trouble of crafting regulation to solve a problem that's already taking care of itself? All you can do is make things more annoying for people.

      • by Jens Egon (947467)

        So only legislation (or serious liability) is left to get this off the ground.

        You would really rely on legislatures to get the wording of such a law correct and not impede what we can do with mobile devices?

        Who said anything about not impeding? Security does impede what you can do if you care about it at all.

        As for the politicos: no, I don't have high hopes. They'll understand and care about these issues shortly after the electorate starts doing so, at best.

        Apple is already changing the system to require user permission when accessing contacts. One of the main apps at fault, Path, has already switched voluntarily to using hashes.

        So why go to the trouble of crafting regulation to solve a problem that's already taking care of itself? All you can do is make things more annoying for people.

        I would argue that Apple is acting more like a legislature here. It's what people are paying them for, after all.

      • by martin-boundary (547041) on Saturday March 24, 2012 @04:32AM (#39459583)

        So why go to the trouble of crafting regulation to solve a problem that's already taking care of itself? All you can do is make things more annoying for people.

        The problem isn't taking care of itself. We are seeing Apple, Google and Facebook doing rearguard actions because they are afraid of regulation and lawsuits. Remove that threat, and they'll stop worrying about privacy. It's much better to have a standardized set of laws that spell out the rights of customers than a mish mash of piecemeal solutions that companies have to invent themselves.

        Moreover, the Europeans are doing it already, so why not copy^H^H^H harmonize with their laws in America? That'll save American companies a lot of work when they realize that their system must be redesigned anyway if they want European customers.

        • The problem isn't taking care of itself. We are seeing Apple, Google and Facebook doing rearguard actions because they are afraid of regulation and lawsuits.

          Lawsuits perhaps, but they are more afraid of CUSTOMERS. They want to serve CUSTOMERS better (and also avoid lawsuits).

          Moreover, the Europeans are doing it already, so why not copy^H^H^H harmonize with their laws in America?

          Well that's how we get SOPA. Great plan. Not.

          Just because Europeans are willing to submit to tyranny, why should the U.S.? Why

          • by Jens Egon (947467)

            Just because Europeans are willing to submit to tyranny, why should the U.S.? Why should anyone?

            Let's also bring over the vast array of cameras from the U.K. while we are at it!

            We're not the ones submitting to tyranny. Regulation of industry is part of the price we pay for not submitting.

            I'll grant you that the Brits seem a bit lax when it comes to protecting themselves from government, though.

            Then again, neither do they seem likely to elect quite the same quality of nutters [spreadingsantorum.com] to lead them?

          • Well that's how we get SOPA. Great plan. Not.

            How exactly did you get from Europe to SOPA? You do know who's sponsoring all that crap? It's the US that has been pressuring Europe to pass those laws, not the other way around.

            Let's also bring over the vast array of cameras from the U.K. while we are at it!

            Meh, the UK are barely European anyway ;)

            • Well, Europe and SOPA can easily be related. We got a SOPA prototype in Spain, the "Sinde Law". It was approved and took effect during a change of government, in a very sneaky fashion, after meeting initial resistance the first time they tried to pass it.
              Some other reply says that Europeans submit to tyranny... nope, it's imposed on us; that's why it's tyranny!
              It might be the US pressuring the EU to do this, but I feel like I can rightly blame EU politicians because their will was weak and they submitted way

              • Sure, I'm not deflecting the blame from our politicians; no amount of pressure excuses them from enacting terrible laws like those.

                I'm just saying that SOPA being pushed through the US Congress had nothing to do with Europe, and all to do with lobbying from their own industries.

          • by psmears (629712)

            The problem isn't taking care of itself. We are seeing Apple, Google and Facebook doing rearguard actions because they are afraid of regulation and lawsuits.

            Lawsuits perhaps, but they are more afraid of CUSTOMERS. They want to serve CUSTOMERS better (and also avoid lawsuits).

            There's one small problem there. Who are Google's customers? Who are Facebook's customers? I'll give you a clue: it's not their users, who (by and large) don't pay them any money at all. Their customers are their advertisers, and serving their customers better is usually in direct conflict with preserving the privacy of users.

        • The problem isn't taking care of itself. We are seeing Apple, Google and Facebook doing rearguard actions because they are afraid of regulation and lawsuits. Remove that threat, and they'll stop worrying about privacy.

          Privacy aside, companies are a lot less afraid of regulation than they say, simply because regulations actually help them in several ways:

          1 - They make it difficult for new companies to enter markets because of all the things that must be done, and the associated costs of complying with regulations. This is especially true for internet companies: the more it costs to comply with regulations, the harder it is for a startup to threaten established firms.

          2 - Regulations can often provide a shield from lawsuits

        • by msobkow (48369)

          why not harmonize with their [European] laws in America?

          Because they're socialists and communists over there! They can't have any good ideas that don't need to be manipulated, edited, changed, and regurgitated in an "American" version before they can be any good.

          What's next? Looking to those damned Canadian socialists for guidance on federal cannabis policy that could lead to a federal medical cannabis regulation? :P

      • Is it taking care of itself? Suppose I install a Facebook app (fat chance, but just suppose). My iPhone asks me if I allow the app to access my contact list. Sure, I say... to check if any of my contacts are also my FB friends, or for any other reason that makes my life easier. But not to collect a list of my contacts on the FB server even if they are not my friends, or to sell this data to 3rd parties. There are many apps that have both legitimate and nefarious reasons for accessing your personal data,
    • Privacy /= security.

      The effort to hash and salt contact information to provide a basic level of privacy over cleartext is computationally and programmatically inconsequential. We're locking the screen door here, not securing Fort Knox.
    • by pjt33 (739471)

      More to the point, it's more difficult than not hashing for trivial gain. There's no way the protocol described in the summary could work with salt, so anyone who had any motivation could spend 10 minutes writing a script to build rainbow tables for various combinations of names @gmail, hotmail, etc. and reverse the hash.
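The attack pjt33 describes is essentially a dictionary lookup. A rough sketch, with made-up names and domains (a real attacker would use far larger lists):

```python
# Sketch of reversing unsalted email-address hashes by precomputation:
# enumerate plausible addresses (name lists x common domains), hash each,
# and invert any leaked digest by table lookup. Names are illustrative.
import hashlib

names = ["john.smith", "jane.doe", "bob.jones"]
domains = ["gmail.com", "hotmail.com", "yahoo.com"]

# Build the lookup table: digest -> plaintext address.
table = {}
for name in names:
    for domain in domains:
        addr = f"{name}@{domain}"
        table[hashlib.sha256(addr.encode()).hexdigest()] = addr

# An "anonymized" hash leaked from a service:
leaked = hashlib.sha256(b"jane.doe@gmail.com").hexdigest()
print(table.get(leaked))  # jane.doe@gmail.com -- reversed instantly
```

Because the input space of real email addresses is small and structured, unsalted hashing is obfuscation rather than protection.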

  • by gnapster (1401889) on Saturday March 24, 2012 @03:23AM (#39459441)
    Gonna start generating the contact-data rainbow tables right now!
    • by qxcv (2422318)

      Because it's so much more efficient to break into an organisation's database of contact hashes, hope like hell the hashes are unsalted, and then run each of them through your rainbow tables just to get a single email address than it is to write a web crawler in 10 lines of Python that finds emails with a regex. Your approach is great if you're trying to phish gullible Sony customers, but not so great for anything else.

  • I can see how:
    Uncle Bob 01234 123456
    might be smart matched with:
    Robert Smith +1 1234 123456

    I'll be interested to see the hashing algorithm that will allow the hashes to be matched.
    • by Jens Egon (947467)

      A setup where uncle bob will only be found if he wants to is not necessarily a step in the wrong direction.

      A step backwards to be sure, but that's not always the wrong direction.

    • by Dark$ide (732508)

      I can see how:
      Uncle Bob 01234 123456

      might be smart matched with:
      Robert Smith +1 1234 123456

      Even without a hash you're going to struggle to match those two. There's too much difference, even with stuff like soundex(), unless you get the geo info so you can turn the "01234 123456" phone number into an international one.

      If you get bob.smith@example.com from both as an email address, that matches easily. And when you've captured that data item it's only a small step to find it in Google. You can use that matching data item as a tag for all the other stuff you've stolen from the user who has just done the

    • specification /ˌspesəfəˈkāSHən/
      Noun:
      An act of describing or identifying something precisely or of stating a precise requirement.
      A detailed description of the design and materials used to make something.

  • apps wouldn't be free - 99c

  • How do you make money from free cloud apps, if not by selling the private information you extract from your customers' files? I thought cloud efficiency (good service at low cost) came by design from tapping into privacy.

  • Almost yearly, I (as do most Americans) get a small little statement/disclaimer entitled "Notice of Disclosures" or something to that effect from various banking, insurance, and other types of institutions I regularly do business with. I believe the only reason they send this is by legal requirement, and it tells me all of the different bits of information they have on me and what they do with it/how they resell it, or, excuse me, "share it with valued business partners." Some things I can opt out of, whic
  • still no privacy (Score:3, Informative)

    by allo (1728082) on Saturday March 24, 2012 @05:16AM (#39459663)

    some things to consider:
    - when you hash a telephone number, a rainbow table is easily generated
    - even when you have IDs that are real pseudonyms, with no way to crack them, you can still correlate: "ah, user X knows Y, who is known by Z, too".

    So uploading contact data exposes private things even when the nodes are ano(pseudo)nymous and only the edges of the social graph are known.

    • by jcreus (2547928)

      - when you hash a telephone number, a rainbowtable is easily generated

      That's where salts are useful. When you MD5 or SHA1, add a random-but-constant string at the beginning of the to-be-hashed string. Rainbow tables will be far more difficult, if not impossible. Instead of MD5'ing "slashdot", MD5 "f8ds9a03421314159_$!1337_jc0wikislashdot".
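A minimal sketch of the constant-salt idea above (the salt value is made up; a real service would keep it secret, and SHA-256 stands in for the MD5/SHA1 of the comment):

```python
# Sketch of a service-wide constant salt: precomputed rainbow tables for
# the bare hash function no longer match the service's digests. The salt
# string here is invented for illustration.
import hashlib

SERVICE_SALT = "f8ds9a03421314159_$!1337"  # made-up constant, kept secret

def salted_hash(value: str) -> str:
    """Digest of salt + value; differs from the unsalted digest of value."""
    return hashlib.sha256((SERVICE_SALT + value).encode()).hexdigest()

# A table built from plain SHA-256 digests fails against the salted ones:
print(salted_hash("slashdot") != hashlib.sha256(b"slashdot").hexdigest())  # True
```

A single service-wide salt only forces an attacker who learns it to rebuild the table once; it does, however, preserve the property that equal inputs produce equal digests, which the matching scheme needs.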

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        ... Which defeats the purpose that is being able to find someone by said hash.

    • by sdk4777 (1013597)
      What is this, everybody talking about telephone numbers, hash and rainbows?
    • when you hash a telephone number, a rainbowtable is easily generated

      One word: salt.

      • by allo (1728082)

        Salt cannot be used here.

        Imagine you have a salted hash with a random salt for each hash (if the salt stays the same, the rainbow table is easily generated). Now your phone hashes some telephone number with a random salt. Try to find it in the DB of numbers hashed with other random salts: no chance, because a different salt was used.
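allo's point can be shown in a couple of lines: with a fresh random salt per record, two hashes of the same number never match, so hash-based contact matching stops working (the phone number is illustrative):

```python
# Sketch of per-record random salting: strong against precomputed tables,
# but it destroys the equality property that contact matching relies on.
import hashlib
import os

def random_salted_hash(value: str) -> tuple[str, str]:
    """Return (salt, digest) with a fresh 16-byte random salt."""
    salt = os.urandom(16).hex()
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return salt, digest

_, h1 = random_salted_hash("+1 555 0100")
_, h2 = random_salted_hash("+1 555 0100")
print(h1 == h2)  # False -- the same number yields different digests
```

This is exactly the tension in the thread: random salts are good password-storage practice but incompatible with matching contacts across users.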

  • The issue is not how companies that want to preserve privacy can do so. They will find a way and the described solution is rather obvious. The question is how to stop companies that do not care about your privacy at all and _want_ to upload all your data to their own servers.

  • Once a provider has a large enough db, they can look for firstname.lastname@gmail.com or, knowing from the contacts distribution the region and language of the users, something like @free.fr or @yahoo.co.jp

  • The alpha-channel in JPEG sucks.

  • Maybe they knew, maybe they didn't know about this translucency method, but in the end one thing I am fairly certain of is that "they" want all your information; it's how "they" get valid emails, make money and build profiles.
    If your information is obfuscated in any way, the reliability of what they want to do is diminished and therefore not worth as much.

  • Of course they will want to spam people who haven't registered with their "social" service yet, so they need to harvest plaintext e-mail addresses / names and put the blame on you when they send them a spammy invitation. Remember, this is the "you are the product" market, practical solutions are whatever brings in more users/cash, not things that protect privacy as much as possible ...
  • by Anonymous Coward

    If Iphone users cared about their privacy, they wouldn't be Iphone users.

    And don't tell me companies don't use hashing because they don't know how to implement it. This is deliberate. They want your contacts. Data = money, personal data of real people plus who-knows-whom = more money.

    • If Iphone users cared about their privacy, they wouldn't be Iphone users.

      this is one of the most insightful comments I've seen on /. for a long time.

      why did you post this as ac? i would have modded you up for sure.

  • It is time to stop worrying about the 10-20 companies who make their money from violating privacy and selling data to advertisers. Just because Google and Facebook have become popular with this business model during the past decade doesn't mean that we should give up century old principles and that we have to protect this business model in all eternity.

    The vast majority of all companies can or could do respectable business without violating privacy. Notwithstanding the software patent nightmare, it is possi

    • by jgrahn (181062)

      It is time to stop worrying about the 10-20 companies who make their money from violating privacy and selling data to advertisers. Just because Google and Facebook have become popular with this business model during the past decade doesn't mean that we should give up century old principles and that we have to protect this business model in all eternity. [---] It's time to give the power back to real companies, who actually offer real products and who are interested in sustainable business based on making their customers happy.

      If we're going to redistribute power anyway, why not take it back? People instead of corporations, open protocols instead of apps, decentralized instead of centralized solutions?

  • passwords too (Score:4, Interesting)

    by mcelrath (8027) on Saturday March 24, 2012 @09:21AM (#39460425) Homepage
    Why are we not doing this for passwords too? Every site on the internet shouldn't need to store a plaintext password. Does there exist an algorithm by which a site owner could send the salt, the user hashes with his password, and the site owner can tell the password is the same, without actually having the password?
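What mcelrath asks for is roughly a salted challenge-response. A hedged sketch of the basic shape (real protocols such as SRP do considerably more; in this simplified version the stored verifier is still password-equivalent if stolen):

```python
# Sketch of a salted challenge-response login: the server stores only a
# salted digest ("verifier"), never the password, and checks an HMAC over
# a one-time nonce. The password "hunter2" is illustrative.
import hashlib
import hmac
import os

# Registration: server keeps (salt, verifier) only.
salt = os.urandom(16)
verifier = hashlib.sha256(salt + b"hunter2").digest()

# Login: server issues a nonce; the client recomputes the verifier from
# the salt plus its password and HMACs the nonce with it.
nonce = os.urandom(16)
client_key = hashlib.sha256(salt + b"hunter2").digest()
client_proof = hmac.new(client_key, nonce, hashlib.sha256).digest()

# Server computes the same HMAC from its stored verifier and compares.
server_check = hmac.new(verifier, nonce, hashlib.sha256).digest()
print(hmac.compare_digest(client_proof, server_check))  # True
```

The nonce keeps each exchange fresh, so an eavesdropper can't replay a captured proof; upgrading to something like SRP also stops a stolen verifier from being used directly.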
    • "We" are not doing this for Internet passwords because you are not an Internet programmer. If you were an Internet programmer dealing with user logins and such, you would know that unencrypted plain passwords have been gone for a decade or more. All login passwords are hashed and salted by default. If not, it is because some inexperienced programmer decided to "write their own" without a proper understanding of basic password protection. I have been on lots of servers and I can't recall the last time I
      • WTF do I care if someone cracks my slashdot password? Are they going to steal my karma?

        I care about my bank account password, and, trust me, that is not used anywhere else.
  • All the problems people had with mainframes you are starting to see with the new mainframes, "the cloud". No one cares about your data as much as you do, no matter what they tell you or promise.
  • A bad actor could rather easily convert the hashes back to email addresses. All he needs is a good source of email addresses (readily available from the dirtbags who supply spammers), which he can then hash and index. Takes some computer resources, that's all.

    A good actor need merely not misuse the email addresses in the first place.

    • ideally, every service/company would have its own set of salts to immediately hash all incoming email addresses, which it would of course protect with its life. You'd have to steal the salts first, and the rainbow tables you make would only work for that one service, until it decides to reset its caches and generate new salts when it finds out someone stole them, which a responsible company should be doing.
  • The root of all these problems is that any idiot with a text editor can call themselves a "web developer" these days. The barrier to entry is extremely low, and the result is a very large group of people who have no forethought about what they're actually doing. They take the most naïve path from start to finish and end up creating all these security and privacy holes real programmers have long since learned to avoid.

    Case in point: people still store passwords and credit card info in plaintext, typically behind sloppy PHP or Ruby scripts that are vulnerable to SQL injection. Feed that stolen data into a simple script that tests the passwords against a handful of popular services like GMail, Facebook, Hotmail, Paypal etc. Within minutes, you have a few dozen accounts ready to be abused all over the web without the user's knowledge - all because of one idiot who didn't know how to protect his users' info.

    All this talk of securing the cloud is futile. It's like putting a dozen deadbolts on your front door, then leaving a spare set of keys under your neighbour's welcome mat.

    • Re: (Score:3, Interesting)

      by darkfeline (1890882)
      Actually, I think leaving the spare keys under your neighbor's welcome mat is a very good and unorthodox backup method. I'm pretty sure someone trying to break in will check your welcome mat and top of door frame, not your neighbor's. Maybe we can extend this analogy to web security? Have sites store their users password hashed on partner sites, and vice versa. Even better, have sites store the hashing salts on another partner site's servers. Quick, you and me patent this before big name companies star
  • This paper: Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization by Paul Ohm from the University of Colorado Law School is the best summary why it won't work even if people do it: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006 [ssrn.com] TL/DR: There are just far too many ways to infer the real meanings from a network of hashes.
  • This is a temporary solution that simply will not work for any cloud service of significant size. I do program with hashing and other encryption mechanisms, and I know exactly what hashing does.

    Hashing is deterministic. Given any single input (like an email address), the hash output will remain the same. So any hash value can quickly be verified to match a known email address by simply running the hashing algorithm on a known email address and matching it to the existing hash. If you get a ma
