A Contrarian Stance On Facebook and Privacy
macslocum writes "Amid the uproar over Facebook's privacy maneuvers, Tim O'Reilly offers a contrarian view. He writes: 'The essence of my argument is that there's enormous advantage for users in giving up some privacy online and that we need to be exploring the boundary conditions — asking ourselves when is it good for users, and when is it bad, to reveal their personal information. I'd rather have entrepreneurs making high-profile mistakes about those boundaries, and then correcting them, than silently avoiding controversy while quietly taking advantage of public ignorance of the subject, or avoiding a potentially contentious area of innovation because they are afraid of backlash. It's easy to say that this should always be the user's choice, but entrepreneurs from Steve Jobs to Mark Zuckerberg are in the business of discovering things that users don't already know that they will want, and sometimes we only find the right balance by pushing too far, and then recovering.'"
Facebook has confirmed it is working on more changes to its privacy policy in response to feedback from users.
Tim wants us to tell him why he's wrong (Score:5, Interesting)
So here it is:
1. Users do not know the boundary conditions until someone's privacy has been abused - and even then only if they're paying attention and understand the issue.
2. At that point in time, most users will have already shared too much - and once their privacy has been breached/sold, there's no undo button.
3. Users have to spend time demanding their privacy rights which may or may not be given.
4. We don't need to research where the boundary conditions are because once you know who's likely to access what information, it's not that complicated.
The only question here is whether Facebook et al. have a duty of care to their users. Morally they do, legally they generally don't, and financially they're best off selling as much as they can get away with.
Witness the clash, and hopefully the prelude to the exodus. If Google had their act together, they could clean up. Perhaps it's a good thing they don't.
Re:Not So Much With The Internet (Score:3, Interesting)
It's become the fashion to lump everything together, as if performances, images, tangible goods, rights, efforts, and ideas are all exactly the same kinds of assets and should be treated exactly the same by corporations, governments, and individuals. That's happened because business students are taught to convert everything to dollars, assign a value to risk, and then simply slide the numbers around on an Excel spreadsheet until the biggest one pops out at the bottom.
The problem is that the dollar value they assign to risk is based on the imaginations of some not-very-creative people, and is only the risk to them, not to the end users. "Well, if we screw with the privacy policy, our risk is that we'll lose less than 0.5% of our users. That's equal to ad revenue of $3,000,000. The ROI on increased ads is projected to be $10,000,000. This change will pay itself back in months, so just do it."
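The spreadsheet logic the parent comment is parodying can be sketched in a few lines. All figures are the hypothetical ones quoted in the comment, not real data:

```python
# Hypothetical risk/ROI calculation parodied above; every figure is a
# made-up number from the comment, not real Facebook data.

users_lost_fraction = 0.005              # "less than 0.5% of our users"
revenue_lost_with_users = 3_000_000      # ad revenue tied to those users ($)
projected_extra_ad_revenue = 10_000_000  # "ROI on increased ads" ($)

# Slide the numbers around until the biggest one pops out at the bottom.
net_gain = projected_extra_ad_revenue - revenue_lost_with_users
print(f"Net gain: ${net_gain:,}")

# Note what the model omits: the cost borne by the users whose privacy
# is violated appears nowhere in the calculation.
```

The point of the sketch is the missing term: only the risk *to the company* is priced in, which is exactly the complaint being made.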
What really has to happen is truckloads and truckloads of lawsuits have to be filed against them, by people whose privacy was violated. They have to learn that if they mess with our privacy, it will cost them billions of dollars in settlements and legal fees -- to the brink of bankruptcy, and even over the edge. That is the only time corporations will start respecting our rights -- when violating them is guaranteed to flush their bottom lines into the toilet.
Re:Not So Much With The Internet (Score:4, Interesting)
I agree except for one detail: just because the business nerds assign monetary values to everything, that does not mean the legal system has to. Throwing a few company directors in jail on criminal charges when their companies flagrantly infringe the privacy of others would probably be a better deterrent than some fine that is, again, just numbers on a spreadsheet that they pass to their legal and accounting people to deal with.
Re:Missing the point (Score:3, Interesting)
Further, there's no need to share any information at all for GPS directions. We know where we are; what we're asking is how to get to the place we're going. Nobody else needs to know where we are. Sharing data about where you've been is not necessary for the product to function.
The point Tim seems to be missing is not "can sharing info be good?" It's that sharing my personal info should be solely at my discretion, not his. If I miss out on some amazing feature, that's a choice I made. If other, more adventurous folks volunteer and benefit, then good for them, and maybe I'll follow suit.
20th century anonymity an anomaly (Score:4, Interesting)
Similarly, anonymity was unique to the 20th century. In the 19th century, due to transportation constraints, everyone knew who you were and what you did. Welcome to Facebook and the 21st century.
My expanded list is as follows (and apologies -- I don't recall which of mine are original, but I believe the original Slashdot comment listed only three examples):
Re:Anonymity comes from Aliases (Score:3, Interesting)
All fine and good for a dot com US person (Score:3, Interesting)
In the real web 2.0 world you face spooks, army intel, gov workers, politicians, state and federal informants and corporate types.
What do they have in common in many parts of the world?
Your online blog can make your life difficult, or even end in a shallow grave, after simple web 2.0 online comments.
Much of the "web 2.0" is crawling with gov types trying to join different activist groups long-term, or to fill their monthly arrest quotas.
Entrepreneurs will always sell your data for profit, pride, faith, patriotism or access.
So when US entrepreneurs make high-profile data-handling mistakes, it can have interesting flow-on results.
Re:Missing the point (Score:3, Interesting)
And when we decide who we want to share data with, we don't want the company just deciding, since it's Tuesday, that they can change their policy and go ahead and share^H^H^H sell our info anyway.
Perhaps a simple rule could be that users/customers would have to agree explicitly with any changes that would violate previous policy a user said "yes" to. And make it a criminal offence (as in: go to jail) if you ignore that rule - especially for large numbers of users.
For example, if a user previously agreed to a privacy policy that says "company will not share personal facts X/Y/Z with 3rd parties", then any policy change that would share personal facts X/Y/Z with 3rd parties (read: less restrictive in terms of sharing) should require additional, explicit approval from that user. No user approval for the changed policy -> no use of the less restrictive policy (at least for that user). Use of the less restrictive policy without explicit user approval -> criminal offence, with penalties etc. applied to the company's CEOs, not the techies implementing those changes. Same thing for new features that share data beyond what the user previously agreed to.
Why? Many sites have this "check back regularly on our privacy policy page" disclaimer, which is BULLSHIT. You have private data kept by many, many companies, and it is just unreasonable to expect people to re-visit privacy policies (or privacy-related user settings) on all those companies, let alone on a regular basis - and detect policy changes. If you change policies, ask users if they're okay with that. While waiting for a "yes", assume they're not. Ignore that -> face severe penalties.
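The rule proposed above can be sketched as a small policy check. This is a hypothetical model, assuming a policy can be represented simply as the set of personal facts it allows the company to share:

```python
# Sketch of the "explicit re-approval" rule from the comment above.
# Assumption: a policy is modeled as the set of facts it allows a
# company to share with 3rd parties; all names here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    shared_facts: frozenset  # facts the company may share with 3rd parties

def is_less_restrictive(new: Policy, old: Policy) -> bool:
    """True if the new policy shares anything the old one did not allow."""
    return not new.shared_facts <= old.shared_facts

def effective_policy(agreed: Policy, new: Policy, user_approved_new: bool) -> Policy:
    """While waiting for a 'yes', assume the user said no."""
    if is_less_restrictive(new, agreed) and not user_approved_new:
        return agreed  # fall back to what the user actually agreed to
    return new

old = Policy(frozenset())                 # "will not share X/Y/Z"
new = Policy(frozenset({"X", "Y", "Z"}))  # change starts sharing X/Y/Z

# No explicit approval: the old policy stays in force for this user.
print(effective_policy(old, new, user_approved_new=False) is old)
```

The design choice mirrors the comment's default: a more permissive policy never takes effect for a user until that user explicitly opts in, while changes that only tighten sharing apply immediately.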
Re:Down with Patriot Act, long live O'Reilly Act (Score:3, Interesting)