Privacy The Internet

Cookieless Web Tracking Using HTTP's ETag 212

Posted by timothy
from the ah-there-you-are-again dept.
An anonymous reader writes "There is a growing interest in who tracks us, and many folks are restricting the use of web cookies and Flash to cut down on how advertisers (and others) can track them. Those things are fine as far as they go, but some sites are using the ETag header as an identifier: Attentive readers might have noticed already how you can use this to track people: the browser sends back to the server the information it previously received (the ETag). That sounds an awful lot like cookies, doesn't it? The server can simply give each browser a unique ETag, and when they connect again it can look it up in its database. Neither JavaScript nor any other plugin has to be enabled for this to work, and changing your IP is useless as well. The only usable workaround seems to be clearing one's cache, or using private browsing with HTTPS on sites where you don't want to be tracked. The Firefox add-on SecretAgent also does ETag overwriting."
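For readers who want to see how little machinery this takes, here is a minimal sketch of the server-side logic the summary describes (the function, the dict-as-database, and all names are hypothetical, not any real tracker's code):

```python
import secrets

visitors = {}  # ETag -> visit count; toy stand-in for the tracker's database

def handle_request(headers):
    """Handle a GET for the tracking resource; returns (status, etag)."""
    etag = headers.get("If-None-Match")
    if etag in visitors:
        visitors[etag] += 1       # recognized a returning browser
        return 304, etag          # 304 Not Modified: browser keeps its cache
    etag = '"%s"' % secrets.token_hex(16)  # new browser: mint an identifier
    visitors[etag] = 1
    return 200, etag              # sent as an ETag header; browser caches it

# First visit: no conditional header, so a fresh ETag is issued.
status1, tag = handle_request({})
# Revisit: the browser automatically echoes the cached ETag in If-None-Match.
status2, tag2 = handle_request({"If-None-Match": tag})
```

The point of the sketch is that the browser does all the work: conditional requests echo the ETag back without any script running on the page.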
  • Secret Agent (Score:5, Interesting)

    by Jeremiah Cornelius (137) on Sunday August 25, 2013 @11:32AM (#44669473) Homepage Journal

    Here we come. :-)

    Add this feature to a chaff-creating plugin, to crapflood servers with fake tags.

    • by AliasMarlowe (1042386) on Sunday August 25, 2013 @12:08PM (#44669691) Journal
      On all of our PCs, Opera and Firefox are set to clear their caches and delete all cookies etc. every time they exit.
      Also, I occasionally clear all private data while browsing in Opera, including the cache, cookies, history, and so forth (passwords are never saved by the browser). Obviously, I have to log in again the next time I visit slashdot.
    • Re:Secret Agent (Score:4, Interesting)

      by kasperd (592156) on Sunday August 25, 2013 @12:09PM (#44669701) Homepage Journal
      Can SecretAgent detect tracking through ETags? Or will it disable ETags across all sites and thus slow down browsing by effectively turning caching off?

      The way I'd detect it would be with some extra background probes after a page has been loaded. The background probes start once the browser has finished loading and has become idle. Then the browser could open another connection and request the same resources again without sending any information that could be tracked. If it receives a different ETag or different content this time around, it empties the cache for that domain and disables caching for that domain for a few hours.
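A rough sketch of the probe idea described above (the fetch callables here are toy stand-ins for real unconditional HTTP requests; names are illustrative):

```python
import itertools

def probe_for_etag_tracking(fetch, url):
    """fetch(url) performs an unconditional GET (no If-None-Match, cold
    cache) and returns the ETag header. Two clean fetches of an unchanged
    resource should match if the ETag is honest content versioning; a
    per-visitor ETag differs, flagging the domain for cache-clearing."""
    return fetch(url) != fetch(url)

# Toy fetchers standing in for real HTTP requests:
def honest_fetch(url):
    return '"v1.2.3"'               # content-derived, same for everyone

_counter = itertools.count()
def tracking_fetch(url):
    return '"%d"' % next(_counter)  # unique value minted per request

caching_ok = probe_for_etag_tracking(honest_fetch, "/logo.png")
is_tracking = probe_for_etag_tracking(tracking_fetch, "/pixel.gif")
```

As the comment notes, a resource that legitimately changed between fetches would also trip this check, which is why it only disables caching for a few hours rather than permanently.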
      • Re:Secret Agent (Score:5, Informative)

        by KiloByte (825081) on Sunday August 25, 2013 @12:37PM (#44669915)

        Or will it disable ETags across all sites and thus slow down browsing by effectively turning caching off?

        ETags are only one of many methods to achieve caching. Getting rid of them shouldn't have a big effect on caching.

        Other methods typically have privacy holes as well, but it's easier to deal with them, for example by rounding timestamps down to the last midnight. ETags on the other hand store an arbitrary attacker-provided string, which is an outright security vulnerability.
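The timestamp-rounding mitigation mentioned above might look like this on the client side (function name and values are illustrative only):

```python
from datetime import datetime

def rounded_if_modified_since(ts):
    # Drop everything below the date, so a server cannot hide a per-visitor
    # identifier in the seconds/minutes of the timestamp the client echoes
    # back in If-Modified-Since.
    return ts.replace(hour=0, minute=0, second=0, microsecond=0)

served = datetime(2013, 8, 25, 12, 37, 41)   # Last-Modified as received
echoed = rounded_if_modified_since(served)   # what the client would send back
```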

        • by Wrath0fb0b (302444) on Sunday August 25, 2013 @09:04PM (#44672933)

          ETags on the other hand store an arbitrary attacker-provided string, which is an outright security vulnerability.

          I hate to break it to you, but the entire browser is nothing but a device for storing (and then parsing!) arbitrary attacker-provided strings. It's even got a perverse sort of link-chaining mechanism where, after receiving one such string, it will go out and fetch (and parse!) another one at the attacker's choice of address.

          This is not a security vulnerability, it's the design of the system in which there was never a requirement to ensure that a client could visit a server multiple times without the server knowing (or inferring) that it was the same client. It's meaningless to say that a protocol is vulnerable because it doesn't meet a property that it was never designed for (any more than RSA is broken because it doesn't offer repudiation).

          Now a client can always elect to send random e-tags, slowing himself down (most dynamic content is not time-cached) and adding to the bandwidth load on the server. I'm sure someone will cook up an extension that does this, and we'll be back to where we were before this non-story.
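The random-ETag countermeasure suggested here could be as small as this (a hypothetical helper, not any existing extension's code):

```python
import secrets

def chaff_headers():
    # Send a throwaway random validator instead of the cached one, so the
    # tracker's database lookup never matches any previous visit. The cost,
    # as noted above, is a full re-download every time.
    return {"If-None-Match": '"%s"' % secrets.token_hex(8)}

a = chaff_headers()
b = chaff_headers()  # a fresh "identity" on every request
```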

          • by psydeshow (154300) on Monday August 26, 2013 @01:57PM (#44678427) Homepage

            ETags on the other hand store an arbitrary attacker-provided string, which is an outright security vulnerability.

            I hate to break it to you, but the entire browser is nothing but a device for storing (and then parsing!) arbitrary attacker-provided strings....

            This is not a security vulnerability, it's the design of the system in which there was never a requirement to ensure that a client could visit a server multiple times without the server knowing (or inferring) that it was the same client.

            Yep. Bingo.

            Safest solution is to write your own "browser" in PHP or something and keep the request headers limited to just GET and Host:, and don't download any linked stylesheets, scripts, images, favicons, objects, or embeds. Have fun with that!

            It *would* be nice if there was a paranoid mode in Firefox or Chrome that prevented cross-domain resources from being loaded. But that would break a bunch of sites, too, where some yokels bought the argument that speed is everything and spread their frontends over a bunch of different subdomains and third-party CDNs.

        • by allo (1728082) on Saturday August 31, 2013 @11:49AM (#44724401)

          Of course, you could use cache times and check which images were loaded to store information as well. But one image holds 1 bit of information, while a single ETag (i.e., on one image) holds many bits.
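A quick back-of-the-envelope comparison of the two channels (the ETag value is made up for illustration):

```python
import math

# One probeable image leaks a single bit (cached or not cached), so spelling
# out one attacker-chosen ETag's payload via image caching would need a
# separate probe image per bit.
etag = "686897696a7c876b7e"      # hypothetical 18-hex-char identifier
etag_bits = len(etag) * 4        # 4 bits per hex digit -> 72 bits
bits_per_image = 1
images_needed = math.ceil(etag_bits / bits_per_image)
```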

  • by Anonymous Coward on Sunday August 25, 2013 @11:35AM (#44669489)

    Changes were made in the past few years to make it much more difficult to clear the cache frequently and easily.

    You must jump through various menus and dropdowns. The team argued that this was progress, and that it helped prevent inadvertent cache clearing. Their argument was very weak.

    It forces me to hassle with yet another plugin to make my very frequent cache clearing quicker. But at least it is now an icon on the toolbar, with no prompting.

  • Nothing new (Score:5, Informative)

    by deanrock0 (2977511) on Sunday August 25, 2013 @11:36AM (#44669503)
    Did they just invent ETags or what? This "feature" has been known for years and there are existing implementations, including this one: http://samy.pl/evercookie/ [samy.pl] from 2010.
  • Tracking $$$$ (Score:3, Informative)

    by Ed The Meek (3026569) on Sunday August 25, 2013 @11:36AM (#44669505)
    Tracking information is worth billions of dollars. With that much money on the line - we'll be tracked like escaped inmates - one way or another.
    • by dave420 (699308) on Monday August 26, 2013 @10:13AM (#44676305)
      If we remove the ability for those people to track us, doesn't that mean that those Sagans of dollars need to be put back into the industry in order for our current "internet experience" to remain as it is? I'm not saying where we're currently at is great or anything, but if it really is that large an amount of money, we'll notice the difference if it leaves.
  • Secret Agent addon (Score:2, Informative)

    by Anonymous Coward on Sunday August 25, 2013 @11:43AM (#44669543)

    The addon's homepage appears to be this:
    https://www.dephormation.org.uk/?page=81

  • by Jah-Wren Ryel (80510) on Sunday August 25, 2013 @12:17PM (#44669753)

    The RequestPolicy add-on [requestpolicy.com] should handle this too. RequestPolicy blocks cross-site references by default and lets you whitelist individual cases. If you don't even talk to the tracker websites then they can't track you.

    If the main website you access tracks you via etags the risk is limited to tracking your actions on that website which you'd have problems avoiding anyway since they can track you via ip address or if you have an account on that website.

    • by Anonymous Coward on Sunday August 25, 2013 @12:29PM (#44669853)

      I use RequestPolicy, and it definitely isn't for most people. It increases the amount of effort needed to browse the web by a factor of ten.

      Every other site I go to is actually served from about two dozen separate locations. CSS comes from one domain, images come from as many as 6 domains, javascript comes from as many as 3 domains, and it isn't unheard of to see twenty different sets of trackers and widgets getting bolted on, not including the additional baggage that they bring.

      It's fucking ridiculous.

      Oddly enough, sites hosting their own tracking will make RequestPolicy fail miserably, since it only deals with cross site refs. Such sites are the exception, though.

  • by Somebody Is Using My (985418) on Sunday August 25, 2013 @12:17PM (#44669759) Homepage

    I always imagine the webserver as having an internal conversation that goes sort of like this...

    Hey, a new visitor to the website? I wonder who he is?
    Well, I'll just drop a cookie on there to keep track of him... and, hmm, it seems he's blocking cookies.
    Oh well, let me just insert this bit of Javascript; that'll work just as well.
    Dear oh dear, it seems Javascript isn't working.
    No worries, I'll just insert a little 0-byte web-bug graphic and... wait? That's prevented as well?
    Damn it, Flash-cookie! That'll get him! WHAT?!?!? Disabled as well?
    E-Tag! That has to work, right?
    ARGH!!!!!

    Gee... I wonder if he's trying to tell me something like, oh I don't know, "I don't like being tracked".

    Nah, who doesn't like being pushed, filed, stamped, indexed, briefed, debriefed, or numbered? I wonder if there's some other way I can use...

    You might think that at this point companies and advertisers would start getting the message. Instead, they just keep finding more and sleazier ways. All these technologies have valid uses, but they have been so abused by corporations and marketing that people increasingly don't trust them anywhere. It just further antagonizes the very people they are trying to connect with. And then they wonder why they lose the respect and trust of their customers, resulting in an ever-more aggressive relationship between the two.

    Some days I dream about what the Internet might have been like had Canter and Siegel been definitively smacked down back in '94, setting an inviolable precedent that the 'Net was not a platform welcoming /any/ advertising. What repercussions might that have had on the world as a whole?

    • by Anonymous Coward on Sunday August 25, 2013 @12:25PM (#44669823)

      Some days I dream about what the Internet might have been like had Canter and Siegel been definitively smacked down back in '94, setting an inviolable precedent that the 'Net was not a platform welcoming /any/ advertising. What repercussions might that have had on the world as a whole?

      Well the advertising giant Google would cease to exist, for one....

    • by dotancohen (1015143) on Sunday August 25, 2013 @12:28PM (#44669843) Homepage

      You might think at this point that companies and advertisers start getting the message. Instead, they just keep finding more and sleazier ways.

      Which companies? You do realize that this is a browser feature, right? Mozilla et al. introduced this into the browser, not some third party. Go look up the Bugzilla page and commit for this feature for the guy's name and contact info.

      • by BenoitRen (998927) on Sunday August 25, 2013 @01:17PM (#44670187)

        You do realize that this is a browser feature, right?

        *WHOOSH*

    • Some days I dream about what the Internet might have been like had Canter and Siegel been definitively smacked down back in '94, setting an inviolable precedent that the 'Net was not a platform welcoming /any/ advertising.

      The global economy would be smaller and internet access would be less available worldwide. Once the model of advertising-supported services arose, people in the third world could have nice things like e-mail and entertainment in spite of their countries' lack of means or an infrastructure where individuals could pay for whatever they used themselves.

      It's easy for someone in the West to say, "They should just bill you $20 a month for your usage of service x, and get rid of advertising", but try to be considerate of the rest of the world.

      • by Jah-Wren Ryel (80510) on Sunday August 25, 2013 @05:58PM (#44671999)

        Once the model of advertising-supported services arose, people in the third world could have nice things like e-mail and entertainment in spite of their countries' lack of means or an infrastructure where individuals could pay for whatever they used themselves.

        You know that is only true because advertising is imprecise, right? Those people have no money to buy crap anyway so advertising is completely wasted on them. If the advertising people could figure out a way to avoid paying for ads targeting people like that they would and that would quickly mean that the service providers would drop them too. Kind of like how streaming video sites use geo-location to block them now.

        So it isn't really a benefit so much as an accident that may be corrected at any time. We would be better off coming up with another way to get the poor online instead.

    • by Ksevio (865461) on Sunday August 25, 2013 @12:38PM (#44669917) Homepage
      Part of the issue is browsers/programs that portray cookies in a bad light. When malware scanners flag cookies as "harmful to your computer", people get nervous about all cookies and want them gone (then wonder why they have to keep logging in).
    • by bbn (172659) <baldur.norddahl@gmail.com> on Sunday August 25, 2013 @01:00PM (#44670075)

      E-Tag! That has to work, right?
      ARGH!!!!!

      Gee... I wonder if he's trying to tell me something like, oh I don't know, "I don't like being tracked".

      By this point you are being tracked as the guy that blocked everything else. There is only going to be one of you.

    • by mopower70 (250015) on Sunday August 25, 2013 @02:09PM (#44670565) Homepage

      You might think at this point that companies and advertisers start getting the message. Instead, they just keep finding more and sleazier ways. All these technologies have valid uses but have been so abused by corporations and marketing that people increasingly don't trust it anywhere.

      I'm honestly curious here. Advertising isn't going away. It's what keeps the Internet "free". So you're saying you'd rather have completely irrelevant advertising than stuff you may actually be interested in? When I'm in the market for any kind of product, I actively seek out sources of advertising to survey what's available. Being flooded with irrelevant information and advertisements (like happens on the radio and television) is personally unnecessary but financially necessary noise to provide the content I want. I'll take trackers any day over having to pay for every single site I visit.

      • by Anonymous Coward on Sunday August 25, 2013 @03:03PM (#44671011)

        Yes, that's what I'm saying. I don't want these people to know what I want. They have proven that they will take advantage of that, and try to make me impulse-buy things when I'm at my most vulnerable to targeted ads.

        Ads are not a good way to form a worthwhile opinion on what product is the best for you, the consumer. They're designed to drown out the competition and are practically worthless for making a judgement call, unless you happen to notice it's something you already wanted that has a special-offer.

        Without competition, ads are no more relevant than they were without being targeted. I've seen no evidence of more competition with more targeted ads. I just get the same products over and over, from whomever can afford the most ads. It's not a healthy situation for commerce.

        Plus, if it's something I'm going to buy, I'll buy it. I don't need an ad asking me to click on it while I'm doing other things. I'd rather the business model was revamped than the customer's privacy model.

        If ads can sustain the web as-is, then I don't see a need to "upgrade" them. And if they can't, then it's just another reason to revamp the business model instead of desperately clinging to it.

      • by Arrogant-Bastard (141720) on Sunday August 25, 2013 @09:51PM (#44673151)
        You know, we had a "free" Internet long before the advertising filth showed up and began polluting it. They are expendable, although they would certainly like you to believe that they're not. "Oh noooes the free sites could go away with advertising!!"

        Yes, they could. So what?

        Newcomers (anyone who didn't have an address ending in .ARPA is new) are directed to study the history of the 'net. Those of adequate perception will quickly realize that it was flourishing WITHOUT the hordes of imbeciles, WITHOUT the masses of illiterates, WITHOUT the tracking and ads and spam. Our mistake was not crushing these out of existence with ruthless ferocity as soon as they appeared: every ignorant newbie should have been flamed to oblivion, every spammer's business destroyed. We were far too nice and far too tolerant, and thus...look at what we have now. But it didn't have to turn out that way. And it still doesn't. The situation is fixable.

        Ads don't reach me because I have those sites firewalled or null-routed. I don't care to look at them or have my extremely valuable time and resources wasted by them. Nor do I wish to be exposed to the malware and other attacks carried by an increasing number of them. I recommend this same approach to others: block them at your network perimeter, ALL of them. Yes, this will have consequences -- good consequences. It passes the "what if everyone did it?" test, because if that happened it would starve the ad networks of revenue and deprive them of the resources they require to engage in ever-more-intrusive tracking and data collection on Internet users. Everyone won't do that, of course: but those who do will reap at least some of the benefits. Perhaps that will be enough. I certainly hope so.
  • by danceswithtrees (968154) on Sunday August 25, 2013 @12:24PM (#44669811)

    The ETag method is a clever solution for cookieless tracking. I find this method, which I stumbled upon a couple of weeks ago, a bit startling. I had no idea how much information is routinely sent from my browser/computer to web servers -- information about plug-ins, time zone, screen resolution, accepted headers, etc. -- WITHOUT letting me know. It is enough to give more than 21 bits of identifying information and uniquely identifies me among the 3M visitors.

    https://panopticlick.eff.org/ [eff.org]

    • by Anonymous Coward on Sunday August 25, 2013 @01:24PM (#44670241)

      The ETag method is a clever solution for keeping re-retrievals of unchanged data to a minimum.
      It gets abused by certain people for cookieless tracking.

      There, fixed it for you. :-)

      I've been wondering for quite some time (IIRC years, during which I've simply blocked all of them) why ETags have not been getting more publicity as another "store a unique ID on the user's machine" method.

      It looks like both the security-hole researchers and the browser-builders are not as clever as they want us to believe (or maybe something bad is going on ...)

    • And with PRISM's powers combined, welcome to the Panopticon. [wikipedia.org] -- Panopticlick's namesake. TADA: The world is now a giant prison.

      Bentham himself described the Panopticon as "a new mode of obtaining power of mind over mind, in a quantity hitherto without example."

      Indeed.

    • The ETag method is a clever solution for cookieless tracking. I find this method, which I stumbled upon a couple of weeks ago, a bit startling. I had no idea how much information is routinely sent from my browser/computer to web servers -- information about plug-ins, time zone, screen resolution, accepted headers, etc. -- WITHOUT letting me know. It is enough to give more than 21 bits of identifying information and uniquely identifies me among the 3M visitors.

      https://panopticlick.eff.org/ [eff.org]

      Yep. It's absurd, and unfortunately many "privacy-enhancing" tools (for example, anything that alters the user agent) can actually make a browser more unique rather than less-so.

      NoScript is an exception, and one that works very well. I know it's parroted on Slashdot a lot, but if you care about privacy and security on the web there isn't a single better option. Using Panopticlick on my browser as an example:

      Without NoScript: Your browser fingerprint appears to be unique among the 3,316,576 tested so far. Currently, we estimate that your browser has a fingerprint that conveys at least 21.66 bits of identifying information.

      With NoScript: Within our dataset of several million visitors, only one in 2,433 browsers have the same fingerprint as yours. Currently, we estimate that your browser has a fingerprint that conveys 11.25 bits of identifying information.

      Still not great, but a lot better than unique. It's quite unfortunate that the web evolved with the assumption that arbitrary code may be executed in the browser. If we had started out instead with an opt-in approach to Javascript, I think things would be quite a bit better now in terms of privacy and security than they currently are.
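For the curious, the "bits of identifying information" Panopticlick reports is just the surprisal of a fingerprint's frequency, which is easy to check against the numbers quoted above:

```python
import math

def identifying_bits(one_in_n):
    # A fingerprint shared by 1 in N browsers carries log2(N) bits.
    return math.log2(one_in_n)

with_noscript = round(identifying_bits(2433), 2)       # 1-in-2,433 match
unique_among = round(identifying_bits(3_316_576), 2)   # unique in the dataset
```

Both values land on the figures quoted in the comment: about 11.25 bits and about 21.66 bits.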

    • by kermidge (2221646) on Sunday August 25, 2013 @10:39PM (#44673323) Journal

      Well, I'm screwed. From Panopticlick: "Your browser fingerprint appears to be unique among the 3,319,469 tested so far."
      And later, "21.66 bits of identifying information". Turns out much of it comes from the fonts, at 18.69 bits, but the plugins kill it, with the full 21.66.

      I like being an individual when I'm at home, so to speak, but when out and about on the 'Net I'd prefer to be an anonymous face in the crowd. I have Ghostery to prevent tracking via cookies and web bugs, but I had no idea about these ETag things; I'd never heard of them before.

      Ah, crap. It's getting to the point that David Brin's idea of "The Transparent Society" is looking better all the time. Even so, one wants to say, "You first."

    • by allo (1728082) on Saturday August 31, 2013 @11:55AM (#44724439)

      Yes and no.

      They are proud that they can extract that much information. Yes, this is bad. BUT, if you consider that much of this information is unstable (think of installing a new font), the fingerprinting gets more complicated: you need some machine learning to extract the relevant data and/or match changed fingerprints in order to correlate them to the same user. This is possible, but then a fingerprint is not a 100% unique thing, even when Panopticlick believes it is, because matching two slightly changed fingerprints lowers the chance of having the same user from 100% to 99% or so. And the more the fingerprint changes, the worse the result will be.

      So maybe the best option is to randomize many of these values on browser start.

  • by Anonymous Coward on Sunday August 25, 2013 @01:25PM (#44670249)

    Want to get back at the folks doing the tracking? Blocking or changing the communications with things like Ghostery or SecretAgent is great. However, if there were software that connected to the tracking servers but never completed the TCP connection, thus leaving the tracker with a bunch of half-open TCP connections, then one could effectively DDoS the trackers. There are several other techniques along these lines that can be employed. What good is a tracking system that is clogged up with connections that never complete or fail in various unfriendly ways?

    Captcha: capacity

  • by ThatsMyNick (2004126) on Sunday August 25, 2013 @02:34PM (#44670775)

    It also seems to leak info between regular windows and incognito mode in chromium. I assume the cache is shared between the modes, and they need separate caches.

  • by Skapare (16644) on Sunday August 25, 2013 @02:40PM (#44670817) Homepage

    My browser passed because of the way I start it. A whole new user/home environment is dynamically created every time I start a browser. I originally did this so that as I browse hundreds of sites, I don't end up with extreme memory waste. This was done back in an older version that was quite memory-leaky. It would build up too much in-process memory as I visited sites, and eventually crash. So I ended up with multiple browsers running (separate processes). At first that might seem to use even more memory, but that was at the OS level, where I did have more, including swap space. And it was at least finite, since when I left some website, its browser actually exited rather than just unlinking fragmented virtual pages.

    Today I keep the setup because of the tracking breakage it creates. I can still be tracked within a site like Slashdot. Slashdot knows what articles I read and what articles I ignore. Slashdot knows what I post. But I am logged in, so "duh". No, it's not perfect at all, as the Slashdot advertisers can see my repeat appearances, too. But at least they can't so easily figure out what other sites I visit, apart from via the IP address (which I plan to work on some day).

  • by viperidaenz (2515578) on Sunday August 25, 2013 @03:23PM (#44671149)

    You can't correlate access across multiple URLs, since every URL has a different ETag.

    • by wonkey_monkey (2592601) on Sunday August 25, 2013 @03:57PM (#44671403) Homepage
      So you just put the same image on each page.
      • by viperidaenz (2515578) on Sunday August 25, 2013 @05:34PM (#44671883)

        But then you need to rely on the Referer header to find out where the user came from. You still have no idea who they are unless the HTML URL gives away the identity of the user. All you can tell is that anonymous user A went to pages X, Y, and Z.

        Common "internet security" software packages blank out Referer headers, and browsers won't send the Referer header if the HTML page came from an HTTPS location and the other resource is not HTTPS.

        • by null etc. (524767) on Sunday August 25, 2013 @09:25PM (#44673019)

          But then you need to rely on the Referer header to find out where the user came from

          There's this web technology called "query string parameters" that can be appended to any request for a resource on the web. A query string parameter containing a site identifier is more than enough to correlate with an etag identifier.

          • by viperidaenz (2515578) on Monday August 26, 2013 @12:28AM (#44674033)

            But you need to deliver the query string in the HTML page on a different URL, and any change you make to the query string results in the browser treating it as a different resource, so it will not send the ETag value. When you append the 'site identifier', the browser treats it as a different URL. The same base URL with a different query string will not resend the same ETag.

            Let me spell it out for you again: A URL includes the query string. If you change the query string, you've changed the URL.
            The browser stores the ETag against the local cache entry, which is associated with the URL - including the query string.

            All you will be able to tell is "Anonymous User A visited site X at time Y." You can't track them across multiple URLs using solely the ETag.

            The best that can be done is that a website can associate a browser with a user's session after they have logged out and cleared their cookies (unless they clear their file cache too, or the files in their cache have been purged).
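A toy model of the cache-key behavior being argued here (URLs and values are hypothetical): the browser only echoes an ETag when the full URL, query string included, matches a cache entry.

```python
cache = {}  # full URL (query string included) -> stored ETag

def conditional_headers(url):
    # The browser adds If-None-Match only on an exact cache-key match;
    # any query-string change is a different key, hence a cache miss.
    if url in cache:
        return {"If-None-Match": cache[url]}
    return {}

cache["http://tracker.example/pixel.gif"] = '"user-1234"'

plain = conditional_headers("http://tracker.example/pixel.gif")
tagged = conditional_headers("http://tracker.example/pixel.gif?site=X")
```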

            • by wonkey_monkey (2592601) on Monday August 26, 2013 @05:41AM (#44675067) Homepage

              The best that can be done is that a website can associate a browser with a user's session after they have logged out and cleared their cookies

              So, kind of (exactly) like a cookie, then? What am I missing?

              Okay, the server can't immediately identify you if all you do is fetch index.html, for example*, but it's pretty trivial for a server to correlate the user who fetches index.html with the user who fetches an image in index.html milliseconds later. HTTP requests don't exist in total isolation by any means.

              *assuming it's not tagged, itself

  • by wonkey_monkey (2592601) on Sunday August 25, 2013 @03:58PM (#44671409) Homepage
    http://en.wikipedia.org/wiki/HTTP_ETag [wikipedia.org]

    ETags can be used to track unique users,[2] as HTTP cookies are increasingly deleted by privacy-aware users. In July 2011, Ashkan Soltani and a team of researchers at UC Berkeley reported that a number of websites, including Hulu.com, were using ETags for tracking purposes.[3] Hulu and KISSmetrics have both ceased "respawning" as of 29 July 2011,[4] as KISSmetrics and over 20 of its clients are facing a class-action lawsuit over the use of "undeletable" tracking cookies partially involving the use of ETags.

  • by guruevi (827432) <(eb.ebucgnikoms) (ta) (ive)> on Sunday August 25, 2013 @05:13PM (#44671777) Homepage

    Simply don't allow third-party URLs on any website. Sure, it might break some ancient things, but you shouldn't really be including iframes, cookies, JavaScript, or anything else from a third-party domain anyway.

  • by careysb (566113) on Sunday August 25, 2013 @06:48PM (#44672231)
    OK, so how does the NY Times track me? I'm running Firefox on Win 7; I've cleared my cache, my cookies, and the Flash cookies. No luck.
  • by DERoss (1919496) on Monday August 26, 2013 @07:23PM (#44681503)

    For Mozilla-based browsers such as Firefox and SeaMonkey, the SecretAgent extension conflicts with the PrefBar User Agent menulist.

    Because some Web sites I visit are sensitive to what user agent they see, I unchecked (disabled) the "Rotate User Agent" checkbox in SecretAgent. Then, if I used the PrefBar User Agent menulist to spoof some other browser, it kept resetting to my actual user agent. Since I consider the PrefBar capability to be very important, I removed SecretAgent. The PrefBar capability was then restored.

  • by thoughtlover (83833) on Monday August 26, 2013 @08:51PM (#44681919)
    Evil. Seriously, this shit is getting messed up.
  • by allo (1728082) on Saturday August 31, 2013 @11:48AM (#44724391)

    My blog post from 2007 (sorry, it's in German):
    http://blog.laxu.de/2007/09/23/browser-raten-und-e-tag-cookies/ [blog.laxu.de]
