


A New Form of Online Tracking: Canvas Fingerprinting

New submitter bnortman (922608) was the first to write in with word of "a new research paper discussing a new form of user fingerprinting and tracking for the web using the HTML 5 <canvas>." globaljustin adds more from an article at ProPublica: Canvas fingerprinting works by instructing the visitor's Web browser to draw a hidden image. Because each computer draws the image slightly differently, the images can be used to assign each user's device a number that uniquely identifies it. ... The researchers found canvas fingerprinting computer code ... on 5 percent of the top 100,000 websites. Most of the code was on websites that use the AddThis social media sharing tools. Other fingerprinters include the German digital marketer Ligatus and the Canadian dating site Plentyoffish. ... Rich Harris, chief executive of AddThis, said that the company began testing canvas fingerprinting earlier this year as a possible way to replace cookies ...
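The summary's description can be made concrete with a short sketch. This is illustrative only: in a real browser the bytes would come from a hidden canvas via getImageData(), which is stubbed out here with fixed buffers, and the FNV-1a hash is my own illustrative choice, not what AddThis or any real tracker necessarily uses.

```typescript
// Sketch of the fingerprinting step described above. In a browser the
// bytes would come from a hidden <canvas>, roughly:
//   const ctx = canvas.getContext("2d")!;
//   ctx.font = "14px Arial";
//   ctx.fillText("fingerprint me", 2, 16);
//   const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
// Subtle per-device differences in font rendering and anti-aliasing
// change a few bytes, so hashing them yields a quasi-unique ID.

// FNV-1a hash over the pixel bytes (illustrative hash choice).
function fingerprint(pixels: Uint8Array): string {
  let hash = 0x811c9dc5;
  for (const byte of pixels) {
    hash = (hash ^ byte) >>> 0;
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash.toString(16).padStart(8, "0");
}

// Stand-in pixel buffers: two "devices" whose renderers differ by a
// single anti-aliased byte get completely different fingerprints.
const deviceA = new Uint8Array([255, 255, 255, 255, 10, 20, 30, 255]);
const deviceB = new Uint8Array([255, 255, 255, 255, 10, 20, 31, 255]);
```

The hash is deterministic per device, so the same machine reports the same ID on every visit, which is exactly what makes it a cookie substitute.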
  • by ArcadeMan ( 2766669 ) on Tuesday July 22, 2014 @08:54AM (#47506945)

    Yeah, but the Amish also don't receive telemarketing calls or email spam.

  • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday July 22, 2014 @09:13AM (#47507079) Journal
    Depending on what you mean by 'block', there may or may not be a properly satisfactory answer:

    'Block' as in 'make this specific mechanism fail' is the relatively easy question. If the attacker can't manipulate a canvas element and read the result back, the technique won't work; so the usual JavaScript blockers, or more selective breaking of some or all of the canvas element, will do the job. (The Tor Browser apparently already does this for the methods that can read back the contents of a canvas element, so you can still draw on one but not observe your handiwork.)
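    The readback-blocking behavior this comment attributes to the Tor Browser can be modeled in a few lines. This is a sketch with a made-up StubContext standing in for the real CanvasRenderingContext2D (there is no DOM here), not Tor's actual implementation:

```typescript
// Model of the readback-blocking defense: drawing still works, but
// reading pixels back returns only opaque white, so a fingerprinter
// learns nothing device-specific. StubContext is a hypothetical
// stand-in for a real 2D canvas context.
class StubContext {
  pixels: Uint8Array;
  constructor(readonly width: number, readonly height: number) {
    this.pixels = new Uint8Array(width * height * 4);
  }
  // "Drawing" mutates the backing buffer normally.
  setPixel(x: number, y: number, rgba: [number, number, number, number]): void {
    this.pixels.set(rgba, (y * this.width + x) * 4);
  }
  // Readback, analogous to getImageData().data.
  getImageData(): Uint8Array {
    return this.pixels.slice();
  }
}

// Install the defense: every readback yields opaque white
// (255, 255, 255, 255) regardless of what was drawn.
function blockReadback(ctx: StubContext): void {
  ctx.getImageData = () =>
    new Uint8Array(ctx.width * ctx.height * 4).fill(255);
}
```

    Note that drawing is left untouched; only observation is neutered, which is why pages that legitimately draw on a canvas keep working.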

    Unfortunately the attacker doesn't actually care about making your browser draw a picture; they care about deriving as accurate a UID as they can. Given that, you might actually make yourself more distinctive if your attempt to break a given fingerprinting mechanism succeeds. In the case of the Tor Browser, for instance, attempts to read a canvas are always handled as though the canvas is all opaque white. This does prevent the attacker from learning anything useful about font rendering peculiarities or other quirks of your environment's canvas implementation; but it's also a behavior that, for the moment at least, only the Tor Browser exhibits: relatively uncommon, and possibly less common than the result you'd receive from an unmodified browser.

    That's the nasty thing about fingerprinting attacks. Fabricating or refusing to return many types of identifying information is relatively easy (at least once you know that attackers are looking for them); but unless you lie carefully, your fake data may actually be less common (and thus more trackable) than your real data.
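    The trade-off described above can be made concrete with a toy anonymity-set count. The fingerprint strings and their frequencies below are invented purely for illustration:

```typescript
// Toy illustration of the point above: a defense that produces a rare
// value can make you *more* trackable than a common real value would.
// How many visitors share a given fingerprint (your "anonymity set"):
function anonymitySetSize(fp: string, population: string[]): number {
  return population.filter(x => x === fp).length;
}

// Hypothetical fingerprints a tracker observes: "a1" is a common
// stock-hardware fingerprint; "blank" is the all-white readback that
// only one unusual browser configuration produces.
const observed = ["a1", "a1", "a1", "a1", "b2", "b2", "blank", "c3"];
```

    Here the user returning "blank" is alone in an anonymity set of one, while the untouched "a1" users hide among four, which is exactly the "lie carefully" problem.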
  • by Anonymous Coward on Tuesday July 22, 2014 @03:53PM (#47510041)

    Well, the other real issue here is that such fingerprinting is in place specifically to work around the "limitations" of cookies.

    And what are those "limitations"? That users can delete them. Honestly, most of the people I've dealt with who ask for "better" fingerprinting cite that very reason. Not that cookies are per-browser rather than per-user (which is what they want to track, and which would at least be understandable). Not that cookies don't work with embedded devices. Not any of those real limitations, but the fact that users can opt to delete them.

    So, really, they're working against users directly, explicitly and consciously.
