Help EFF Test a New Tool To Stop Creepy Online Tracking 219

An anonymous reader writes "EFF is launching a new extension for Firefox and Chrome called Privacy Badger. Privacy Badger automatically detects and blocks spying ads around the Web, and the invisible trackers that feed information to them. You can try it out today."
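For context, Privacy Badger's approach is heuristic rather than list-based: a third-party domain that appears to track you across several distinct first-party sites gets blocked. A minimal sketch of that idea in Python (the class name, threshold, and domains below are illustrative assumptions, not EFF's actual code):

```python
from collections import defaultdict

# Illustrative sketch of a cross-site tracking heuristic: a third-party
# domain observed setting tracking state on several distinct first-party
# sites gets blocked. Threshold and names are assumptions for this sketch.
TRACKING_THRESHOLD = 3  # distinct first-party sites before blocking

class TrackerHeuristic:
    def __init__(self, threshold=TRACKING_THRESHOLD):
        self.threshold = threshold
        # third-party domain -> set of first-party sites it was seen on
        self.seen_on = defaultdict(set)

    def observe(self, first_party, third_party):
        """Record that third_party loaded tracking state while first_party was open."""
        if first_party != third_party:
            self.seen_on[third_party].add(first_party)

    def should_block(self, third_party):
        return len(self.seen_on[third_party]) >= self.threshold

h = TrackerHeuristic()
for site in ("news.example", "shop.example", "blog.example"):
    h.observe(site, "ads.tracker.example")
print(h.should_block("ads.tracker.example"))  # True
print(h.should_block("cdn.example"))          # False
```

The appeal of this design over a blocklist is that it needs no curated list of trackers: anything that behaves like a tracker across enough sites gets caught.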

Comments Filter:
  • Ghostery (Score:4, Insightful)

    by Anonymous Coward on Sunday May 04, 2014 @07:33PM (#46915637)

    Ghostery does a great job of this already... However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

    7 caught on Slashdot right now.

  • by NapalmV ( 1934294 ) on Sunday May 04, 2014 @07:33PM (#46915639)
    How is this different from or better than adblock / ghostery / flashblock / noscript / not accepting third-party cookies?
  • by Mashiki ( 184564 ) <mashiki@NOsPam.gmail.com> on Sunday May 04, 2014 @07:46PM (#46915703) Homepage

    Also, NoScript specifically breaks 3 out of 4 websites until you figure out which half-a-dozen domains must execute JavaScript for each damn website.

    I think you mean that website developers are so reliant on JS these days that they can't write a site without using it so heavily that sneezing at it breaks the site.

  • Re:Ghostery (Score:4, Insightful)

    by mysidia ( 191772 ) on Sunday May 04, 2014 @08:20PM (#46915813)

    However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

    I imagine if any plugin gets /really/ popular, the tracking bugs will get modified so they work again, or publishers/advertisers may start modifying their content to test the health of the tracking bug before allowing the visitor to view content.

    Maybe you just get half a sheet of text, or the first 1.3 windowfuls; then the site picks up on the tracking bug being broken and stops rendering content, displaying an error about needing to disable such-and-such plugin to use the site, or waiting until its "countermeasure against tracker-bug blocking" succeeds.
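The gating imagined here could be sketched like this (purely hypothetical; the function, flag, and teaser length below are made up for illustration, not any site's actual anti-blocking code):

```python
# Hypothetical sketch of the "countermeasure" described above: the page
# serves a teaser, then only releases the rest if its tracking beacon
# reported back. All names and the teaser size are illustrative.
def render(content, beacon_loaded, teaser_chars=200):
    if beacon_loaded:
        return content
    # Tracker blocked: show a fraction, then an error instead of the rest.
    return content[:teaser_chars] + "\n[Please disable your blocking plugin to continue.]"

article = "word " * 100
blocked_view = render(article, beacon_loaded=False)
print(len(blocked_view) < len(article))  # True: most content withheld
```

The arms-race point stands either way: any client-side check like this can itself be spoofed by a sufficiently popular blocker, which is exactly why it only pays off against niche tools.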

  • Useless colors (Score:1, Insightful)

    by Anonymous Coward on Sunday May 04, 2014 @08:24PM (#46915833)

    Considering the very high rates of red/green colour blindness, the visual warning is fucking useless. I do not understand how, in a predominantly male industry, programmers don't realise the uselessness of red/green indicators. Is it so incredibly hard to use blue, red, black? Or some combo that reads even for greyscale colour blindness?

  • by Nemyst ( 1383049 ) on Sunday May 04, 2014 @09:09PM (#46915997) Homepage
    I disagree. If websites relied purely on a single javascript block, then filtering out the tracking, advertising and other bullshitting scripts would be a lot harder than it is now, where most of the external scripts are stuff you want to block. You can generally allow the site itself (and if it's big enough, perhaps a CDN and another domain or two) and you'll get the site without the shit.

    Also, hosting relatively large scripts like jQuery on their own, static path helps a lot for caching. You have one copy of the script for dozens of sites, instead of dozens of copies of the same file.
  • Re:Useless colors (Score:2, Insightful)

    by Anonymous Coward on Sunday May 04, 2014 @09:28PM (#46916043)

    It isn't that it's hard, people just don't know that some colours might be inconvenient. If you want to solve the problem, create an extension to remap the colors, either only on the webpage or the whole screen, into something visible. Not trivial, but certainly much easier than convincing web-designers that their colour-scheme isn't perfect.
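The remapping extension suggested above could start as a simple colour lookup (a minimal sketch; the substitute colours here are illustrative assumptions, not a vetted colourblind-safe palette):

```python
# Sketch of the suggested remap: shift a red/green status palette into
# substitutes that survive red/green colour blindness. The replacement
# colours below are illustrative, not a tested accessible palette.
SAFE_PALETTE = {
    (255, 0, 0): (230, 97, 1),   # pure red   -> orange
    (0, 255, 0): (94, 60, 153),  # pure green -> purple
}

def remap(rgb):
    """Return a colourblind-safer substitute, or the colour unchanged."""
    return SAFE_PALETTE.get(rgb, rgb)

print(remap((0, 255, 0)))  # (94, 60, 153)
print(remap((0, 0, 0)))    # unchanged: (0, 0, 0)
```

A real extension would also have to catch near-matches and colours set via CSS classes, which is where the "not trivial" part comes in.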

  • by PopeRatzo ( 965947 ) on Sunday May 04, 2014 @10:27PM (#46916191) Journal

    I'm guessing that most web sites are made by young women who fancy themselves to be graphic designers

    Get the fuck out with your stupid techie misogyny.

    If your "guessing" involves generalization to the point of an ugly absurdity, you should check yourself. You make it sound like you have a particular beef, maybe with a particular woman (or women) and now you believe that all bad web code is caused by women. It's a bad place to be.

    If you want to say, "I have encountered some young women who fancy themselves graphic designers..." you would at least be on more reasonable ground, but then you need to ask yourself, "Does the fact that this group of people were women really have any impact on my statement?"

    Now knock it off. People get skeeved out by misogyny and it's pretty easy to pick up on, so the next time you're looking for a job you might just walk away wondering, "That didn't seem to go well, it's probably because of that woman who interviewed me. They're all whores you know".

  • by Anonymous Coward on Monday May 05, 2014 @06:08AM (#46917531)

    You guys are aware that scraping your web server logs also gives you some overview of how the site is used? Is a reverse DNS lookup also considered tracking?

    My point: monitoring your own site to make it better is fair use; giving this data to other entities is not.
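First-party log monitoring of the sort described is straightforward; here is a sketch assuming the common Apache/nginx "combined" log format (the sample log lines are made up for illustration):

```python
import re
import socket

# Sketch of first-party monitoring: parse your own access log and count
# hits per client IP. Assumes the common Apache/nginx "combined" format;
# the sample lines below are fabricated for illustration.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "([A-Z]+) (\S+)')

def hits_per_ip(lines):
    counts = {}
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            ip = m.group(1)
            counts[ip] = counts.get(ip, 0) + 1
    return counts

sample = [
    '192.0.2.1 - - [05/May/2014:06:08:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '192.0.2.1 - - [05/May/2014:06:08:02 +0000] "GET /style.css HTTP/1.1" 200 80',
]
print(hits_per_ip(sample))  # {'192.0.2.1': 2}

# The reverse DNS lookup the parent mentions is one call (network-dependent,
# so it may raise offline):
# socket.gethostbyaddr("192.0.2.1")
```

The distinction the parent draws maps cleanly onto this: everything above stays on your own server, whereas a third-party tracking bug ships the same data to someone else.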

  • by PopeRatzo ( 965947 ) on Monday May 05, 2014 @08:24AM (#46918017) Journal

    I'm interested to know what theories other people have about the poor use of Javascript.

    You mean other than, "Bitches, man, they just don't know how to code, you know? *fistbump*"

"I will make no bargains with terrorist hardware." -- Peter da Silva

Working...