Privacy | Electronic Frontier Foundation | Security | Software

Help EFF Test a New Tool To Stop Creepy Online Tracking

An anonymous reader writes "EFF is launching a new extension for Firefox and Chrome called Privacy Badger. Privacy Badger automatically detects and blocks spying ads around the Web, and the invisible trackers that feed information to them. You can try it out today."
  • Ghostery (Score:4, Insightful)

    by Anonymous Coward on Sunday May 04, 2014 @06:33PM (#46915637)

    Ghostery does a great job of this already... However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

    7 caught on Slashdot right now.

    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Sunday May 04, 2014 @06:43PM (#46915691)
      Comment removed based on user account deletion
      • As of very recently though, Ghostery takes a step further by providing surrogate scripts that replace the function needed by these websites, only without the tracking. It's really nice because you very rarely need to pause it or add exceptions now.

      • Re:Ghostery (Score:5, Informative)

        by Sir_Eptishous ( 873977 ) on Monday May 05, 2014 @09:40AM (#46919019)
        This is why I love slashdot. I had no idea Ghostery had such a conflict of interest. Thanks for that info.
        From WP: [wikipedia.org]
        "Evidon, the company owning Ghostery, plays a dual role in the online advertising industry. Ghostery blocks sites from gathering personal information. But it does have an opt-in feature named GhostRank that can be checked to "support" them. GhostRank takes note of ads encountered and blocked, and sends that information, though anonymously, back to advertisers so they can better formulate their ads to avoid being blocked.[4]"
      • by Burz ( 138833 )

        Use the 'DoNotTrackMe' addon. [mozilla.org] The company that makes it is run by Moxie Marlinspike and has no connection to the ad industry.

    • Re:Ghostery (Score:4, Insightful)

      by mysidia ( 191772 ) on Sunday May 04, 2014 @07:20PM (#46915813)

      However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

      I imagine if any plugin gets /really/ popular, the tracking bugs will get modified so they work again, OR publishers/advertisers may start modifying their content to include tests to ensure the health of the tracking bug, before allowing the visitor to view content.

      Maybe you just get half a sheet of text, or the first 1.3 windowfuls, then the site will pick up on the tracking bug being broken, and stop rendering content -- while displaying an error about the need to disable such and such plugin to use the site, or waiting until "countermeasure against tracker bug blocking" succeeds.
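
      To make that concrete, here is a minimal sketch of such a countermeasure; the tracker URL and the global marker are both made up for illustration:

          // Probe the tracking script; refuse to render if it was blocked.
          // 'tracker.example' and window._trackerLoaded are hypothetical names.
          var probe = document.createElement('script');
          probe.src = 'https://tracker.example/bug.js';
          probe.onload = probe.onerror = function () {
              if (!window._trackerLoaded) {
                  document.getElementById('content').innerHTML =
                      'Please disable your tracker-blocking plugin to view this site.';
              }
          };
          document.head.appendChild(probe);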

      • There are enough sites that we can go elsewhere.

      • Maybe you just get half a sheet of text, or the first 1.3 windowfuls, then the site will pick up on the tracking bug being broken

        If a web server is configured to deliver only the abstract to viewers behind user agents that include tracking countermeasures, then it will deliver only the abstract to search engines. They tend to retrieve pages with no JavaScript, no Referer, and no cookies.

        • by mysidia ( 191772 )

          then it will deliver only the abstract to search engines. They tend to retrieve pages with no JavaScript, no Referer, and no cookies.

          The IP address ranges that search engines crawl from are well known, and they can easily backdoor their countermeasures for search engines alone.

          Also, if I recall correctly, Google actually runs JavaScript.

          I'm sure any countermeasure will be designed so the major search engines can index their content.
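
          For what it's worth, a site doesn't even need hard-coded IP ranges for this: Google documents verifying its crawler by reverse DNS on the visitor's IP, checking the hostname, then doing a forward lookup to confirm it resolves back. A rough Node.js sketch of that check, run server-side before deciding what to serve:

              // Verify a claimed Googlebot: reverse DNS, hostname check,
              // then forward lookup back to the original IP.
              var dns = require('dns');

              function isGooglebot(ip, callback) {
                  dns.reverse(ip, function (err, hostnames) {
                      if (err || !hostnames.length) return callback(false);
                      var host = hostnames[0];
                      if (!/\.googlebot\.com$|\.google\.com$/.test(host)) {
                          return callback(false);
                      }
                      dns.lookup(host, function (err2, address) {
                          callback(!err2 && address === ip);
                      });
                  });
              }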

          • they can easily backdoor their countermeasures for search engines alone.

            That's called cloaking, and search engines severely penalize cloakers as they become aware of them.

            • by mysidia ( 191772 )

              That's called cloaking, and search engines severely penalize cloakers as they become aware of them.

              I see 'cloaking'-like things all the time, where the real page comes up with a paywall if you try to access it, and it is essentially never penalized when done by legitimate websites, so your observation doesn't quite match reality.

              Also, it's technically not cloaking if the page content, when viewed by a user (without alterations by 3rd-party software such as bug blockers or Greasemonkey scripts), matches what the crawler sees.

              • Is there any good way to filter sites that offer teasers and paywall additional pages, so you don't need to wade through them? Because Google top ranks those pages a lot, and it's made it a very inefficient way to find information. It's always high-profile sites that used to be big players in the print domain, and I know they're paying Google for the exposure. I'd switch to a different search engine if they were uncompromising with those types of teaser-paywall websites. They're just noise, as far as I'm concerned.

                • by mysidia ( 191772 )

                  Because Google top ranks those pages a lot, and it's made it a very inefficient way to find information.

                  I agree; personally I think Google should have a database of paywalled domains and hide those pages by default, offering a link to 'show paywalled sites' but otherwise hiding those results (unless they pay to be listed in the little text ads / sponsored search result panel above the normal results).

      • Maybe you just get half a sheet of text, or the first 1.3 windowfuls, then the site will pick up on the tracking bug being broken, and stop rendering content

        That makes it really easy to determine which web sites I should never visit, because the site's purveyors are hostile to my best interests.

        I wish sites that are using creepy tracking bugs would act in a manner that made them unusable. It would make life so much easier.

        • by mwvdlee ( 775178 )

          Slashdot uses creepy tracking bugs.
          How come you're commenting here if you never visit this site?

          • by TheCarp ( 96830 )

            'Tis true: looking at RequestPolicy now, it has not loaded content from:
            "scorecardsearch.com"
            "doubleclick.net"
            "gstatic.com"

            Not visiting sites that associate with creepy sites is indeed too limiting when there are such simple solutions.

          • by Reziac ( 43301 ) *

            Cuz I have javascript off here, and all the creepy tracking sites blocked in HOSTS. I'm not too concerned about their cookies. I am pissed at the way they stall so many sites. So into HOSTS they went.

            Conversely I don't mind having slashdot's login cookie, which makes my life easier at no cost to me.
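
            For anyone who hasn't tried the HOSTS approach: you point known tracker hostnames at a dead address so the requests never leave your machine. A fragment, using a domain from the list upthread as the example:

                # /etc/hosts (or %SystemRoot%\System32\drivers\etc\hosts on Windows)
                0.0.0.0 doubleclick.net
                0.0.0.0 www.doubleclick.net
                # ...and so on down your tracker list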

          • I'm wearing protection.

      • However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

        These sites are not going to like the way that users are going to work around this: by visiting other sites.

        There is so little unique content being generated now that we will just go to other sites with more progressive privacy policies, or grudgingly disable the app if we're feeling lazy or can't find the content elsewhere, and hold it against them for all eternity.

      • A radio station I listen to recently rebranded. Their "improved" web site does not deliver content without the WebTrends tracking code being allowed through NoScript/Ghostery. It seems to do some magic callback foo to achieve this. This behaviour seems to be rapidly expanding on the site; I found a page today that required NetCensus tracking as well. Curiously, I get more content if I block JS altogether (although not fully functional). http://doublej.net.au/ [doublej.net.au]

        • I wonder if it'd be practical to screw the tracking up with false data?

          I'm thinking a browser plugin which has a list of tracking server addresses - and a few times each day will swap a randomly picked subset of the cookies for those sites with those of another randomly picked user of the plugin.
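
          A rough sketch of the local half of that idea as a browser extension background script. The pool service at 'pool.example', the tracker list, and every name here are made up:

              // background.js -- needs the "cookies" permission plus host access.
              var TRACKERS = ['doubleclick.net'];  // hypothetical tracker list

              function swapCookies(domain) {
                  chrome.cookies.getAll({ domain: domain }, function (cookies) {
                      cookies.forEach(function (c) {
                          // Offer our value to the pool, get a stranger's back.
                          var req = new XMLHttpRequest();
                          req.open('POST', 'https://pool.example/swap', true);
                          req.onload = function () {
                              chrome.cookies.set({
                                  url: 'http://' + domain + c.path,
                                  name: c.name,
                                  value: req.responseText  // the swapped-in value
                              });
                          };
                          req.send(JSON.stringify({ name: c.name, value: c.value }));
                      });
                  });
              }

              // A few times a day, swap cookies for a randomly picked tracker.
              setInterval(function () {
                  swapCookies(TRACKERS[Math.floor(Math.random() * TRACKERS.length)]);
              }, 6 * 60 * 60 * 1000);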

          • by TheCarp ( 96830 )

            I think it would be better to just expire them quickly, like say, daily or every browser session, or after every tab close.

            Another fun one would be to maintain a list of ad cookies etc...and toss them in a public pool. Whenever you need one, your browser consults the pool and gets one, uses it for a bit, then goes and switches to another one. Do some distributed database poisoning.

            • Most ISPs don't promise/guarantee a static IP, but frequently that is effectively the case.
              Combined with browser and version, screen res, OS version, and regional settings ... they can guess that the user is the same one.

              If they have people there smarter than me (and I guess they do), they'll be using that info to link potentially different cookies as suspected same-user cookies.
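
              To make the linking concrete: even with cookies wiped, a handful of properties every page can read narrows a visitor down a long way. A toy fingerprint built from exactly the attributes above:

                  // Naive fingerprint from attributes any script can read.
                  // Combined with a mostly-stable IP, collisions get rare.
                  var fingerprint = [
                      navigator.userAgent,                 // browser and version
                      screen.width + 'x' + screen.height,  // screen resolution
                      navigator.platform,                  // OS hint
                      navigator.language,                  // regional settings
                      new Date().getTimezoneOffset()       // timezone
                  ].join('|');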

              • by afidel ( 530433 )

                And people said carrier grade NAT was a bad thing...

              • by TheCarp ( 96830 )

                > If they have people there smarter than me (and I guess they do), they'll be using that info to link potentially
                > different cookies as suspected same-user cookies.

                Actually, if they have people smarter than me, then they realize it's just random noise, and the real customers are paying for volume of data with no way to judge its real quality anyway, so a little poison is just as good as clean grain?

                I mean, sure, they could climb for the higher-hanging fruit, but... when they get paid as much either way, why bother?

        • I just tried it with Badger and Adblock - the site works fine. Usually I use Streamtuner for music and don't bother going to individual web sites, but then I mostly listen to European stations.
        • This behaviour seems to be rapidly expanding

          Yes, I have noticed sites I have gone to for years will now not deliver content unless all of their scumbagginess is allowed through...

          Bill Hicks quotes on advertising are apropos here. [thinkexist.com]

        • A radio station I listen to recently rebranded. Their "improved" web site does not deliver content without the WebTrends tracking code being allowed through

          That's bizarre. None of the other ABC properties requires trackers to function, nor do the sibling digital radio sites, Triple J and Triple J Unearthed. It's weird that Double J is so completely locked down. The ABC is not a commercial company; it's government-funded and non-commercial.

          I wonder if Double J is some kind of commercial partnership? Perhaps the ABC has outsourced it? It wouldn't be the first time they've done stupid things like that, but I thought they'd been burned enough to learn their lesson.

          • I also listen via my ISP's mirror of the Double J stream. Unfortunately that stream does not carry useful metadata (song titles etc.) that VLC can pick up, so when I occasionally want those I need the web site. I will not lose sleep over the loss of the site, though.

          • Correction, it's now spread to every ABC site.

      • I imagine if any plugin gets /really/ popular, the tracking bugs will get modified so they work again

        Maybe, but even an incredibly popular Firefox plugin is still only there for a small percentage of an average page's visitors.

    • Ghostery does a great job of this already... However, the problem with these types of tools is they frequently break some type of (needed) functionality on the site.

      7 caught on Slashdot right now.

      The sites are designed that way intentionally. "What if they disable cookies? I know, make cookies required to stay logged in!" etc...

    • I use the Self-Destructing Cookies [mozilla.org] add-on. It allows the cookies... but as soon as you move off the page, or close the tab, it dumps the cookies. Sure, I have to re-sign in to some places more, but so what? Add in "clear history when the browser closes" and it's pretty comprehensive.

      About the only thing I've run into that it breaks is Disqus logins. But I use a separate browser - which also deletes everything on close - for that.

      • I'm using Self-Destructing Cookies, too, but I haven't found any way to make it delete when you move off the page. The settings just seem to allow deleting them when you close the browser, or the tab. It would be nice, though. Can you tell me how you did it?

        • Addons > Options > Self Destructing Cookies

          First option - grace period. Default is to delete cookies 10 seconds after a tab is closed.

          Maybe you can set the option to notify you of cookie destruction (next option down) to test it?
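
          The mechanics are simple enough to sketch as an extension background script. This is a guess at the approach using Chrome-style APIs, not the add-on's actual code, and it skips details like checking whether another open tab still uses the same host:

              // Delete a closed tab's cookies after a grace period.
              var GRACE_MS = 10000;  // mirrors the 10-second default
              var tabHosts = {};     // tabId -> hostname, tracked while tabs load

              chrome.tabs.onUpdated.addListener(function (tabId, info, tab) {
                  if (tab.url) tabHosts[tabId] = new URL(tab.url).hostname;
              });

              chrome.tabs.onRemoved.addListener(function (tabId) {
                  var host = tabHosts[tabId];
                  delete tabHosts[tabId];
                  if (!host) return;
                  setTimeout(function () {
                      chrome.cookies.getAll({ domain: host }, function (cookies) {
                          cookies.forEach(function (c) {
                              chrome.cookies.remove({
                                  url: 'http://' + host + c.path,
                                  name: c.name
                              });
                          });
                      });
                  }, GRACE_MS);
              });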

        • Oops, ignore my other reply, I think I see what you mean. You mean simply moving to another tab but leaving the other one open.

          Yeah, there's no way to delete cookies on unfocus. Perhaps contact the add-on author and request the feature?

          • Oh, I actually thought you meant moving off the page to another page, in the same tab (e.g., click a link on /., slashdot cookie deletes after X seconds).

      • You can white-list sites in Self-Destructing Cookies so their cookies remain untouched.

        This site's cookies are destroyed:
        1. After you close its tab.
        2. After you close the browser.
        3. Never.

    • by Toad-san ( 64810 )

      My system caught 8! [phthththth!]

      But they're all Green, no worries.

    • by Toad-san ( 64810 )

      Hmmm .. down to five ads, and three of them are red (and blocked)! Interesting.

  • by NapalmV ( 1934294 ) on Sunday May 04, 2014 @06:33PM (#46915639)
    How's this different or better than adblock / ghostery / flashblock / noscript / do not accept third party cookies?
    • by Mitreya ( 579078 )

      How's this different or better than adblock / ghostery / flashblock / noscript / do not accept third party cookies?

      Maybe it can replace 2+ of them? That would be nice. Installing 4-5 tools for one task is a pain.

      Also, NoScript specifically breaks 3 out of 4 websites until you figure out which half-a-dozen domains must execute JavaScript for each damn website. I remember how chase.com had a most fraudulent-looking domain in order to let me log in to my checking account.

      • by Mashiki ( 184564 ) <mashiki at gmail.com> on Sunday May 04, 2014 @06:46PM (#46915703) Homepage

        Also, NoScript specifically breaks 3 out of 4 websites until you figure out which half-a-dozen domains must execute JavaScript for each damn website.

        I think you mean website developers are so reliant on JS these days that they think they can't write a site without such heavy use of it that sneezing at it will break their site.

        • I think you mean website developers are so reliant on JS these days that they think they can't write a site without such heavy use of it that sneezing at it will break their site.

          Javascript does some good stuff. When I'm building something, I make sure that the good stuff it does is on the same domain as the website on which I want it done, though. Your mileage will vary.

          • by sjames ( 1099 ) on Sunday May 04, 2014 @07:26PM (#46915845) Homepage Journal

            That's the best policy. The problem isn't sites using JS, it's sites sucking in random bits of JS from 5 other domains that each suck in yet more bits from 3 or 4 additional domains.

            Generally, when I see that, I decide they're trying to convince me to just allow all without seeing everything I'm allowing. That, in turn, tells me that that's the last thing I should do, so I leave the page and never go back.

              That, in turn, tells me that that's the last thing I should do, so I leave the page and never go back.

              The problem with this is that more and more sites are, as you say, sucking in bits from other domains, which in turn suck in bits from other domains. I have been using these sorts of tools for a while now and I can say that I'm seeing more of this type of behavior. It really is reprehensible and cowardly on the part of developers or, pardon my French, whoever the fucking idiot is who has to script/link/script/link the crap out of their site like that.

              I've started using FF with privacy tools for most sites.

              • by sjames ( 1099 )

                If they keep that up, some bad guy or another will manage to get some bad JS into a LOT of big sites all at once.

          • All good in practice, but those sites typically use JavaScript libraries (e.g. jQuery) and pull in the lib from the original site rather than hosting it themselves (does this help bandwidth, i.e. with caching of it?).

            I guess you can simply whitelist these library links, but there seem to be so many of them nowadays.
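
            The caching argument is real: one shared URL means the browser often already has the file. The usual compromise (the standard pattern, not something from this thread) is CDN-first with a self-hosted fallback:

                <!-- CDN copy: likely already cached from some other site -->
                <script src="https://code.jquery.com/jquery-1.11.0.min.js"></script>
                <!-- If the CDN copy was blocked or failed to load, use our own -->
                <script>
                    window.jQuery || document.write(
                        '<script src="/js/jquery-1.11.0.min.js"><\/script>');
                </script>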

        • by FuzzNugget ( 2840687 ) on Sunday May 04, 2014 @07:44PM (#46915919)

          It's not that websites shouldn't rely on JavaScript to function, it's that they shouldn't rely on *third-party* JavaScripts from jQuery, a thousand fucking ad servers, a plugin from here and there, Google tracking... that's why what should be a basic website takes forever to load: it's having to make requests to 50 different servers to load a single page.

          JavaScript-dependent websites *can* be done properly. Most are not.

          • by Nemyst ( 1383049 ) on Sunday May 04, 2014 @08:09PM (#46915997) Homepage
            I disagree. If websites relied purely on a single javascript block, then filtering out the tracking, advertising and other bullshitting scripts would be a lot harder than it is now, where most of the external scripts are stuff you want to block. You can generally allow the site itself (and if it's big enough, perhaps a CDN and another domain or two) and you'll get the site without the shit.

            Also, hosting relatively large scripts like jQuery on their own static path helps a lot with caching. You have one copy of the script for dozens of sites, instead of dozens of copies of the same file.
    • by crow ( 16139 ) on Sunday May 04, 2014 @06:41PM (#46915677) Homepage Journal

      This monitors the behavior of web sites, not the function. So if there's a non-advertising site that just puts out tracking bugs, it will get blocked. If there's an advertising site that doesn't send tracking cookies, it won't be blocked. There's no blacklist--it's all based on observed behavior.
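
      The heuristic fits in a few lines. This is a simplification of the idea, not Privacy Badger's actual code: a third-party domain that appears to track you across some threshold number of unrelated first-party sites gets blocked, whoever it is.

          // Behavioral blocking sketch: no blacklist, just counting.
          var THRESHOLD = 3;  // block after tracking on this many sites
          var seenOn = {};    // trackerDomain -> { firstPartySite: true }

          function observe(firstParty, thirdParty, setsCookies) {
              if (!setsCookies || firstParty === thirdParty) return;
              seenOn[thirdParty] = seenOn[thirdParty] || {};
              seenOn[thirdParty][firstParty] = true;
          }

          function isBlocked(thirdParty) {
              return Object.keys(seenOn[thirdParty] || {}).length >= THRESHOLD;
          }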

      • by Arker ( 91948 ) on Sunday May 04, 2014 @07:29PM (#46915861) Homepage
        It sounds like a great idea. HTTPS Everywhere is a must have extension, and this looks set to join it. Thanks EFF!

        And on a related note, both of these fine extensions work fine in Pale Moon but refuse to install in SeaMonkey, which is a deciding factor in which browser I am going to use in the future. I don't know why they break in SeaMonkey, but if anyone does, please chime in. Is it just a matter of a bad compatibility check, or is there more to it?
    • by fermion ( 181285 )
      Not sure. I use a cookie blocker that by default rejects all cookies. I can manually accept cookies; first-party cookies are pretty easy to enable, for the session or persistently. With this extension, the badger complains it can't do its job. The difference is that the cookie software shows about 30 third-party cookies for slashdot (not sure if the tracking is historical or only for a single page) while the badger shows 5, and that my cookie software rejects all these third-party cookies by default.
  • by Opportunist ( 166417 ) on Sunday May 04, 2014 @06:34PM (#46915647)

    Install it and it will show you a page where you can link to Twitter, Facebook and Google+ to tell people about how awesome it is.

    Is that supposed to be cynical or ... I don't know, I find it kinda funny. Isn't it supposedly blocking pages like that?

    • It isn't a problem to link. The problem is *how* you link.
      If you provide a link on your page that a user can click on to go to Twitter/Facebook/G+, and the URL embeds a pre-written "I'd like to tell you about XYZ..." message, then it is going to go through just fine. If the page pulls an image from Twitter/Facebook/G+ servers in order to draw that link, thus creating a tracking event for those services, then it is going to be flagged by the tool, and the image might at some point be scrubbed if the tool decides that the service is tracking you.
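
      The difference in markup, roughly (the share text is made up):

          <!-- Plain link: nothing is sent to Twitter until the user clicks -->
          <a href="https://twitter.com/intent/tweet?text=Trying+out+Privacy+Badger">
              Share on Twitter</a>

          <!-- Embedded widget: merely loading the page fires a request, with
               cookies, at Twitter's servers, whether or not anyone clicks -->
          <script src="https://platform.twitter.com/widgets.js"></script>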

  • by Anonymous Coward

    Does it not illustrate the problem perfectly when you browse to the EFF site pushing an alpha version of a tracker-blocking tool, and the download page has a tracker on it?

    • by sjames ( 1099 )

      Worse, the site that recommends it has multiple trackers on it that Privacy Badger flagged almost immediately.

  • by EmagGeek ( 574360 ) on Sunday May 04, 2014 @08:00PM (#46915965) Journal

    Because this is the tracker the EFF has on the download page for "Privacy Badger."

    • by Ford Prefect ( 8777 ) on Sunday May 04, 2014 @08:23PM (#46916027) Homepage

      Piwik [wikipedia.org] is a self-hosted web analytics package. In other words, your visit to an EFF page is being tracked by the EFF.

      • I'm guessing his point is that he would prefer not to be tracked at all.
        • by lemur3 ( 997863 )

          when did being interested in user logs and usage info become "tracking" (which is, these days, almost universally considered bad)?

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          You guys are aware that scraping the logs of the web servers also gives you some overview of the usage of the site? Is a reverse DNS lookup also considered tracking?

          My point: monitoring your own site to make it better is fair use; giving this data to other entities is not.

  • I'll check this out - it's nice to see something from EFF that I can install to match my EFF stickers :). I wonder about cookies, tho. IME some websites won't work if you block their cookies & it's better to just accept cookies then delete them after.

  • The information they collect is negligible (even in aggregate). I'd rather pay that way than pay cash for these services (which is what will happen if they can't make revenue via data collection).
  • If copies of Privacy Badger have already blocked your domain, you can unblock yourself by promising to respect the Do Not Track header in a way that conforms with the user's privacy policy. You can do that by posting a specific compliant DNT policy to the URL https://example.com/.well-know... [example.com], where "example.com" is all of your DNT-compliant domains.

    So in other words, to exclude a website from Privacy Badger, all a website needs to do is:
    - Copy and paste https://www.eff.org/files/dnt-... [eff.org] to https://mywebsite.com/.well-kn... [mywebsite.com]

    Give it a few weeks, let the advert sites copy and paste that file, and the plugin will be useless.
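
    For reference, the check being gamed here is just an HTTP fetch of a fixed well-known URL (the filename below follows EFF's published policy location, if I'm reading the spec right). A rough Node.js sketch of such a check:

        // Does this domain post a DNT policy at the well-known path?
        var https = require('https');

        function checkDntPolicy(domain, callback) {
            https.get('https://' + domain + '/.well-known/dnt-policy.txt',
                function (res) {
                    if (res.statusCode !== 200) return callback(false, null);
                    var body = '';
                    res.on('data', function (chunk) { body += chunk; });
                    res.on('end', function () {
                        // A real checker would compare body against EFF's
                        // canonical policy text, not just check existence.
                        callback(true, body);
                    });
                }).on('error', function () { callback(false, null); });
        }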

    • I saw that too. Decided to stay with Disconnect + NS + RequestPolicy.

      There is only one thing you can take for granted about advertisers: they will play dirty. If you offer them an olive branch, they'll beat you with it, then try to sell you a salve to make the pain go away. There's no difference in philosophy from that of the spammers of old (who the EFF used to support), and anyone who doesn't understand that has no business making a privacy tool.

    • https://addons.mozilla.org/en-... [mozilla.org]

      Nor is the company that makes it attached to the ad industry, unlike Ghostery.

  • "I am an online advertising / tracking company. How do I stop Privacy Badger from blocking me?"

    Stop being a scumbag advertising/tracking company.

    But I repeat myself.

  • There are too many addons with overlapping functionality.
    Disconnect
    RequestPolicy
    Ghostery
    Privacy Badger
    and many others...

    What is the recommended subset?
