
Web Bug Detector

(H)elix1 writes: "I'm sure /. is about to be hit with this, but CNET just released a story about a web bug detector plug-in for IE called Bugnosis by the Privacy Foundation. An interesting toy, but the thing that grabbed my attention was the Web Bug Gallery. It would seem our beloved slashdot has them as well. Course, so did CNET, but that is a different story...." I think improved cookie-handling is much more useful in preventing tracking, but this is interesting because it provides visible feedback about tracking efforts.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Heck, Netscape 4.75 has this, and as far as I recall, earlier versions did too. Lynx, of course, is still my first choice for browsing. Too bad the fscking text entry box is so lame.
  • by Anonymous Coward
    I wish people would realize that web bugs and cookies ARE NOT THE SAME THING. A bug is a spying device. It does not rely on cookies. It does not rely on images (although they are commonly used.) It does not rely on 1x1 pixel images (which in most cases are NOT used - hell, every banner ad from every banner ad company is a web bug!). JavaScript code, images, frames, shockwave can all be bugs. Why? Because one can create a page (say on server X), with references to ANOTHER server (say Y) containing the objects. So when a person visits the page on server X, server Y gets to know about it.

    Cookies definitely can exacerbate the problem by providing additional information. But bugs are not reliant on cookies. You can block all cookies and block all images and you will not block all web bugs. The reason advertising companies like to use cookies is that you can track additional information easily, because the browser obligingly stores the data and spits it back on demand, even after you shut the browser down and start it back up, often hours, days or weeks later.

    For reference, check the Web Bug Report [] quote in the CNET article [] and you'll notice that the report shows the types of bugs (imgs, iframes, etc.) that are present. A very large # of them are not images...
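The cross-server mechanism described above (a page on server X embedding objects hosted on server Y, so Y's logs record every visit to X) can be sketched with a toy detector. The hostnames and HTML here are invented for illustration; this is not Bugnosis itself:

```python
# A minimal sketch of how a "web bug" works and how one might spot it:
# any src attribute pointing at a host other than the page's own host
# causes the browser to announce the visit to that third party.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collect src URLs whose host differs from the page's host."""
    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.third_party = []

    def handle_starttag(self, tag, attrs):
        # Images, frames, scripts, and plugin objects can all be bugs.
        if tag not in ("img", "iframe", "script", "embed"):
            return
        for name, value in attrs:
            if name == "src" and value:
                host = urlparse(value).hostname
                if host and host != self.page_host:
                    self.third_party.append((tag, value))

# A page served by server X, referencing an object on server Y:
html = ('<html><body><p>hi</p>'
        '<img src="http://y.example/t.gif" width="1" height="1">'
        '<img src="/local/logo.gif"></body></html>')
finder = ThirdPartyFinder("x.example")
finder.feed(html)
print(finder.third_party)  # the cross-server reference Y learns about
```

Note that the local image is not flagged: only the request to the other server leaks the visit.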

  • by Anonymous Coward
    So a web site includes an inline image loaded from another site. And the graphic is so small you might not notice it. Whoop de do. People have been doing essentially the same thing with web counters for years. Now it's the /. scandal of the day?
  • by Anonymous Coward on Friday June 08, 2001 @05:08AM (#166598)
    This is a common misconception; the reality, however, is much more disturbing. The little blinky dot you humans call webbugs are actually tiny miniature CIA cameras implanted in your screens to take pictures of you surfing Slashdot naked. Us CIA guys only admitted to using DNABots when they were already obsolete, much like the obsolete Echelon system, which has been replaced by people using Windows XP. We find it's much easier to allow the citizens to administer their own surveillance device. Saves us mucho manpower.

    Therefore, buy XP and save the government valuable surveillance budget dollars.

    Agent Bitterman, Superspy
    President Chief Head Director of the Leadership Branch of the Executive Level of the CIA
  • by Wakko Warner ( 324 ) on Friday June 08, 2001 @04:31AM (#166599) Homepage Journal
    ...slashdot used to berate sites that used web bugs, but it looks like they have them too now...

    - A.P.

    Forget Napster. Why not really break the law?

  • Shameless plug: BannerFilter [] is a plug-in for Squid that filters ad banners and popups. It doesn't specifically target web bugs, but if you submit URLs, they'll be added.


  • One thing Mozilla really needs as far as cookie selection goes, is a distinction between session and stored cookies, which doesn't exist right now. I gladly accept session cookies, they're vaguely helpful, but most tracking and such is done with stored cookies, so I only allow stored cookies for select sites, like /. login.
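The session/stored distinction asked for above is visible in the cookie itself: a Set-Cookie header with no expiry is a session cookie (discarded when the browser exits), while one with an expires or max-age attribute is written to disk and survives restarts. A small sketch of telling them apart (header strings are made up):

```python
# Classify a Set-Cookie header as session vs. persistent (stored).
# Session cookies carry no expiry; persistent ones do, which is what
# makes them usable for long-term tracking.
from http.cookies import SimpleCookie

def is_session_cookie(set_cookie_header):
    c = SimpleCookie()
    c.load(set_cookie_header)
    morsel = next(iter(c.values()))
    # Morsel attributes default to "" when absent.
    return not morsel["expires"] and not morsel["max-age"]

print(is_session_cookie("sid=abc123; Path=/"))  # True: session only
print(is_session_cookie(
    "uid=xyz; expires=Fri, 08-Jun-2091 00:00:00 GMT; Path=/"))  # False
```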


  • A is True, and B is True, but A does not imply B.

    Why 'of course'? What benefit is it to VA that they know I read Freshmeat, Slashdot and Sourceforge but not QuestionExchange (mainly because of their sub-literate banner ad)? I've never noticed a difference in advertising content across the sites...
    the telephone rings / problem between screen and chair / thoughts of homicide
  • The third party cookie distinction is back in IE6 again (at least it is in the latest WinXP beta, and they share the same codebase). You can say Accept, Block, or Prompt separately for first and third party cookies, as well as accepting or rejecting all session (not persistent) cookies.

    So you can accept all first party cookies, and be prompted about any third party ones.

    - Steve
  • by SteveX ( 5640 ) on Friday June 08, 2001 @05:25AM (#166607) Homepage
    It's back in the current 6 betas.
  • I think we need a new moderation choice: 'Didn't get the joke'

  • by gelfling ( 6534 ) on Friday June 08, 2001 @07:00AM (#166609) Homepage Journal
    The ActiveX controls are required only for the somewhat unusual download and installation, and can then be disabled, according to the author.

    You only have to enable ActiveX control downloading in order to install
    Bugnosis -- you can disable it after installation. That makes it really no
    different than downloading an .exe from us. The Bugnosis control that we
    download isn't scriptable, so other Web sites and email users will find it
    harder to abuse.


    Prof. David Martin
    University of Denver Math/CS

  • The installation requires ActiveX controls = on. So that makes the cure worse than the disease. I'll trade some privacy for not opening up my machine to remote-execution ActiveX shit.
  • Oh course, Lynx doesn't normally load images anyway, so it's reasonably immune to web bugs...
  • by AftanGustur ( 7715 ) on Friday June 08, 2001 @05:52AM (#166613) Homepage

    As /. logs which moderators spend points on which comments, Slashdot now has the IP address of the CrackSmoking dude who found this 'Informative'.

    echo '[q]sa[ln0=aln80~Psnlbx]16isb15CB32EF3AF9C0E5D7272 C3AF4F2snlbxq'|dc

  • <plug type=blatant>

    FilterProxy [] can remove web bugs by stripping them straight out of the HTML. Oh, and it removes ads too.



  • It's a pretty cool tool.

    Just one annoying thing:

    Every time it finds a web bug (definite web bug), it brings up the report. Makes reading /. annoying (since every page is bugged).

  • Okay, you can resize it to like a single line (or 1 pixel) at the bottom of the page if you want, so it's not that annoying.

    If you do the one pixel high thing, just watch the toolbar in IE5 for when the bug turns red if you want to know if you're being bugged...
  • Err, just right click and turn off pop-up if you don't like it, nevermind.
  • Yep, not really a bug unless it pulls a cookie from you -- it'd be nice to have a checkbox for this in the options.
  • Just right-click the report window and disable "popup when webbug found".
  • No identifying information, EXCEPT:


    that's pretty identifying if you're on a dedicated connection, i.e. surfing from work.
  • Humorous. So they'd prefer to use the date that we have on our own computers, which can be way, way off, rather than using their own more accurate datestamp?

    That tells them what timezone you're probably in....
    Then they could build demographics profiles -- for example, people who are on at 3 AM in the USA are probably students or security guards or something.

  • by Barbarian ( 9467 ) on Friday June 08, 2001 @04:46AM (#166622)
    A beta of IE5 between 5.01 and 5.5 had the same feature, "Accept third-party cookies" Always/Prompt/Never, but they took it out in 5.5
  • by Col. Klink (retired) ( 11632 ) on Friday June 08, 2001 @04:46AM (#166625)
    Of course /. uses web bugs. They still use GIFs, too. This is a "do what we say" website, not a "do what we do" one.
  • The real humor is that some moderators didn't recognize this as a well-known non sequitur and marked it "Informative". Next time you may have to actually include the smiley to help out some of our "special" moderators...

    Caution: contents may be quarrelsome and meticulous!

  • Actually that about the cookies isn't right. Looking at the OSDN image on Slashdot's page, OSDN can't pick up any cookies from it. Not unless the browser is failing to apply the same-domain rule, that is. You can do some things with Javascript to put cookie information into the image request query string, but the OSDN code doesn't do that.

  • Pull up the Tasks menu, Security and Privacy, Cookie Manager, and hit the Cookie Sites tab. Find the sites you want to allow cookies for and remove them from the list of blocked sites.

  • Well, yes. But then again, Slashdot could add a module to the Web server that logged their cookie info along with the hit data and timestamps into a file and e-mail that file to anyone they felt like, too. Some shell scripting and a cron job and it'd be completely automatic. That's not Web bugs leaking the information to a third party, that's the main site deliberately giving that information to a third party. I may have concerns about the main site doing that, but Web bugs don't add anything to that concern IMHO seeing as the conduit exists without Web bugs.

    And yes, I have thought about that sort of Apache ( and possibly IIS ) module. It's got applications for legitimate site statistics, not just unethical tracking.

  • Only point, though: if a site's coding custom Javascript to transfer their cookies to a third-party site, they're planning on synchronizing information in advance already. That or the ad site's handing them cut-and-paste code to use and they aren't checking it, and that can be seen in the HTML source. Pulled-in scripts where the JS in question doesn't appear in the page source won't work, because browsers enforce cookie-domain rules based on the source of the script, not the page pulling it in.

    As for difficulty in synchronizing, think about the trio of timestamp, source IP address and referring URL. Off the cuff estimate, I could probably get 95% accuracy from those on any given hit, and over the course of a few hits I'd get effectively 100% accuracy for any given surfer. All automatic once it's created, no effort needed on the part of the operators once the software's installed.

    As for what you call Web bugs being only for the info transfer, that depends on what sort of info transfer you're talking about. I can tell you right now that the OSDN 'bug' on Slashdot's pages doesn't do what you're suggesting, so right off there's a counterexample to your assertion. Ditto at least Hitbox's stuff. The only problem is that the illegitimate tracking ones and the legitimate statistics ones look almost identical in the code, until you start really digging into the JavaScript (if any) and the back-end systems. That's a job that's usually too complicated for a simple plug-in.

    As for HTML e-mail bugs, that assumes that a) the user's using a mail reader that renders HTML and b) that mail reader's dumb enough to pull in content not contained in the e-mail message. If your mail reader's a Web browser, then you're obviously open to all the exploits that can be applied to a Web browser. That's why I don't use a Web browser to read mail. :)
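The (timestamp, source IP, referring URL) triple described above can be sketched as a simple log join. The log format and values here are hypothetical, and the slack window assumes clocks synced via NTP as noted:

```python
# Correlate a main site's hit log against a third-party server's log
# using (time, IP, URL/referrer) -- no cookies needed. Entries are
# (unix_time, client_ip, url) tuples in an invented format.
site_log = [
    (992003991, "10.0.0.5", "/article/42"),
    (992003993, "10.0.0.5", "/article/43"),
]
bug_log = [
    (992003991, "10.0.0.5", "/article/42"),
    (992004100, "10.0.0.9", "/article/99"),
]

def correlate(site_log, bug_log, slack=2):
    """Pair entries whose IP and referrer match and whose timestamps
    fall within `slack` seconds of each other."""
    matches = []
    for t1, ip1, url1 in site_log:
        for t2, ip2, ref2 in bug_log:
            if ip1 == ip2 and url1 == ref2 and abs(t1 - t2) <= slack:
                matches.append((ip1, url1))
    return matches

print(correlate(site_log, bug_log))  # [('10.0.0.5', '/article/42')]
```

As the comment says, ambiguity only arises when two clients behind the same NAT hit the same URL within the slack window.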

  • NAT makes the IP address ambiguous, yes. That's why I specified that triple instead. To make the triple ambiguous you need to have two people behind the same NAT box hit the same URL within a second or so of each other. That is a lot less likely, and if they hit different URLs then I can match the referrer in my logs against the URL in their logs and disambiguate the sources. Ditto if they hit it at different times. As far as time synchronization, see NTP. Time sync within a few hundred milliseconds isn't hard at all.

    As for what the bug's there for, that's the whole point. "Page X on your site was viewed N times by M unique people" is a perfectly legitimate Web-site statistic. So is "Q people followed this path through your site and abandoned it at page B." In fact a lot of sites could use being smacked over the head with that latter statistic, to prove to the salescritters that huge Flash delays, overly-busy and confusing index pages and disruptive interstitial advertisements do indeed make people go elsewhere. Then one comes to Doubleclick and such, who use the same methods to record things like "Person Z browsed these pages on these sites today." That's getting way past the bounds of acceptable, but it's being done by the same technique.

    Just calling it "info transfer" and then saying that all info transfer is bad because some of it's bad is missing the point. The problem isn't that information is being transferred, it's what information is transferred and what's done with it. Dropping the OSDN image, where no personal information or tracking data can be transferred through it because of the way it's coded, into the same category as Doubleclick's bugs, which do transfer a tracking cookie back to a company that's said flat-out that tracking personal habits is their business, is at best disingenuous.

    As for why the images are small and transparent, let me ask this: if the only purpose of the image is in fact to collect legitimate site statistics, what purpose does it serve to have it taking up more real-estate on the page and more bandwidth on the network than it absolutely has to to do its job? Which leads right back to the same problem, that the logical, minimally-disruptive way of doing something legitimate is on the surface identical to what you'd do if you were trying to conceal evil intent. For myself I tend to be quick to block things I don't know, but it annoys me that I have to block or interfere with legitimate things in order to keep out the slime. I'd much rather LART the abusers off the net.

  • by Todd Knarr ( 15451 ) on Friday June 08, 2001 @07:56AM (#166637) Homepage

    There's also another point. All those Web bugs look identical from an HTML/HTTP point of view, but they're radically different from a data-collection point of view. Hitbox, for example, uses those bugs solely for site statistics. They can tell when two hits were from the same person and can tell a site things like how many people followed a given path through it, but they've no idea who a given person is and don't store any information on which paths a particular person followed in the database the sites access.

    Disclaimer: I only program the systems for Hitbox/WebSideStory. I don't represent them or their opinions, they pay the executives to do that.

  • There are various recommendations scattered throughout this discussion for webwasher, adsoff, etc. It's hard to find 'em all.

    Reply to this message with the product name in your subject line and put a link in the body if you've got one.

    Persons wishing to add information about specific products can then reply to those messages. --Charlie

  • KookaBurra Software [] sells a product called "Cookie Pal" that allows you to filter cookies and responses to cookies in real time. Extremely configurable, shareware, inexpensive, works on MSWindows operating systems.
    It can work with Netscape and Explorer simultaneously. I've been using it on my windows boxen for years quite happily.

  • I parsed "Web Bug Gallery. It would seem our beloved slashdot has them as well." and thought, well, duh, that's common knowledge. Then I read the article, and realized that they meant the spy-type of bug, rather than the slopped-together-code-type.

  • Netscape couldn't overtake a browser that came with the OS, why do you think Mozilla will?

    Besides, IE5.x has had the same functionality. And, power users can get Guidescope ( or Junkbuster if they want to manage their cookies effectively.

  • by Quarters ( 18322 ) on Friday June 08, 2001 @06:51AM (#166644)
    You forgot raging about the MPAA, asking us to boycott movies, and then providing us with useless Katz reviews of movies *every* week.

  • CookiePal does just this, although it's a Windows only application.

    I can deny all cookies from a domain, accept all cookies for a domain or view the cookie and decide if I want to accept it. I can see all the cookies that are set and delete them also.

  • In the words of the kind Mr. Dobbs, "I don't practice what I preach, 'cause I'm not the kind of person I'm preaching to!"
  • "Every time someone asks when I will release slash to the public, I extend the date by one day."


    It's just a registry dump from my computer from this morning. I really need to automate it.

    Anyway, that's my list. Would love to compare.
  • by scotpurl ( 28825 ) on Friday June 08, 2001 @04:47AM (#166649)
    In the realm of cosmic irony, I installed the web bug tracker, then went into this full article, and promptly got the OSDN web bug.

    If you're among the folks like me that have to use IE, use that Restricted Sites setting under the security tab (and while you're in there, crank that restricted zone up to disallow derned near everything). Also set your browser to warn you when you get cookies. Add entire sites that want to set cookies to your restricted zone. None of the muss and fuss of an ad filter (which breaks everything when I have to VPN to the office).

    For the first couple of weeks, you'll be adding a few sites per week. I also added to mine the list someone posted of the sites that track users the most. I don't get cookies now, unless I'm actually shopping online. :-) If someone wants a copy of the list, I could find a home for it.
  • by blowdart ( 31458 ) on Friday June 08, 2001 @04:37AM (#166654) Homepage

    It uses a table, so the formatting on this will be way off

    Bugnosis analysis of: Articles: Web Bug Detector (230&op=Reply&threshold=-1&commentsort=0&mode=nested&pid=18)

    Highlighted images may be Web bugs.

    Properties: Tiny, Once, Domain, TPCookie

    Contact: (anon=anon_id&-1-vGtvAizyjA&boxex&%27whatsnew%27%2C%27slashdot-main%27%2C%27freshmeat-main%27%2C%27newsforge-newsvac%27%2C%27sourceforge-news%27%2C%27linux-news%27%2C%27open-mag%27%2C%27questionexchange-top10%27%2C%27themes-new%27%2C%27thinkgeek-new%27&exboxes&%27whatsnew%27%2C%27slashdot-main%27%2C%27freshmeat-main%27%2C%27newsforge-newsvac%27%2C%27sourceforge-news%27%2C%27linux-news%27%2C%27open-mag%27%2C%27questionexchange-top10%27%2C%27themes-new%27%2C%27thinkgeek-new%27)

    Image URL: http://sd-,992003991337

    Property name: Description

    Tiny: image is tiny, so is probably not meant to be seen

    Protocols: image URL contains more than one Web protocol name (e.g., "http:" twice)

    Cookie: image URL overlaps with the cookie field too much

    Lengthy: image URL is unusually long

    Domain: image comes from a different domain than the main document

    Once: image is used only once in the document

    TPCookie: image comes from a different domain than the document and manipulates a cookie (Third Party Cookie)

    Recognized: compares the URL against a set of recognized Web sites

  • by Russ Nelson ( 33911 ) <> on Friday June 08, 2001 @05:26AM (#166658) Homepage
    Of course Slashdot contains an OSDN webbug. Slashdot is owned by OSDN. Some people gotta turn their paranoia control WAY down, otherwise they're gonna start seeing black helicopters soon.
  • by umeshunni ( 37684 ) <<umeshunni> <at> <>> on Friday June 08, 2001 @04:30AM (#166659) Homepage Journal
    My Netscape browser can detect any web bug! It prints "Bus error (core dumped)" every time it sees one!
  • by Grendel Drago ( 41496 ) on Friday June 08, 2001 @04:56AM (#166660) Homepage
    From :

    now = new Date();
    tail = now.getTime();
    document.write("<IMG SRC=' ex,");
    document.write("' WIDTH=1 HEIGHT=1 BORDER=0><BR>");
    <IMG SRC=" ex,992004976" WIDTH=1 HEIGHT=1 BORDER=0><BR>

    Yep, there they are. Web bugs if I've ever seen 'em...

    -grendel drago
  • by wiredog ( 43288 ) on Friday June 08, 2001 @04:38AM (#166662) Journal
    Three from our friends at k5 [].

    Oh My God! Rusty's tracking me! That Low-Life Capitalist Corporate Big Business Pig! What do he and Inoshiro want with me! Why can't you guys leave me alone!!!!

  • The ability to blank out ads by size sounded interesting...until he mentioned that the image is still downloaded. If it's still downloaded, it still registers on the server and it probably still has a cookie attached. I think I'll stick with Squid and ad-blocking Perl [].
  • Does [WebWasher] block them based on the tag attributes, or does it go ahead and load the image headers?

    It parses the HTML returned by a site and removes tags that would load banner ads and web bugs (among other things). If the size attributes are in the IMG tag, I'd assume it uses those. If those attributes aren't included, it would need to download the image and check its size before deciding if it should include the tag.
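The size-based filtering described above can be sketched roughly as follows. WebWasher's actual implementation isn't public, so the banner-size list and regex approach here are only an illustration of the idea:

```python
# Strip <img> tags whose declared width/height match known banner
# dimensions or the 1x1 "web bug" size. Tags without size attributes
# are left alone (deciding on those would require fetching the image).
import re

BLOCKED_SIZES = {(468, 60), (728, 90), (1, 1)}  # common banners + bug size

IMG_RE = re.compile(r'<img\b[^>]*>', re.IGNORECASE)

def declared_size(img_tag):
    w = re.search(r'width\s*=\s*["\']?(\d+)', img_tag, re.IGNORECASE)
    h = re.search(r'height\s*=\s*["\']?(\d+)', img_tag, re.IGNORECASE)
    if w and h:
        return int(w.group(1)), int(h.group(1))
    return None  # no declared size: can't decide from the tag alone

def strip_banners(html):
    return IMG_RE.sub(
        lambda m: '' if declared_size(m.group(0)) in BLOCKED_SIZES
        else m.group(0),
        html)

html = ('<img src="ad.gif" width="468" height="60">'
        '<img src="logo.gif" width="100" height="40">')
print(strip_banners(html))  # only the 100x40 logo survives
```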

  • What's the difference between these so-called 'web bugs' and 'cookies'?

    Hell, if you link to an image off-site, someone can get your IP address, etc. [With a little bit of javascript and a redirect, you can get a whole crapload of information about the person that you're not supposed to]

    Personally, I refuse to download any software, not only because it's for IE, but because then the people I'm downloading from would know my IP address. [Can someone please tell me how people are supposed to send you content if you don't give them an IP address?]
  • Ahh.... no wonder...

    In that case, as it's mostly banner-ad-sized images and 1x1s, iCab strips them out, along with known banner ad sites, images located in /ads/ directories, etc.

    [And it only took a few mouse clicks to turn on the filtering settings]
  • > Junkbusters is your friend.

    Another Junkbuster plug here.

    Everyone who shows up at my cubicle at work marvels at how "fast" my web surfing is.

    It's amazing what a difference it makes on some sites when you're downloading 3K of text content, 20K of surrounding Javshit (which I've disabled), and about 20K of site graphics, but at least I can skip the 60K of banner ads.

    (Most of the time, I surf with images off and skip the site graphics too :)

  • > You only have to enable ActiveX control downloading in order to install Bugnosis -- you can disable it after installation. That makes it really no different than downloading an .exe from us.

    So why won't they just let me download the .EXE and run it at my leisure?

  • Anyone ever notice how Netscape has a feature in Edit/Preferences that says "Only accept cookies sent back to original server"? Well, use it. Personally I use Junkbuster with about 3 sites allowed to send me cookies. The only problem I get with this is that when I visit Slashdot I'm never truly logged in until I post, since no info is sent back up until I go to post.

    There was a method about a year ago, if I'm not mistaken between August and November, involving an email trick or service to track whether someone read your email. Marketing companies are all run by Dr. Evil anyway, so there isn't much you can do. You complain, they remove X service and replace it with something more evil.
  • To all of you who are off writing your panicky responses about evil cookies coming to get you, why don't you use a sane cookie filtering system like Junkbuster []?

    Don't like having DENY ALL/ALLOW EXPLICIT control? Or read-only cookies for certain sites? Then keep to your naked browsers with JavaScript and other things turned on, and don't complain!

    Plus you get the added benefit of no ads.

  • You could author a simple script to do that. The problem is that some cookies you probably want to live. For example, I want my NYTimes cookie to live, so I don't have to log in all of the time. Same with my Slashdot cookie. I don't care if the NYTimes tracks my demo-data: he logs in, he views the front page, he views the tech page, he views the business page, he views the science page, he never clicks on an ad. However, I don't want some pr0n site tracking my movements, nor some crappy software company that's going to correlate me with an email address that I registered to buy something with.
  • by selectspec ( 74651 ) on Friday June 08, 2001 @04:37AM (#166678)
    If I were designing a browser, I would have a cookie monitoring window, which would log cookie activity. One could author filtration scripts to block certain domains from cookie access, manually delete cookies from the monitor window, etc.
  • you'd rather download some unknown exe and run it on your system to install the software? that is more secure and better for privacy how?

    With ActiveX enabled, any website visited using IE can ask to run or install software by popping up a single dialog. It would not be difficult for a malicious site to see to it that the dialog pops up just as you're typing 'y' on the keyboard or just as you're about to click where the "yes" button will appear. By disabling ActiveX and only installing software manually (downloading a .exe installer and running it), you give sites you visit one less way to break into your system/account.
  • by oldstrat ( 87076 ) on Friday June 08, 2001 @04:43AM (#166684) Journal
    The author of the CNET article should have taken one more step in research... and the author of the Slashdot article should have verified.
    Contained a bug from the Open Source Development Network (

    SLASHDOT is part of the OSDN pages by VA Linux.
    It's not a 'bug'.

    Bugnosis isn't smart enough to tell the difference between a real bug and a simple page counter, and probably can't be. We should really worry about much more important things and stop feeding paranoia.
  • by cs668 ( 89484 ) <> on Friday June 08, 2001 @05:13AM (#166687)
    Cookies are simply a way of adding state to a stateless protocol. For the most common example, your Slashdot username could be remembered automatically the next time you return.

    Most good browsers will let you set them to only receive cookies from the host you are connecting to. And cookies should only get sent back to the host that they came from.

    These "web bugs" allow a site to send information to a third party (e.g. an advertiser, a government agency, ...) by causing another HTTP request to be made. This request, although it is for an invisible image, can have parameters. These parameters could send all of the info that one site has collected about you to another. That third-party site could then also send a cookie for its own use to your system.

    I hope this makes sense, I am not quite awake.
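The parameter-carrying request described in the comment above can be sketched in a few lines. The host name and the data fields here are made up; the point is just that fetching the image hands the query string to the third party:

```python
# Build a 1x1 "invisible image" tag whose URL carries collected data
# as query parameters, leaking it to a third-party host.
from urllib.parse import urlencode

def bug_img_tag(third_party_host, collected):
    """Return an img tag that sends `collected` to the third party."""
    qs = urlencode(collected)
    return (f'<img src="http://{third_party_host}/b.gif?{qs}" '
            f'width="1" height="1" border="0">')

tag = bug_img_tag("tracker.example",
                  {"user": "anon_id42", "page": "/article/42"})
print(tag)
```

The third party's access log now records the user and page; its HTTP response to that image request can also set a cookie of its own, as the comment notes.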
  • Junkbusters is your friend. Tested it against the Washington Post example page. Without the Junkbusters proxy, four "bugs" found. With the Junkbusters proxy, zero "bugs" and fewer ads. You may need to spend some time getting your configuration the way you want it. There is an RPM package with some "improvements" and workable block/cookie files. Microsoft Windows users will have to create their own config files.
  • Right on.

    I'm thinking that the reverse approach might be helpful here.

    That is, instead of filtering to remove webbugs, they should be culled out carefully and rebroadcast to some zombies that will keep those nosy sites more than tickled with a flood of requests.

  • That would mean:

    a) Michael would actually have to do some investigating
    b) he would have to use IE.

    Two things that the Slashdot crew will never do.

  • by edibleplastic ( 98111 ) on Friday June 08, 2001 @05:04AM (#166692)
    Every time something happens with Napster or the MPAA, someone on Slashdot says "well stop sitting there talking about it on Slashdot and actually *do* something! Go boycott them or donate to the EFF" blah blah blah. So maybe instead of just talking about privacy issues or the tyranny of gif patents, Slashdot could actually get off its duff and do it. I know how much time it takes to convert a whole website, but it's something that could be done incrementally.
  • See my sig
  • Looks like it could be really cool if the weeny writing it would port it, or allow it to be ported. From the looks of things he's one of those rabid Mac users (there is no other system) :)

    "One World, one Web, one Program" - Microsoft promotional ad

  • Actually, I've been restricting sites in IE (at work) in this manner for some time.

    Windows stores these restricted sites in a location in the registry, here's an example:

    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\]

    I made a big list of these using one of those websites that list tracking networks and a short Perl script, then edited it for the particular machine I was on (Windows 2000 requires the header "Windows Registry Editor Version 5.00" whereas older versions of Windows require "REGEDIT4").

    You can export these lists and share them with everyone but be careful when you accept these as people can add themselves to unrestricted zones if you don't read the registry files (note the dword value at the end, should be "4").
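The Perl-script approach described above can be sketched like this (in Python here; the domain list is illustrative, not a recommendation). It emits the registry format the comment describes, with the dword value of 4 that marks the Restricted sites zone:

```python
# Generate a .reg file that adds a list of tracking domains to IE's
# Restricted sites zone (zone value 4) under ZoneMap\Domains.
DOMAINS = ["tracker.example", "ads.example"]  # made-up example domains

# Windows 2000 wants this header; older Windows wants "REGEDIT4".
HEADER = "Windows Registry Editor Version 5.00"
KEY = (r"HKEY_CURRENT_USER\Software\Microsoft\Windows"
       r"\CurrentVersion\Internet Settings\ZoneMap\Domains")

def make_reg(domains, header=HEADER):
    lines = [header, ""]
    for d in domains:
        lines.append(f"[{KEY}\\{d}]")
        lines.append('"*"=dword:00000004')  # 4 = Restricted sites zone
        lines.append("")
    return "\n".join(lines)

print(make_reg(DOMAINS))
```

As the comment warns, check any .reg file you accept from someone else: a dword other than 4 would put the domain in a *less* restricted zone.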
  • Not knowing enough about the topic, I can only explain that my understanding is: web bugs use cookies, but not all cookies are for web bugs. Web bugs are things like little one-pixel GIF files or banner ads. They're especially useful when different websites contain web bugs from the same place, because the site serving up the web bugs can track you across those sites using cookies it has placed on your machine.

    As far as downloading, people can still send you things if they don't have YOUR IP address - some kind of proxy system would do.

  • It's already there:

    "Offtopic" - for jokes in which the moderator didn't understand the reference to the original article.
    "Flamebait" - for (political) humor with which the moderator disagrees.
    "Troll" - for misunderstood deadpan humor

    I run across this meta-moderating all the time.

  • Why? The market share of other browsers is too small to worry about. But don't worry, other browsers will have their own ways of dealing with this (like Konqueror and Mozilla do).

    I've installed the webbug detector but am about to uninstall it as it merely seems to be an annoyance designed to make me aware and complain to the offending site, but does nothing (that I can tell) to protect me from these evil creatures. I don't like to be annoyed...

  • Just right-click the report window and disable "popup when webbug found".

    I realized that was possible, but my point really was that the software did not protect but merely detect. So, in addition to being easy prey to webbuggers I can choose to be alerted when being bugged. Whoop-dee-doo.

    Call me when the program stops, deflects or damages (say by corrupting the database?) the webbuggers.

  • My little web site [] is hosted from a slow 128k frame relay link. Doing this gets the server on my LAN, which really is needed to enable me to spend my free time working on it. As the traffic has grown (now about 150k pageview/month), my low bandwidth link couldn't keep up. The simple solution was to move the images to a higher bandwidth server.

    If you poke around in the html you'll see that the images are hosted at "", and of course my site is "". Sadly, this web bug thingy will probably tell you that I'm conspiring with to track you, when in fact they're just my ISP providing some server space for the images. There are no web bugs on my site.

    I really ought to set up the image server with a domain name like That costs extra (ISPs love to find things to charge for that don't cost anything)... but the cost isn't the primary concern. My little ISP has changed admins and they're not as stable as one might expect paying for frame relay service. I'll probably move to a new ISP soon, and that'll be a good time to set up a proper name for the image server.

    The point is that it makes a lot of sense for a site to host bandwidth-hogging files on a different server. In my case, it's to facilitate spending my creative energy in my free time on the site (I didn't do much on it for a couple of years without direct access to the server). I regularly poke around in people's HTML source, and I've seen several major sites use a different server for images, PDF files, etc. It's not an uncommon practice, and there are a lot of good reasons to do it other than tracking users. From what I can see, it looks like the folks at the Privacy Foundation [] aren't aware of this.

  • After I made that post, I did quite a bit more reading about their little plugin, and it looks like I was not entirely correct.

    They classify each image according to a variety of criteria, including its size (pixels, not bytes), whether it came from a different domain, whether it sent cookies, and some other things I don't recall at the moment. Each image is then rated by the number of criteria it matches as either a web bug (red), a warning (yellow), or not significant (or something like that). They don't document exactly what the criteria are, but it looks like they won't consider an image a web bug unless it's "tiny"... again, there's no specific documentation of how small an image must be to count as tiny. The images on my site probably fall into the warning or non-issue categories.

    I didn't go so far as to actually set up a machine (or a virtual machine with VMware) and install Windows, IE, and their plugin.
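The scoring scheme described above is easy to sketch. To be clear, this is not Bugnosis's actual code or criteria (those are undocumented); the thresholds and the "tiny" cutoff below are illustrative guesses in Python:

```python
# Rough sketch of a Bugnosis-style image classifier. The real criteria
# are undocumented; the thresholds here are guesses for illustration.

def classify_image(width, height, page_domain, img_domain, sends_cookies):
    """Return 'bug', 'warning', or 'ok' for one image on a page."""
    score = 0
    tiny = width <= 1 and height <= 1         # assumed definition of "tiny"
    if tiny:
        score += 1
    if img_domain != page_domain:             # third-party image
        score += 1
    if sends_cookies:                         # request carries cookies
        score += 1
    if tiny and score >= 3:
        return "bug"        # red: tiny AND matches the other criteria
    if score >= 2:
        return "warning"    # yellow: suspicious but not conclusive
    return "ok"             # not significant

print(classify_image(1, 1, "slashdot.org", "osdn.com", True))              # bug
print(classify_image(400, 300, "example.com", "images.example.net", False))  # ok
```

A real tool would look at more signals (invisible images, query strings, third-party cookies), but the shape of the classifier is the same: count criteria, then bucket by score.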

  • by Rushuru ( 135939 ) on Friday June 08, 2001 @05:14AM (#166712)
    There are some proxies out there that filter banner ads / cookies / and web bugs.

    One of the most interesting ones is webwasher ( - for windows & linux, free for personal use, not open source).
    WebWasher does not use regular expressions to filter images: it filters them by size. Most banner ads have a standard size (for example, 468x60). WebWasher has a list of known banner sizes and filters out any image that matches. And its efficiency is very impressive!

    Thus, using WebWasher, it's very easy to filter out web bugs, which are usually 1x1 pixels.

    Alas, WebWasher is not open source and has some issues. But I think the idea behind this product is great, and I'd love to see it implemented in an open source proxy :)

    The way webwasher handles cookies is also very interesting: you can specify 3 sorts of cookies
    - the good ones (allow them, keep them)
    - the neutral ones (allow them, delete them after 24 hours)
    - the bad ones (always block)

    The default policy for unknown cookies is to set them to neutral; that lets the user visit sites normally (without the occasional glitches that happen when you block all cookies and hit sites that won't let you browse without them), without compromising the user's privacy, since the cookies are deleted after 24 hours.
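That three-tier policy is simple enough to sketch. A minimal Python illustration (the domain names and the configuration format are made up for the example, not WebWasher's actual settings):

```python
import time

# Sketch of a WebWasher-style three-tier cookie policy. Domains and the
# POLICY table format are hypothetical, not WebWasher's real config.
GOOD, NEUTRAL, BAD = "good", "neutral", "bad"
POLICY = {"myforum.example": GOOD, "ads.example": BAD}
NEUTRAL_TTL = 24 * 3600   # neutral cookies live 24 hours

def accept_cookie(domain, now=None):
    """Return (allowed, expires_at): store the cookie? purge it when?"""
    if now is None:
        now = time.time()
    tier = POLICY.get(domain, NEUTRAL)   # unknown cookies default to neutral
    if tier == BAD:
        return (False, None)             # always blocked
    if tier == GOOD:
        return (True, None)              # kept indefinitely
    return (True, now + NEUTRAL_TTL)     # allowed, then deleted after 24h

allowed, expires = accept_cookie("unknown-shop.example", now=0)
assert allowed and expires == 86400      # neutral: allowed, dies in a day
```

The nice property is exactly the one described above: sites that insist on cookies keep working, but an unknown cookie can't follow you around for months.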
  • by Self Bias Resistor ( 136938 ) on Friday June 08, 2001 @05:02AM (#166714)

    First post insanity aside (trust me, it's only fun for about 5 minutes and bad for your karma because moderators despise it), there's this quote featured in the CNN article [] (yes, I do actually read the related articles before posting flamebait):

    "Our goal with the software is to reveal how Web bugs are tracking all of us on the Internet and to get companies to 'fess up' about why they are using them," Richard Smith, the Privacy Foundation's chief technology officer, wrote in his privacy tip sheet.

    "Any company that uses Web bugs on their site should say so clearly in their privacy policies and explain the following: why they are being used, what data is sent by a bug, who gets the data, and what they are doing with it," he added.

    There are two things I'd like to point out about those statements. First of all, companies with web sites are (in most countries) legally required to tell you what kind of data they collect and what they do with it. The majority of such privacy statements either consist of the usual "we don't collect any information that can personally identify you" variety or are hidden beneath so many links at the very bottom of the most obscure pages on the site that your average user never reads them.

    Second of all, I agree with the suggestion that companies should be required to thoroughly explain what kind of bugs they use (if any), what's sent and received, and where the data goes. I personally think it's a great idea. And it's all well and good for sites that deploy their own web bugs. But what about web sites that use web bugs belonging to other sites (e.g. sites that use DoubleClick web bugs, or Slashdot using a web bug from OSDN)? The same rules should apply, of course, but how is that handled from a legal perspective? Who is responsible for the "bug": the company that wrote/owns it, or the company that deploys it? Answers to any of these questions are more than welcome (particularly from someone in the legal profession), as I'm sure at least some of us Slashdot readers would like to know.

    Self Bias Resistor
    "Imagination is more important than knowledge." - Albert Einstein

  • by Lizard_King ( 149713 ) on Friday June 08, 2001 @05:16AM (#166719) Journal
    As any open source fanatic will tell you, it is imperative that you read the HTML source of every page that you view.
    We don't need no stinkin' Bug Detector!

    --- note sarcasm ---
  • My wife has a modified Iopener that is in our kitchen. I use it occasionally to read Slashdot.

    The LCD screen on it displays the Slashdot web bug as a 1 pixel white spot above the banner. If Slashdot didn't have a black background, I wouldn't have seen it.

    I find it curious that, with all the discussion of privacy and our rights online, Slashdot would use web bugs. I imagine that when it came right down to it they had to make a choice: no web bug, or money, and they went for the money.

    With all the talk about the higher principles of Information Wants To Be Free, Privacy, Rights, Free Software, etc., the inclusion of this tracking technology on Slashdot really shows that the Dollar is more powerful than some would like to admit.

  • From the article: Why is this important?

    If sent the information to without going through Alice's computer, then all could learn is that someone created a login at with the e-mail address

    Wait a minute. knows all of the information they discuss (IP address, browser type, etc.). What prevents from transmitting this information to through a separate channel, without Alice's knowledge? -bs

  • I found that if the image is loaded as a JavaScript object, Bugnosis will not detect it.
  • by dmmjr ( 182944 ) on Friday June 08, 2001 @08:57AM (#166730) Homepage
    Yep, we consider the OSDN image to be a Web bug, because it acts as a surreptitious information conduit between, the reader's computer, and Information sent through this path picks up both slashdot and OSDN cookies, so it bypasses the "same domain" rule preventing one domain from manipulating cookies set at another. Of course there's no way for Bugnosis to understand the business relationship and contracts that may restrict the use of the conduit (P3P will help with this). What's absolutely clear is that a facility designed for displaying images is being run in reverse to transmit information without the user's permission or knowledge.

    Many people have been asking (cursing, etc. :) for Mozilla, Mac, Opera etc. support. I think it would be great to investigate, and I have a student trying to learn something about Mozilla now. We just don't have the expertise yet. I'd be very interested in hearing from potential contributors. Heck, just a plugin or diff that shows how we can tap into browsing events and access the DOM in Mozilla could make it possible for us to proceed. Frankly, IE support was pretty easy because of all the books and sample code out there. Besides, we had just finished a long-winded report [] on IE browser extensions & their privacy practices when we started this project, which made Bugnosis pretty easy to envision.

    We decided not to make Bugnosis a Web bug blocker, just a good analysis and exposition tool. See, the problem with many "privacy enhancing technologies" is that they put the burden on users to protect themselves. I firmly believe that being concerned about privacy shouldn't mean that you have to make it a huge personal priority, say, by committing time to downloading, maintaining, and upgrading yet another piece of software. Privacy should just be built in. Bugnosis shows how the current infrastructure is being used, and so contributes to the debate on what reasonable standards should be. In the privacy arms race, I'd much rather be a reporter in the trenches than an arms manufacturer -- even defensive arms.

    Any CS students interested in working with us? We'll be setting up at Boston University in the fall.


  • by Jodrell ( 191685 ) on Friday June 08, 2001 @04:27AM (#166732) Homepage
    One of the cool things about Mozilla (and its Linux [] and Windows [] derivatives) is the option to accept cookies only from the current page. I'm sure that when Mozilla is released and starts to take chunks out of IE's dominance, people will start to use this feature and web bugs will become less useful.
  • Is bugnosis open-source?

    And if it's not, how do I know that it's not spying on me?

  • IE on Mac OS has this in 4.5, if not earlier. Heck, Lynx has this.

    But if you want excellent cookie control-- not to mention some real control over Java[Script]* then the browser to have is Konqueror.
  • I'm still thinking about the consequences. A few years ago every idiot I ran into tried to convince me to disable cookies, and I still think that's a great idea.

    Now I find myself wondering whether it's OK for one website to transmit this sort of information to another website. I'm even wondering why they try to sneak it into the client like this instead of just sending each other grepped web logs.

    What's useful about this?

    And what are the privacy implications?

  • by closedpegasus ( 212610 ) on Friday June 08, 2001 @05:09AM (#166738)
    What's the big deal with web bugs, anyway? As long as the tracking being done is for use by the site I am visiting, I see no problem with it. It's just a tactic for getting usage statistics about your site. And what's wrong with that? When you go to a store, there are video cameras watching you, records of your sales, etc. Why shouldn't a website know which pages were visited? As long as the information being collected can't be used to uniquely identify me, I see no problem with it. A web bug can't collect any more information than your standard log file, plus maybe access to your cookies. But it can only access cookies *that were set by it in the first place*. Web sites don't have the luxury of talking face to face with everyone who comes to the site, like a retail store does. Somehow, they need to monitor what's going on, and a web bug is one way to do a good job of it. One could easily add the same code the web bug executes to the top of every page... and I don't think there would be any problem with that. Web bugs are just a more elegant solution -- you can abstract out all those tracking functions and use them as a module.
  • So /. has been bugging me this whole time. You think you know somebody and then something like this happens. My most paranoid fantasies are coming true.

    In other news: "Do Nothing" Congress Becomes "Highly Ineffectual" Congress []

  • by update() ( 217397 ) on Friday June 08, 2001 @05:59AM (#166742) Homepage
    Richard Smith writes:
    "Any company that uses Web bugs on their site should say so clearly in their privacy policies and explain the following: why they are being used, what data is sent by a bug, who gets the data, and what they are doing with it," he added.

    The submitter writes:
    It would seem our beloved slashdot has them as well.

    Of course, a number of Slashdot readers were already familiar with this topic -- those of us who sometimes read at -1 have seen this subject raised and modded down, and then addressed by Slashdot editors who are then modded down by angry trolls. Or you can read about it on one of the troll web sites.

    And this is the way all information about Slashdot is handled. Why did moderation go completely nuts a month ago? The only official word was in a -1 post from Michael buried in a -1 thread. Beyond that, you have to read (site whose name I won't mention to avoid getting 200 idiot sporks and crapflooders on my case) to find out what's going on. As always, security through obscurity doesn't work; it only confines the information to the people you least want to have it.

    The bottom line, though, is that it comes down to trust. There's never been an official explanation of what the web bugs here do but while I don't, for instance, trust the editors to have any concept of what it means to be logically or ethically consistent, I do believe that they wouldn't do anything outrageous to my privacy.

    Unsettling MOTD at my ISP.

  • For Windows, try AdSubtract []. This is a proxy that strips out ads, cookies, background images, videos, pop-up windows, java/script and hides referrers. You can configure it globally, or on a per-site basis.

    I use it for the cookie-blocking, but the ad-blocking is a nice side effect. I let ads through for those sites that I regularly visit that aren't riddled with seizure-inducing 150x600 pixel monstrosities. Hmm... come to think of it, only four sites I visit these days even fit into that category!

    It keeps stats. I block about 300 cookies, 40 popups, and 700 ads over the course of a day.

  • OSDN _does_ use a web bug - a 1x1 pixel transparent GIF... It could very well be just a page counter. Looking at the one I see right now:

    IMG SRC=" ments,992005157" WIDTH=1 HEIGHT=1 BORDER=0

    Regardless of what it's doing, that looks like one to me. I guess I could check the source and see what it's up to...

    Anyhow, web bugs - like cookies or anything else - can be used for both good or evil. There was no judgment here, just a chuckle at who they listed as sites with web bugs.

  • Cookies are not the big deal. I can block those. It's the 1x1 GIFs that kick off an HTTP request with additional params that bother me.

    Look at a few and you will see...

    http://svr/path/[*.dll|.gif|etc]?param0=xxxx&param1=xxxx...

    That, my friend, gives you something far better than just a server log entry. And there is no blocking it... unless you start taking notes and set up your host table to say * is at
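For the curious, Python's standard library makes it easy to see exactly what such a request hands over: everything in the query string lands in the tracker's logs, alongside your IP address. (The URL below is a made-up example of the pattern, not a real tracker.)

```python
from urllib.parse import urlsplit, parse_qs

# A hypothetical 1x1-GIF request following the pattern above. The query
# string is the payload: it reaches tracker.example whether or not the
# image is ever displayed.
bug_url = "http://tracker.example/b.gif?site=news-site&page=/story/42&uid=abc123"

parts = urlsplit(bug_url)
params = parse_qs(parts.query)
print(parts.hostname)   # who receives the request (and your IP with it)
print(params)           # the extra data riding along in the URL
```

Which is exactly why a plain server log comparison undersells the problem: the page author chooses what goes into those parameters.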

  • by MeowMeow Jones ( 233640 ) on Friday June 08, 2001 @04:54AM (#166752)
    There is a little-known feature in the Apache Webserver that quietly logs your IP address as you view pages from it.

    Trolls throughout history:

  • How did that post get into the Web Bugs thread ?? Did IE have a nervous breakdown?
  • If I were designing a browser, I would have a cookie monitoring window, which would log cookie activity.

    If I had a choice I would prefer a browser that helps me to manage the various cookies (or better cookie-requests) rather than showing me all those cookies in a monitor window.

    Cookie management here denotes something that lets me:

    • Reject cookies by originator, lifetime, or purpose, the latter one being particularly difficult to implement,
    • Accept cookies explicitly in certain situations, e.g. when clicking the save-my-settings button somewhere, and
    • Surf the Net undisturbed by cookie request dialog windows.

    Compared to Netscape-style cookie warnings, such management would be actually usable and useful. It would give the user actual control instead of a simple cookies/no cookies choice. And such a scheme would preserve the option of using cookies where they offer some added value to the user, like in personalisation of sites.

    Personally, I don't want to monitor cookies, I just want to ignore most of those having a lifetime of more than a few days. Web browsers should support this type of control.
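That lifetime-based rule is trivial to express. A minimal sketch in Python (the three-day cutoff is an arbitrary choice standing in for "a few days"):

```python
# Lifetime-based cookie acceptance, as described above: keep session
# cookies and short-lived ones, drop anything that persists for long.
# The three-day limit is an arbitrary illustrative threshold.
MAX_LIFETIME = 3 * 24 * 3600   # seconds

def keep_cookie(expires_in):
    """expires_in: seconds until expiry, or None for a session cookie."""
    if expires_in is None:
        return True                   # session cookies vanish on exit anyway
    return expires_in <= MAX_LIFETIME

assert keep_cookie(None)                      # session cookie: fine
assert keep_cookie(3600)                      # one hour: fine
assert not keep_cookie(365 * 24 * 3600)       # year-long tracker: dropped
```

This preserves personalization cookies that a site sets for the current visit while quietly discarding the long-lived identifiers that make cross-visit tracking work.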

  • Webwasher does not use regular expressions to filter images: it filters them by size.

    OK, so how does WebWasher know how large the image is before it sends the HTTP request? If it fetches the image and then just refuses to show it when it's deemed a web bug, then WebWasher is worthless. Indeed, it's less than worthless, since it lulls you into a false sense of security. Once that HTTP request is sent, the web buggers have your personal data (whatever info was sent in that HTTP request) plus your IP address (so they can send you the image). After that, they don't give a rat's ass whether you actually view the 1x1 GIF.
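One plausible answer to the question above: a filtering proxy can act on the page HTML before any image request is made, using the WIDTH/HEIGHT attributes in the IMG tag. Whether WebWasher actually works this way isn't documented here, so the following Python sketch is only a guess at the mechanism (and a crude regex-based one; images that omit size attributes slip through):

```python
import re

# Strip IMG tags whose declared size matches a known ad/bug size, so the
# browser never issues the request. This only works when the HTML declares
# WIDTH and HEIGHT; a tag without them sails through untouched.
BLOCKED_SIZES = {(1, 1), (468, 60)}   # web bugs and the standard banner

IMG_RE = re.compile(r'<img[^>]*width="?(\d+)"?[^>]*height="?(\d+)"?[^>]*>',
                    re.IGNORECASE)

def strip_blocked_images(html):
    def repl(match):
        size = (int(match.group(1)), int(match.group(2)))
        return "" if size in BLOCKED_SIZES else match.group(0)
    return IMG_RE.sub(repl, html)

page = '<p>hi<img src="http://ads.example/b.gif" width="1" height="1"></p>'
print(strip_blocked_images(page))   # <p>hi</p>
```

If the proxy instead waits for the image headers to learn the true size, the objection above stands: the request has already been sent and the tracker already has its data.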

  • by SpeelingChekka ( 314128 ) on Friday June 08, 2001 @05:20AM (#166767) Homepage

    What bothers me most is the scale on which the tracking is done; since so many sites use particular ad agencies (say, DoubleClick), they can build a list of many of the sites I've visited. For example, say I browse a gay porn site, then I browse a Quake3 games site, then I visit Amazon to look for comic books. DoubleClick need only have an information-supplying affiliation with one of those that may have my "real" personal details, name etc. (for example Amazon), and from that they can build a fairly extensive database of what I do online. All without my consent, which is against the law in my country, but in the US it seems companies can do this openly with no fear, so I'm guessing it's not illegal there.

  • by SpeelingChekka ( 314128 ) on Friday June 08, 2001 @05:48AM (#166768) Homepage

    I see no problem with it. It's just a tactic for getting usage statistics about your site. And what's wrong with that?

    You missed the point. That's fine; there is nothing wrong with that, but that is not the issue here. Web bugs are not an attempt to gather statistics at a specific site; web bugs are attempts to track surfing across multiple unrelated sites. For example, say I visit a gay porn site, which has some DoubleClick ads with hidden bugs in them. Then off I go to to order a book about fly fishing, and unbeknownst to me, once again DoubleClick has web bugs on Amazon's site. So now a company (DoubleClick) has a database linking the same user to those two completely unrelated activities. Now all DoubleClick needs to do is establish some sort of affiliation with Amazon, and whammo, DoubleClick suddenly knows my name, and has a database indicating that I have bought books on fly fishing, like gay porn, browse Slashdot, am anti-Microsoft, enjoy reading The Onion every Wednesday, whatever; they have a huge database on me. All without my consent or knowledge (which happens to be illegal in my country, but it would seem not in the US). Sure, you can say "don't use cookies" or "delete your cookies regularly", but what the fuck, that's not a solution; that's purely symptomatic treatment of the REAL problem, which is that these companies should be strictly prohibited from doing this sort of thing in the first place. Either way, more than 80% of people are not even going to know how to delete their cookies, or will just be too ignorant of the problem to care. Americans seem to love treating the symptoms of a problem while ignoring the problem itself.

    And you may not think DoubleClick would be able to collect much info - but trust me on this - DoubleClick is EVERYWHERE. It is virtually impossible to do casual web browsing for more than a few hours without getting DoubleClick cookies. Try it. Delete all your cookies, browse for a while (casual browsing, e.g. some Slashdot, maybe some CNN or other news sites, maybe some gaming sites, etc.), and see what cookies you have. Chances are extremely good you'll have,,,, and a few of the other very common ones.

    We're not talking about web statistics or cookies here. Get the facts straight.
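The cross-site linking described above boils down to one shared identifier. A minimal Python simulation of the tracker's side of the exchange (all site names and the cookie id are hypothetical):

```python
from collections import defaultdict

# The tracker's view: many unrelated sites embed its pixel, and the
# browser sends the same tracker cookie with every one of those requests,
# so a single profile accumulates across all of them.
tracker_db = defaultdict(list)   # cookie id -> list of (site, page) visits

def load_tracker_pixel(cookie_id, referring_site, page):
    # The browser hands over the tracker's cookie plus the referring page.
    tracker_db[cookie_id].append((referring_site, page))

# One browser (one tracker cookie) visiting three unrelated sites:
load_tracker_pixel("uid-42", "books.example", "/fly-fishing")
load_tracker_pixel("uid-42", "games.example", "/quake3")
load_tracker_pixel("uid-42", "news.example", "/story/99")

print(tracker_db["uid-42"])   # one profile spanning all three sites
```

Note that no single site shared anything with the others; the linkage exists only because they all embedded the same third party.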

  • Webwasher does not use regular expressions to filter images: it filters them by size.

    Excellent! Does it block them based on the <IMG> tag attributes, or does it go ahead and load the image headers? Guidescope [] uses a central database of image URLs that users have chosen to block individually. Now if I can find a way to chain Webwasher and Guidescope together, my solution will be complete.

  • by jonathanjo ( 415010 ) <jono AT fsf DOT org> on Friday June 08, 2001 @05:34AM (#166775) Homepage

    Yet another reason iCab [] is my favorite browser.

    It has the most sophisticated filtering system I've seen. You can filter cookies using many criteria, including (my favorite) blocking cookies that come from a different domain from the main page. AND you can filter IMAGES by size, w/ options to exclude sizes including 1x1px (this blocks most web bugs) as well as most common advertisement sizes, like the ubiquitous banner. What you get instead is a blank banner-(or whatever-)sized space with an icon of a coffee filter in the corner. Hee!

    And speaking as a web designer, the feature doesn't compromise the legitimate use of spacer GIFs.* Page design is preserved, and who cares if the 1-px. GIF is actually loaded or not.

    *Yes, I know that with CSS we shouldn't need spacer GIFs. I will rejoice when browser support for CSS is consistent enough for us to rely on it. Meanwhile, though, clients still tend to expect web pages to be as precisely designed as print, and sometimes you gotta cheat. But that's another discussion.

  • by turbine216 ( 458014 ) <(moc.liamg) (ta) (612enibrut)> on Friday June 08, 2001 @04:26AM (#166782)
    that little /. bug is intended to merely collect your anatomical information and take a little something we like to call a "DNA fingerprint". makes it easier for everyone to know what kind of As-Seen-On-TV products you might wanna buy. _______________________________________________

  • But I was hit with a strong sense of irony when I saw "Microsoft" and "Web Bug" and thought that someone had developed a plug-in that would tell you if the page you were viewing was written in bad html.

The biggest difference between time and space is that you can't reuse time. -- Merrick Furst