Privacy Your Rights Online

Browser Spyware: Watching Where You Linger 395

An Anonymous Coward writes: "Just when you'd installed Junkbuster and thought it was safe to go back onto the web, the BBC runs this story which tells you that webshites will soon(?) be able to tell whether you are reading the page, what parts of it are of interest to you, etc. Guess we can expect porn sites to be the first to take advantage of this." Or perhaps someone else is already doing this, and hasn't told you.
This discussion has been archived. No new comments can be posted.

Browser Spyware: Watching Where You Linger

  • Sinister... (Score:2, Funny)

    by jedwards ( 135260 )
    "I can tell because when you read a webpage, you do one of a couple of things. You either shovel the mouse off to the right ..."

    I guess I shall just have to become left handed then.
  • marketing (Score:3, Offtopic)

    by Spagornasm ( 444846 ) on Monday September 10, 2001 @10:43AM (#2273268) Homepage
    Get ready for the "marketing geniuses" to take advantage of this...by having new windows pop up right when you move your mouse to the back button...

    Anyone else up for using keyboard shortcuts now?
  • The nightmares will begin tonight of a microsoft paperclip assistant style popup that pops up right as I read something. Of course I'm probably dooming us all by mentioning this idea, ah the irony.

    Must hit post before I think about repercussions.

  • by firewort ( 180062 ) on Monday September 10, 2001 @10:43AM (#2273273)
    What matters here is who they tell, and who they sell it to.

    I can't stop them from tracking (yet.) I do turn off all activeX, ask on cookies, no scripting, etc... but if they can get around my disabled browsing habits, then what matters is who they tell.

    Time to go back to safeweb, as well.

    • Enough... (Score:3, Interesting)

      First spyware and then web bugs. What needs to happen is that the public has to say "Enough is Enough" and not use products or services that violate their privacy or utilize these types of tools.

      Unfortunately, the average person takes what is available to them simply because of the convenience of doing so. Apathy sucks, doesn't it?

      Anybody up to writing an HTTP proxy or filter that strips out this info as it is being returned to the offending site? I guess it should then redirect the user to a site informing them of what has happened or was about to happen. Maybe the internet community should develop an RBL-like list for websites that pull this stunt? Anyone up for an RFC?
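The RBL-like idea is straightforward to sketch. Below is a minimal, hypothetical check a filtering proxy might run against a blocklist of known tracking hosts; the domain names and function name are invented for illustration:

```javascript
// Hypothetical RBL-style blocklist of mouse-tracking hosts.
// The entries here are made-up example domains.
const trackerBlocklist = new Set([
  "tracker.example.com",
  "spy.example.net",
]);

function isBlocked(url) {
  // Test the request's hostname, and each parent domain,
  // against the blocklist (so a.spy.example.net matches too).
  const host = new URL(url).hostname;
  const parts = host.split(".");
  for (let i = 0; i < parts.length - 1; i++) {
    if (trackerBlocklist.has(parts.slice(i).join("."))) return true;
  }
  return false;
}
```

A real proxy would of course fetch and cache the list from somewhere shared, which is exactly what an RBL does for mail.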

      Here's a thought...remember Dr. Hawking's fear that machines may someday subjugate us? Imagine a conscious website that manipulates us into doing whatever it wants us to do or believe. Damn...my computer is calling me again....

  • Deus Ex (Score:3, Funny)

    by FortKnox ( 169099 ) on Monday September 10, 2001 @10:44AM (#2273279) Homepage Journal
    Or perhaps someone else is already doing this, and hasn't told you.

    Somebody was up late playing Deus Ex last night, right timothy??
  • ....be able to tell whether you are reading the page, what parts of it are of interest to you, etc. Guess we can expect porn sites to be the first to take advantage of this." Or perhaps someone else is already doing this, and hasn't told you.

    Does anyone actually *READ* porn sites? Maybe the keyboard needs a 'moisture detector' to see when and if the user is drooling, then send the result back to the spy server.
    • They don't really read the pages, but they will scroll down looking at the pictures much like they would when reading text. The difference is it would appear that they might be re-reading the same area several times. I think it could be very interesting to see where the mouse travels on a porn site...
  • by smartin ( 942 ) on Monday September 10, 2001 @10:47AM (#2273304)
    I wish sites would realize that pissing off their viewers with popups and big honking ads, does not make the viewer more likely to visit the advertisers site or buy their product. It has quite the opposite effect. I've stopped going to some sites that I like for the simple reason that I really F*ing hate popups!
    • by Greyfox ( 87712 ) on Monday September 10, 2001 @10:58AM (#2273389) Homepage Journal
      Konqueror and Mozilla both allow you to disable popups while allowing JavaScript to run. I believe that at least Konqueror, and possibly Mozilla as well, will allow you to disable or enable features on a site-by-site basis. The web has become a whole lot less obnoxious since I set Mozilla up to disable popups and animation. I highly recommend running a browser that will let you do this. Mozilla is now fast enough that I can actually tolerate using it, and has been since a CVS build about a month and a half ago.
      • Mozilla definitely does allow you to disable popups. See http://www.mozilla.org/projects/security/components/configPolicy.html

        Even more off-topic:
        Does anyone know how to make Mozilla lie about what User-Agent it is? My bank software rejects Mozilla, claiming it's not compatible. I'm pretty sure it is, and I want to try to make Mozilla claim to be IE on that domain.
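For what it's worth, Mozilla does have a preference for this: general.useragent.override in user.js changes the reported User-Agent string. As far as I know it applies globally, not per-domain, so the bank would see the fake string everywhere. Untested sketch:

```
// user.js -- report a fake User-Agent (applies to every site)
user_pref("general.useragent.override",
          "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)");
```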
        • On the topic of pop-ups, I've read through the page you cited, but I still have one more question: does Mozilla have the ability to enable pop-ups only from clicking on a link? Disabling pop-ups entirely is irritating, as many genuinely useful sites use pop-ups when a link is clicked. It seems that the Mozilla solution is to add each legitimate site by hand; hardly an optimal solution.

          FWIW, OmniWeb [omnigroup.com] has this feature.

          - j

          • Disabling pop-ups entirely is irritating, as many genuinely useful sites use pop-ups when a link is clicked.
            If you go to the link [mozilla.org] given in the parent post, you'll see that it can be configured on a site-by-site basis.

            Most pop-up ads come from one of the usual banner-ad sites, not the actual website, so this feature works pretty well.

            Here's my user.js file - you may find it useful. I allow pop-ups by default, except for the listed sites.

            // Stop animated gifs after one iteration.
            user_pref("image.animation_mode", "once");

            // Stop windows from popping up when they finish loading pages.
            user_pref("mozilla.widget.raise-on-setfocus", false);

            // Block these sites from opening their own windows
            user_pref("capability.policy.strict.sites", "http://www.car-truck.com http://www.cnn.com http://www.dictionary.com http://media.admonitor.net http://popup.zmedia.com http://ad.doubleclick.net http://www.netsol.com http://rd.yahoo.com");
            user_pref("capability.policy.strict.Window.open", "noAccess");
      • In Mozilla you can disable any javascript method or property on a site by site basis.

        So you can disable window.open, OnClose and other annoying methods.

        Deny scripts access to data on your browser, screen dimensions etc.

        See here [mozilla.org] for info on how to do it.

  • Unlike other forms of spyware, this would be easy to resist... Wouldn't people who were concerned about their privacy just get in the habit of swirling their mouse around while reading web pages?

    I can't give a logical reason why this particular technology disturbs me more than other types of spyware, but for some reason the idea of my mouse movements being tracked just makes my skin crawl... Does anyone else have that sort of gut-level revulsion?
  • by hardaker ( 32597 ) on Monday September 10, 2001 @10:47AM (#2273306) Homepage
    If you carefully configure your web browser I would think you could avoid being tracked:
    • Turn off javascript support. This is likely how they're doing their "what part of the page you're looking at" tricks (watching the scrollbar usage).
    • Don't accept cookies. Don't go to sites that force you to accept them.
    • Turn off auto-loading of images. This is the one that no-one does, but with the increasing frequency of single pixel tracking images, it might be a wise thing to do. Junkbuster is certainly a good alternative, but it won't catch everything.
    • Konqueror has the ability to change your user agent. It'd be cool to write a "random" mode for it where it randomly selected from its list of user agents to send to the remote site ;-)
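The "random mode" idea is easy to sketch in plain JavaScript. The User-Agent strings below are just illustrative examples of period browsers, and the function name is mine:

```javascript
// A fixed pool of plausible User-Agent strings (examples only).
const userAgents = [
  "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)",
  "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:0.9.4) Gecko/20010913",
  "Opera/5.0 (Linux 2.4; U) [en]",
  "Lynx/2.8.4rel.1 libwww-FM/2.14",
];

// Pick a different identity for each outgoing request.
function randomUserAgent() {
  return userAgents[Math.floor(Math.random() * userAgents.length)];
}
```

Whether rotating identities actually defeats tracking is another question: cookies and IP addresses would still tie the requests together.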

    • by Grishnakh ( 216268 ) on Monday September 10, 2001 @10:52AM (#2273344)
      It seems like it'd be a good idea if Konqueror added an option to ignore single-pixel tracking images... should we submit this to bugs.kde.org?
        • It seems like it'd be a good idea if Konqueror added an option to ignore single-pixel tracking images... should we submit this to bugs.kde.org?

        It's a good point; however, I don't think it'll help. Many sites are finding other ways of getting around that, like using form parameters within the URL itself. Eventually they'll get intelligent and name the larger images with a tracker extension, but still return the same image. E.g., src="logo.jpg-234987575", and merely have their nifty web server strip the extension off (and use it) before returning the image to the caller. You don't need 1x1 images when you can use real images.
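The hypothetical logo.jpg-234987575 scheme above is trivial to do server-side. A sketch of the filename-splitting step (the id format and function name are made up, following the parent's example):

```javascript
// Peel a numeric tracker id off a requested image name, so the
// server can log the id and serve the real file. Hypothetical
// scheme per the comment above: "logo.jpg-234987575".
function splitTrackerId(path) {
  const match = path.match(/^(.*\.(?:jpe?g|gif|png))-(\d+)$/);
  if (!match) return { file: path, trackerId: null };
  return { file: match[1], trackerId: match[2] };
}
```

The point stands: once the id is hidden in an ordinary-looking image name, no size-based filter will catch it.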

      • by mosch ( 204 ) on Monday September 10, 2001 @11:31AM (#2273613) Homepage
        No, because single pixel gifs have legitimate purposes too. Not to mention the fact that any image can be a "tracking" image.

        Example: Let's say you want to draw a horizontal bar with a rounded edge, a la slashdot. You can make an image that has the rounded edge, then a separate image that's simply a one-pixel gif of the same color, which you then stretch using the height and width attributes on the img tag.

        This will prevent the color differences between the two images, as they'll both be using the same graphics library to display. This however also minimizes download time, because all you really need to make a colored bar is one pixel of the exact color you want.

        Be less paranoid.

          • Example: Let's say you want to draw a horizontal bar with a rounded edge, a la slashdot. You can make an image that has the rounded edge, then a separate image that's simply a one-pixel gif of the same color, which you then stretch using the height and width attributes on the img tag.

          I believe the problem is with graphics that are 1x1 pixels and not scaled by the img attributes. So I would block any gif that's 1x1 and not scaled in the HTML (or any graphic that's explicitly scaled to 1x1). These are the dangerous ones, and your example does not fall into this category.
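As a sketch, the proposed rule amounts to checking the *displayed* size rather than the file's native size. The field names here are hypothetical, loosely modeled on the DOM:

```javascript
// Treat an image as a likely web bug if it would be displayed at
// 1x1: either natively 1x1 and not scaled up by the HTML, or
// explicitly scaled down to 1x1. Field names are illustrative:
// { naturalWidth, naturalHeight, attrWidth, attrHeight }
function looksLikeWebBug(img) {
  const w = img.attrWidth != null ? img.attrWidth : img.naturalWidth;
  const h = img.attrHeight != null ? img.attrHeight : img.naturalHeight;
  return w === 1 && h === 1;
}
```

Note this deliberately passes the stretched-bar example above, since its display size is far larger than 1x1.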

          • I would block any gif that's 1x1 and not scaled in the HTML

            That's still a broken concept. If browsers start blocking unscaled 1x1 images, they'll scale their 1x1 image to be 1x2, and then what? Block all 1x2 images as well? It's a slippery slope.

            These are the dangerous ones, and your example does not fall into this category.

            The fact that they're small does not make them "dangerous". Any image, whatsoever, can be used as a bug. You could use my example of a large horizontal bar as a bug, or the title graphic of the page, or the image used on the search graphic. Every single image on a page can be a "bug". Additionally, every link can contain trackable identifiers if the website designer so wishes.

    • by UM_Maverick ( 16890 ) on Monday September 10, 2001 @10:59AM (#2273398) Homepage
      have you actually used the web lately? Your ideas are great in theory, but in practice they take you back about 6 years. E-commerce goes out the window w/out cookies. Many sites become unusable w/out javascript (Not just sites that do "onclick=location.href", but there are many sites that actually use javascript *well*). Turning off images means that you won't see half of most sites...and the list goes on...

      Now I know what you're going to say: "If site X won't let me browse my way, then I don't need site X". Well, damn near every site out there is becoming site X. Whether you like it or not, that's the way the world is moving, and you can either accept their way of doing things, or stay in 1995.

      Hmm...just re-read that, and it sounds like a flame...I really didn't intend it to be...just meant it to be more of a wake-up call.
      • I totally agree. You cannot use Yahoo! mail attachments without javascript popup windows. You cannot use freshmeat (efficiently) without cookies. It will be too hard to browse cartoonnetwork.com without images.


        And single pixel images are used in many sites. Again, freshmeat uses single pixel images for thin lines. (I use them too.)


        Anyway, forget it. The web is no longer a medium for distributing content, but for formatting and layout.

        • have you actually used the web lately?

        not in years, no.

        • E-commerce goes out the window w/out cookies.

        The way I figure it, if you have to buy something then you're stuck turning on those features. However, since you're submitting your address, credit-cards and other personal-info to them it's unlikely you'll care much about mere tracking information. They've already got you, essentially.

        • Many sites become unusable w/out javascript

        Actually, I've been amazed at the number that do work. You're right, of course, many require them. And I do have Konqueror configured to allow JS on some sites. However, my default preference setting is "off" for any "untrusted" site. The sites that I generally turn them on for are e-commerce (as you mentioned above) and other sites where I have accounts.

      • by cyberdonny ( 46462 ) on Monday September 10, 2001 @11:23AM (#2273568)
        have you actually used the web lately? Your ideas are great in theory, but in practice they take you back about 6 years. E-commerce goes out the window w/out cookies. Many sites become unusable w/out javascript (Not just sites that do "onclick=location.href", ...

        Actually, I usually surf with javascript turned off, and the sites where this causes problems can be counted on the fingers of one hand. And for those rare sites I have the choice of

        • not going there again
        • just allowing those sites in my konqueror browser's javascript ACL.
        Of course, if you're in the habit of surfing to porn sites, you might be somewhat more dependent on javascript...

        ...but there are many sites that actually use javascript *well*).

        Actually, using javascript well should mean not making an obligation out of it, but using it solely to provide additional, optional functionality. The site should still stay usable even if the user doesn't want or isn't able to use javascript. You know, blind people who are bound to surf using lynx (because their braille displays or text-to-speech engines only support text browsers) cannot just turn on javascript, even if they wanted to!

      • by Ed Avis ( 5917 ) <ed@membled.com> on Monday September 10, 2001 @11:38AM (#2273647) Homepage
        We really need a browser that lets you *selectively* disable Javascript. I think the default setting should be to have JS turned on, but with a few particularly obnoxious features (popping up new windows, adding hooks to the scrollbar or mouse movement) turned off. You should be able to adjust these preferences on a site-by-site basis.
        • Try Galeon [sourceforge.net]. In the Preferences you can disable popups and disable status bar rewrites. You can also turn on and off javascript from a menu option (and like all GTK menu options, you can bind any key you want to it). In the latest CVS, you can even disable/enable popups from a menu item. Useful.
      • I have ACLs for cookies: the sites that actually have some legitimate reason to use them are allowed, the ad-tracker sites are disallowed. Works great. You can do this in Opera and Konqueror and (I think) Mozilla.


        I turn off image loading regularly, and the number of sites that are worth loading and won't work without images I can count on one hand. There's... my bank. Hrmm... can't think of even one more right off, although there probably is.


        Same comment for javascript. It's always been more abused than used, and except for my bank I can get by just fine with it turned off.


        Now I know what you're going to say: "If site X won't let me browse my way, then I don't need site X". Well, damn near every site out there is becoming site X. Whether you like it or not, that's the way the world is moving, and you can either accept their way of doing things, or stay in 1995.

        How many web sites are there out there? Now how many of those are actually worthwhile? Big difference. If you think looking for content, rather than glitzy layout, is "stay[ing] in 1995" then you are the one that needs a wake-up call.


        Doing whatever everyone else is doing, just so you can feel like you're current, is not a commendable or desirable habit.



      • Several answers (Score:5, Interesting)

        by Croaker ( 10633 ) on Monday September 10, 2001 @12:36PM (#2273960)

        I have a multi-level armored approach to browsing:

        1. I installed Bugnosis [bugnosis.org] which is designed specifically to deal with single pixels images that might be web bugs.
        2. I use Proxomitron [cjb.net] to do Javascript filtering. It cuts out the worst examples of Javascript annoyances (popups, leaving the page triggers, etc.) The filters are editable, so you can customize them yourself to filter out things like this spy script.
        3. I route everything through Junkbuster, which gets rid of the ads that Proxomitron misses.

        All of the above besides Junkbuster are Windows-only. The first one is specific to IE, but I end up using that anyhow, since it's the most stable Windows browser.

        I can browse most sites that don't do stupid shit like refuse to serve pages to me if they cannot detect my browser (in which case, they are probably crap, anyhow). For shopping sites, I can just add the site to Junkbuster, or bypass the protection through Proxomitron. I am pop-up ad free, and I give out minimal information about myself. The other, better way of browsing I could see would be to use an anonymous proxy, which would protect my IP address.

        Of course, this would be better implemented via the browser. I was using Konqueror a lot at home under Linux, but it began crashing too much for my tastes. There, I've just stuck to using Mozilla with Junkbuster. Javascripts still sometimes get through, though.

    • I find if I close my eyes while I browse, all the big bad men who spend all their lives tracking me go away.

      That or type left handed - that always throws them off. Gotta run before they figure out my whereabouts from this post.

      Chet
  • by BillyGoatThree ( 324006 ) on Monday September 10, 2001 @10:48AM (#2273314)
    For crying out loud, /., lighten up. Remember back in '95 when you couldn't turn on the TV or read a news magazine without some lame story about online stalking or pedophiles in chatrooms? And we all mocked them by saying "that's no different than real-life, what's all the hullabaloo"?

    "Brick and mortar" stores do exactly this same thing. Many have cameras, the rest use "secret shoppers" (people who look like they are shopping but are really watching YOU) to discourage shoplifting, check competitor prices AND research in-store "migratory patterns". For instance, haven't you ever noticed that ALL grocery stores have the fresh fruits and vegetables right by the door?

    This isn't "Your Rights Online". This is "Translating Something Nobody Cares About In Real Life Into A Scare Story About 'The Net' In Order To Attract Eyeballs To Slashdot."
    • Now put those cameras in your house, and a couple of secret siblings too, and see if it's still okay.
    • by laetus ( 45131 ) on Monday September 10, 2001 @11:43AM (#2273673)
      Just because a store researches something doesn't mean they're going to make the shopping experience better for the consumer.

      Case in point: The grocery store you referenced. Haven't YOU ever noticed that the dairy, bread, and fresh vegetables/fruits are scattered at different corners of the store?

      And you know why: to make you wander the other aisles and buy crap you didn't originally walk in to get.
      • Duh. The comic shop I used to work at (Action Comics. Quite an original name, no?) did this with the new comics. They were along the back wall so people would have to walk past all of the games/toys/cards and other assorted whatnot to get their biweekly X-Men fix. This isn't bad for the consumer. Sure, you've got to walk an extra 15 feet or so (God forbid), but you also get to be exposed to different crap that you might not have been exposed to before (impulse buy!). Unfortunately, this did not work for Action and they are now long out of business.
    • I could almost buy the tracking "migratory patterns" argument - but they get that without needing any of the other spyware. Hell, the server tracks that as usage log information. The other reasons just aren't valid (Checking competitor prices? Go hit their site. Shoplifters? Don't make me laugh...).

      There is NO good reason for the spyware. If the hit info isn't giving them things they like, maybe there is a reason for it. Could be they're doing something wrong- or maybe they bet on the wrong thing...
    • How did this get modded up? Isn't this obvious troll material?

      Please, someone bounce it back down.

      Now, getting to what the article actually says: I'm getting closer and closer to the opinion that we're in the middle of a war on privacy (to use a US-world-view phrase). It started out with the usual garbage about how companies needed to know how good their advertising was (to which I ask "why?").

      But, this clearly crosses the line. No one needs to know that on a page with 7 stories, I spent more time looking at the one on penguins. There is no good excuse for this.
  • "Cheese"? (Score:3, Funny)

    by Anoriymous Coward ( 257749 ) on Monday September 10, 2001 @10:50AM (#2273330) Journal
    The system developed by the team at MIT is called Cheese, since they are following the mouse, like a mouse follows cheese.

    Wouldn't a better title have been "Cat"? Or perhaps "Rodent Stalker"?
    • Re:"Cheese"? (Score:2, Insightful)

      by Fortissimo ( 45876 )
      This one struck me as odd, too. Don't believe I've ever seen a mouse "follow" cheese. Wasn't even aware that cheese could move. The MIT folks may be brilliant, but they ain't creative.
    • In the spirit of FBI's tactless naming scheme, how 'bout Mousivore(TM)?
  • This is garbage (Score:2, Interesting)

    by Velex ( 120469 )

    Oh, come on. This is pure garbage. How much info could one possibly glean from whatever javascript the researchers were using to capture the mouse movements? For me, who uses the keyboard excessively and only moves the mouse when I'm sure I want to click on a link, there isn't anything they can possibly gather. Besides, if they want to monitor my mouse movements, maybe they can see how quickly my reflexes kick in to close pop-up windows before I even know what's in them.

  • Eh? (Score:4, Interesting)

    by stripes ( 3681 ) on Monday September 10, 2001 @10:51AM (#2273337) Homepage Journal
    "I can tell because when you read a webpage, you do one of a couple of things. You either shovel the mouse off to the right so that it is out of the way, or you will walk down the page with your mouse," he told the BBC's Go Digital programme.

    Yeah....or I'm one of the 5% of the computer market with a Mac and I'm one of the 90% of Mac users that have discovered that when I type the mouse goes away. So I press down arrow and *poof* I don't need to move the mouse out of the way, and my finger is right where I need it to scroll down to read more of the story.

    (Or I could turn off JavaScript, which is a good idea because it gets rid of a lot of irritating popup and popunder ads -- which is a pretty good idea, even 'tho it breaks a few sites)

  • I, for one, won't run a client that allows a site to profile me in this way.

    If I understand this correctly, this technology would require the client to send data to the server about mouse movements, etc, for tracking purposes.

    So I could simply elect not to use this type of software, correct?
    • and it's enabled by default on all major commercial browsers. Yes, you can turn it off, but then you'll miss out on the gee-whiz stuff that sites put up in lieu of content.

      <rant>
      What really pisses me off is sites that have information that I want (in HTML) but won't give it unless I pass through their flash corridor.
      </rant>
    • by stikves ( 127823 ) on Monday September 10, 2001 @11:20AM (#2273545) Homepage
      No, it is not necessary. The site can have two "frames". One of them would be the main frame filling the entire window; the other would be the tracking frame, which is invisible (or 1 pixel high).


      Then the javascript code in the main window will fill a string with your mouse movement like:


      (100,100)-(110,100)-(110,109)-...


      After the buffer is filled enough, it will update the hidden frame with a code like:



      TrackerFrame.URL = "http://server/track.cgi?" + str;



      That's it. That's all. Your tracking is complete.
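For the curious, here's the buffering logic from the description above as a pure-logic sketch. There are no real browser events here; the track.cgi endpoint and coordinate format come from the comment, while the class name and flush threshold are mine:

```javascript
// Buffer mouse positions as "(x,y)" strings; once enough have
// accumulated, build the URL the hidden tracking frame would be
// pointed at. flushAt is an arbitrary batch size.
class MouseTrail {
  constructor(flushAt = 3) {
    this.points = [];
    this.flushAt = flushAt;
  }

  record(x, y) {
    this.points.push(`(${x},${y})`);
    if (this.points.length >= this.flushAt) return this.flush();
    return null; // not enough data yet
  }

  flush() {
    const str = this.points.join("-");
    this.points = []; // reset for the next batch
    return "http://server/track.cgi?" + str;
  }
}
```

In a real page, record() would hang off onmousemove and the returned URL would be assigned to the hidden frame's location, exactly as the parent comment describes.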

    • The crucial part is not the javascript getting the mouse positions; the crucial part is the javascript communicating those positions back to the web server.
      Mozilla already has fine-grained control over which sites you allow to send cookies to. Someone could add another fine-grained feature to control which sites javascript is allowed to send HTTP GET/POST requests to. It could also set which javascript commands you want to enable; this is already the case with window.open().

      I also thought maybe we could make the browser show you what info it is posting back, and let you approve it. But then sites would just encode it so it's not human-readable.

      So this is a complicated issue, but one we can deal with, since we have Open Source Mozilla.
  • Making web sites easier to use and catering the content to what users expect isn't a bad thing. Microsoft does very similar things in their GUI design. They get a large group of people together, have them do common tasks, watch every mouse movement and click, and find ways to speed up the process.

    Sure, I don't want someone tracking me, but keeping aggregate data wouldn't be bad. By doing this maybe they can speed up access to information instead of having me hunt around for what I want.
  • by daviddennis ( 10926 ) <david@amazing.com> on Monday September 10, 2001 @11:01AM (#2273423) Homepage
    While reading the article, I left the mouse in the main browser window and used the keyboard to scroll. So if their system was used, it would make it appear that I was not reading the article, even though I did in fact read it.

    Really, if you stay on a page for more than a few seconds, you're probably reading it. And that would surely be simple enough to determine, although you'd have to figure out a bulletproof way to put up an invisible frame in order to send the information to the mother ship. It would probably be easiest done in Java, which can do that without pulling up a web page, but many people have non-working Java, so even that's not foolproof.
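The "more than a few seconds probably means they're reading" heuristic is about this simple; the threshold value and function names are arbitrary:

```javascript
// Elapsed reading time in whole seconds between page open and exit.
function dwellSeconds(openedAtMs, leftAtMs) {
  return Math.round((leftAtMs - openedAtMs) / 1000);
}

// Report back to the server only if the visitor lingered past
// an arbitrary threshold (default 5 seconds).
function shouldReport(openedAtMs, leftAtMs, thresholdSeconds = 5) {
  return dwellSeconds(openedAtMs, leftAtMs) >= thresholdSeconds;
}
```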

    Unfortunately for the people who created this model, once people become aware of how it works, it will no longer function. People who would formerly hover the mouse over a link would simply refrain from doing so and therefore give the system no useful data. I also suspect individual personal styles are going to be different enough to stymie them in the end. I am not convinced that people only visit links directly if they have been to the site before, for example.

    For the person who said a scroll mouse would defeat this system, I'm sure signals from the scroll wheel can be read as well.

    When I am hesitating between multiple items, I will often put them in my cart, look at the total and then remove the one that makes the total too high, or that I'm unsure about. Anything I put in my cart and took out, and any abandoned shopping cart contents, would be a ripe selling weapon that can already be used without relying on this technique.

    I think this one's too flaky for practical use. But as always, we'll see.

    D
  • This is yet a little more frightening...

    I think that the idea that some AI code can tell what I'm truly interested in and what I'm going to buy is ridiculous. While it may be true that most people do work in similar ways with the interfaces of web-published documents, what goes on in the individual mind during the process is certainly unknowable.

    This technology sounds like it could cause more harm than good. I can see this sort of thing narrowing the scope yet again of what content is available online.

    This will lead to the customisation of individual users' content without them even being aware that it is happening. "Can you imagine if I can actually tell that you wanted to press a link but didn't". (What?! Maybe there's a reason why I didn't!?)

    It's bad enough that content is already spoon-fed to most people - does it have to be chewed for us now too? And when people are only exposed to the things that corporations believe we're interested in, it will lead further to the atrophy of the collective consumer consciousness.

    Fortunately for me, I'm still using the 10th Edition of the Newspeak Dictionary... perhaps I'm a dying breed. *shrug*
  • by update() ( 217397 ) on Monday September 10, 2001 @11:08AM (#2273470) Homepage
    The story is interesting, but the description of it here seems so far off that I briefly wondered if I'd hit the wrong link.

    Look, since day one of the commercial web, sites have obsessively tracked how many hits they get, where they're coming from, how a user moves through the pages, where they spend time and how often they return. (As if Andover/OSDN isn't doing all of those things -- or is this like with web bugs where we're just supposed to care about them on other sites?) That's one of the great edges the net was going to have over other media. To the degree that people are bothered by that and to the degree that they're technically sophisticated, they turned off cookies and otherwise interfered. And what does Junkbuster have to do with anything?

    What this seems to be is an incremental advance in tracking how pages are read -- there's a little added feedback about mouse movements and maybe scrolling. As always, if this takes off it will be trivial to block for those who know and care about such things. And everyone else has far more important privacy invasion being done to them.

  • by martyb ( 196687 ) on Monday September 10, 2001 @11:10AM (#2273478)

    Though what they propose probably has some application to the majority of users, I'm just as sure there are others who would not fit their expectations:

    • Keyboard-centric: Though most users primarily use a mouse, I've found in many cases it is much faster for me to keep my hands on the keyboard and navigate with page-up/page-down and cursor keys. Menu navigation can be much quicker too, as I can make choices with keyboard shortcuts and mnemonics without first having to wait for each menu and submenu to paint.
    • Large display: I use a 21" monitor running at 1600 x 1200. That means there are many pages where there's no need to scroll; and for those that need it, well, just use the page-down or arrow keys.
    • Touch screens: There's no "hovering" or mouse trail; just TAP and you are there, with no record of any "path" across the screen. This will become more prevalent with PDAs.

    Besides, cheese is often placed in a mousetrap. This kind of technology feels like users are the ones being tempted by the cheese; what kind of trap are we getting into?

    • Recently I started using a pen tablet, got totally hooked, and use it for 99% of my pointer input now. I'd be interested to know how their system works with one, for the same reason you mention with a touch screen. Once you use a tablet for a little while your brain figures out the aspect ratio, and you can pull the pen out of the input field and put it back down somewhere else with decent accuracy. As a result the pointer disappears and reappears across the screen. Anyway, just one more wrinkle for them to iron out, heh...

      LEXX
  • by Compulawyer ( 318018 ) on Monday September 10, 2001 @11:12AM (#2273495)
    I have noticed that when I log into Excite, some pages I view have been loading a 1 X 1 Applet that is transmitting information (at least time spent on the page) back to servers. As far as I am concerned the only uses for a 1 X 1 ANYTHING on a web page are no good.

    I have not yet grabbed the applet and tried to decompile it (mostly for lack of time), so I do not know exactly what it is doing in addition to sending time information, but it struck me as extremely obnoxious.

    I am stuck using Win98 and Netscape 4.7 at work, so I cannot use a more enlightened browser that selectively grants/denies JavaScript and Java access by domain name. So...I am stuck being watched to a certain extent.

    Is it just me or is anyone else sick and tired of being treated like some company's asset? I am tired of the companies I deal with trying to suck every possible dime out of the relationship they have with me -- ESPECIALLY when it comes to selling my personal information.

  • How difficult is it to configure one's web browser so that it rejects most of the scripting junk out there? If you are using IE, check out the security zones feature that allows you to toggle scripting, cookies, and so forth depending on to which of four security zones a particular site belongs. I'm sure the free browsers have something much more sophisticated. Use it!

  • Wouldn't it be illegal if I tried to insert something into their web server to spy on what information they're collecting about me while I'm viewing their web page? Is my computer not protected by the same laws that theirs are?

    I suppose you could argue that I'm leaving myself open to such invasion if I don't disable scripting - so why doesn't that argument hold when a web site doesn't close known security holes? At least there're valid reasons for wanting to leave scripting enabled!

    Hmm - if I declare my actions in browsing their website - mouse movements other than intentional feedback like clicking on a link - to be copyrighted material, could I get protection from the DMCA? Then that script to spy on me would be a tool designed to crack my copy protection scheme (which would consist of recording all mouse movements to a file with "(C) 2001" at the top and encrypting it by XORing with a 'secret' key). The fact that they intercept it before I record it just means that they have found a technical means of bypassing my protections.
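    For what it's worth, the "protection scheme" really is as trivial as it sounds -- a single-byte XOR pass, sketched here purely to illustrate the joke (it is, of course, no real protection):

```javascript
// XOR-"encrypt" a string with a one-byte key. Running it twice
// returns the original text, which is exactly why it's a joke.
function xorCrypt(text, key) {
    var out = "";
    for (var i = 0; i < text.length; i++) {
        out += String.fromCharCode(text.charCodeAt(i) ^ key);
    }
    return out;
}

// The hypothetical "copyrighted" mouse log, duly "protected":
var protectedLog = xorCrypt("(C) 2001 mouse movements...", 42);
```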
  • by pjrc ( 134994 ) <paul@pjrc.com> on Monday September 10, 2001 @11:24AM (#2273577) Homepage Journal
    ... as the author of a modestly-sized website (about 100 pages), it would be nice to know which parts people are actually reading. Actually, what I've often wanted to know is what parts confuse my readers and where they need more help. Sometimes I get this via email questions, but still it's very hard to know what to do to improve specific parts of the site.


    Of course, there probably would be abuses of privacy by "marketing firms", but in the case of website that actually try to provide really useful information, this sort of feedback could really help direct the very limited time and effort towards improving the parts of the site that really need it. In my own case, it's often the classic example of a long-time expert not being able to identify with the pains of brand new users.


    Of course, there is the traditional usability study approach. Maybe someday I'll spend some money and do it.

  • "So when you act like you know where you are going on a place where you have no reason to know, then we know you have been there before."

    I don't know about the rest of you, but when I visit any website, even one I've never seen before, I use my eyeballs before I move the mouse--it's naturally much more efficient that way. In fact, most of the time it's my scrollwheel that's moving, and not the mouse itself.

    Honestly, I consider my mouse movement patterns almost completely useless, and I have no idea what good a website that "changed according to mouse behaviour" could possibly do me. Well, maybe links that I almost never hover could be tucked away; but I doubt ads would be included in that bunch.

    Eye-tracking has much greater potential...

  • This collection technique is implemented using current technology and does not require any additional software on the user's browser.

    This suggests that it's probably done with JavaScript. People who care about privacy, security, and avoiding annoyances haven't had JavaScript enabled in 5 years. Although that technically makes it "opt-out", turning off JavaScript is such a basic, almost automatic thing for web users to do that it's practically "opt-in."

    1) WebWasher (www.webwasher.de). It blocks cookies, scripts, animations, web bugs, referer URLs, images, and a whole host of other things. It is highly configurable, can be used as a proxy server if you have an in-house LAN connected to a shared broadband connection, and is much more powerful than Guidescope (which Junkbuster recommends for use under Windows).

    2) Ad-Aware 5.6 (www.lavasoftusa.com). Run this at least once a week. It will find any ad tracking cookies, spy-ware and various other privacy invading data/programs that get left on your machine. The new version scans your memory, your registry, and your entire HD (very quickly). It finds and removes everything privacy invasion related.
  • I guess that filtering the JavaScript involved would do the trick, or selectively writing a filter with Proxomitron to catch the cookies, etc.

    This shouldn't be too hard to defeat, regardless.

    What gall for trying though. It reminds me of a Gibson story, (fuzzy on the details) but essentially "sensing" the patterns in someone's data enabled the corporations of the future to do precise targeting of consumers. Scary how we inch towards that every passing year.

    Hotblack_Desiato

  • Oh brother ... (Score:3, Informative)

    by Christianfreak ( 100697 ) on Monday September 10, 2001 @11:42AM (#2273668) Homepage Journal
    Typical /. "Big Brother is watching us" paranoia. Come on! Did no one read the article? Some interesting points about it:
    • No client software required: In other words, it's a stupid JavaScript. Translation: you can turn it off.
    • They only tested 17 people. Translation: either the MIT student doing this is an idiot or the BBC article is hype. I vote for "C": both.

    This is not Your Rights Online, nor is it news. Let's go back to bashing M$oft.

    Rant Mode OFF.
  • I don't browse webshites [sperel.com].
  • Ok folks, before everyone goes ballistic about the latest way to monitor what goes on in a browser (I'm probably too late), consider this. If they really see how we ignore banner ads and slam close popup windows, is this a bad thing? Maybe the Evil Marketing People(tm) will finally realize what doesn't work with ads and quit doing them. Maybe they'll realize that more-intrusive-ad!=more-attention.

    Sometimes you have to look at things for what they can do positively, not just negatively.
  • 1) Voice navigation - I think that I finally found an everyday use for this...

    2) Run your own Spider - Jam the recording site with "Noise" web traffic associated with your cookie/session. A good spider/robot could simulate mouse coordinates, etc.

    Just a couple of quick thoughts. I'm sure there are more...

    jeremiah cornelius
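
    For the curious, the "noise" idea above is easy to sketch. Here is a minimal, hypothetical generator of fake mouse samples (the field names and timing model are invented, not any real tracker's API); wiring it into an actual spider is left as an exercise:

```javascript
// Generate n fake mouse samples inside a w-by-h viewport, spaced
// stepMs apart -- roughly the shape of data a tracker might collect.
// Purely illustrative; no real tracking endpoint is assumed.
function makeNoise(n, w, h, stepMs) {
    var samples = [];
    var t = 0;
    for (var i = 0; i < n; i++) {
        samples.push({
            x: Math.floor(Math.random() * w),
            y: Math.floor(Math.random() * h),
            t: t
        });
        t += stepMs;
    }
    return samples;
}

var junk = makeNoise(1000, 1024, 768, 50);
```

    A tracker aggregating this would see uniform garbage instead of real reading behaviour.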

  • by Heem ( 448667 )
    Why do people need all this fancy mouse tracking?
    (grumpy old man)
    In MY day, we looked at the logs to see what people are looking at
    (/grumpy old man)

  • More details (Score:2, Informative)

    by Trollsfire ( 243025 )
    The article is a little short on details as to how the technology works, and there has been some speculation already. However, this being academic research, let us not forget that more details are (often) readily available. The Project Proposal [mit.edu] (pdf format, 138K) and a brief paper [mit.edu] (pdf format, 77K) are available from MIT's web site.

    Their stated motivation is:

    Content providers have a vested interest in the results of mouse movement data analysis. Our system provides the means to find out exactly how users navigate their page and thus affords an extensive user model.


    The technique they used was to "add Javascript externally to an existing web page." They mention using barnesandnoble.com, amazon.com, and ashford.com explicitly, but more had to be used given the nature of the tasks. This seems to imply that they are able, as a third party, to add the JavaScript tracking to already-existing sites. However, they may also simply be exploiting the fact that they control the testing environment, such as by inserting the code via an HTTP proxy. Details of how the code was introduced are not given, and would be necessary to determine how much of a privacy threat this is.
  • This is Stupid! (Score:2, Interesting)

    by PotatoHead ( 12771 )
    Reality check here...

    People are going to collect information on the sites you visit. If you don't like it, there are some easy ways to get around the problem.

    Personally, I don't mind most sites looking my stats over. This sort of thing keeps a lot of sites free. There are worse options like interrupted browsing. All they have to do is remove the page from direct access and lots of bad things happen. Let the marketing departments pay for something easy that those of us who want to can get around. The alternatives are harder and costly.

    1. Fast connection means nothing because you have to wait along with everyone else for the ad server to show you the ad, then the page....

    2. Searching becomes harder.

    3. The web becomes less cross-platform as the ads require tools not available everywhere.

    So,

    Use an anon service and surf that way if it is a problem.

    Or here is another option: enable your usual blocking tools, hit the page, copy it to local storage, and read as long as you want.

    I will do this anyway from time to time because I want to archive some content for reading later offline or on a PDA.

    Big deal.

  • Or perhaps someone else is already doing this, and hasn't told you.

    Let's see... I think it's doubleclick.com that places ad banners that track people across server boundaries and then sells the results. Web servers monitor traffic and analyze logs to find out which pages are getting hit most frequently. I'd bet with a little bit of creative Java (or maybe even JavaScript) you could tell how far down the page someone is (can anyone verify or disprove this?), and from that figure out their reading speed, what sections of the page they weren't interested enough to read, and which they just skimmed, and who knows what else. Combine these technologies and there you have it: an exact picture of what interests a specific person. Throw in an IP address, and maybe some demographic information, and you have an awesome marketing tool, with no new technology involved. I wouldn't say "perhaps"; I think it's a sure bet someone is already doing this.
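
    On the "how far down the page" question above: in principle, yes -- you can sample the scroll position (window.pageYOffset, or document.body.scrollTop on IE) on a timer and ship the samples home. Here's a rough sketch of just the arithmetic; the sampling loop and the server call are omitted, and treating "fraction scrolled past" as "fraction read" is an assumption, not a measurement:

```javascript
// samples: array of {t: milliseconds, frac: fraction of page scrolled past, 0..1}
// totalWords: word count of the page.
// Returns the deepest point reached and a crude words-per-minute figure.
function readingStats(samples, totalWords) {
    var maxFrac = 0;
    for (var i = 0; i < samples.length; i++) {
        if (samples[i].frac > maxFrac) maxFrac = samples[i].frac;
    }
    var elapsedMs = samples.length
        ? samples[samples.length - 1].t - samples[0].t
        : 0;
    var wordsRead = Math.round(maxFrac * totalWords);
    return {
        deepestPoint: maxFrac,   // 1.0 means the reader reached the bottom
        wordsPerMinute: elapsedMs > 0
            ? Math.round(wordsRead / (elapsedMs / 60000))
            : 0
    };
}
```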

    The targets specified in this article were obviously computer-illiterate, run-of-the-mill folk living the average life. The kind of person who actually reads the help files that come with Microsoft products. Also, the kind of person who can't read anything without a physical reference point, i.e. a mouse cursor. For people like myself, who don't hold on to the mouse unless the intention is to click what is being looked at, it wouldn't work. And I don't shove my cursor off to the right; it's not that big, it does fine right where it's at.
  • Difficult? (Score:2, Insightful)

    by jallen02 ( 124384 )
    This would take me a week to implement in JavaScript. Set up listeners for nearly any and all mouse events. Log them using a JavaScript object. Serialize it to XML or some tighter data format. Analyze later. The tricky part is the analysis and figuring out exactly what you want to have listeners for. Still... not that difficult at all.

    Jeremy
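
    The logging half of that design fits in a dozen lines. The listener wiring (document.onmousemove, etc.) is kept in a comment so the core runs anywhere, and the "tighter data format" here -- semicolon-joined "x,y,t" triples -- is just one invented possibility:

```javascript
// Accumulate mouse events and serialize them compactly for a later
// POST back to a server.
function MouseLog() {
    this.events = [];
}
MouseLog.prototype.record = function (x, y, t) {
    this.events.push([x, y, t]);
};
MouseLog.prototype.serialize = function () {
    var parts = [];
    for (var i = 0; i < this.events.length; i++) {
        parts.push(this.events[i].join(","));
    }
    return parts.join(";");
};

// In a real page, hypothetically:
//   var log = new MouseLog();
//   document.onmousemove = function (e) {
//       log.record(e.clientX, e.clientY, new Date().getTime());
//   };
```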
  • "I can tell because when you read a webpage, you do one of a couple of things. You either shovel the mouse off to the right so that it is out of the way, or you will walk down the page with your mouse," he told the BBC's Go Digital programme.

    Upon reading this I looked at my mouse cursor's position. It was dead in the middle of the screen, over some of the text I had read before I scrolled down using my mouse wheel, and had been there since I opened the page. (I take the second case he describes as 'you use the mouse to hold the vertical position on the page where you're currently reading', as opposed to 'you use your mouse wheel to scroll'.)

  • From the article:

    "I can tell because when you read a webpage, you do one of a couple of things. You either shovel the mouse off to the right so that it is out of the way, or you will walk down the page with your mouse,"

    I'm presuming here that what the person means by walking your mouse down the page is that you are "reading" the text with your mouse pointer (like using your finger in a book). Many people here mention that they get around this by using their scroll wheel. They can probably track scroll wheel movements pretty easily. A simplistic method would be in JavaScript. You should just need something like:

    function trackMouse(e) {
        // record the position and restart a dwell timer here
    }

    if (document.addEventListener)           // Netscape 6 / Mozilla
        document.addEventListener("mouseover", trackMouse, true);
    else                                     // IE
        document.onmouseover = trackMouse;

    in your scripting area. I think this will take care of 'tracking' your mouse anywhere on the screen. So if the mouse is anywhere over the document, an event is fired off calling the function. I'm sure you've seen a site that has those annoying 'mouse trails' that follow your cursor... similar concept. It's not limited to links, so provided your mouse pointer is anywhere over the page, it will track it. If you are using the scroll wheel, the page moves under the mouse, but the pointer ends up over a different section of the page, so it looks like the mouse has moved. The function could then start a timer every time it is called, which could give you an idea of how long someone spends viewing a portion of the screen before moving on (scrolling down, etc.).

    Now, you could probably circumvent this by putting the mouse cursor off the browser window altogether and using the arrow keys to scroll. But you'll probably need to tab between the links in order to get to the one you want. This selects each link, which again should be viewable through a JavaScript event (can't remember the handler off the top of my head; onfocus perhaps?) tagged to each link.

    Other parts of the article mention being able to provide you with a site that tailors itself to you on the fly. Simple server-side scripting will do this. However, I fear sites becoming over-zealous with a feature like this. Many sites end up only providing you with the common content they think you want, while hiding the content they think you don't want. This is presumably to speed up my experience, because I won't have to download the other site information (quicker access over those modem links). After a while, I might not know what said site has to fully offer, as I get 'stuck in a rut' so to speak. They would need a 'show everything the site has' (site map) link on every single page to help offset this. Unfortunately, many sites don't adhere to this simple requirement. Consequently, many users never use certain sites to their full potential.

    - A non-productive mind is with absolutely zero balance.

    - AC
    Of course this was made by the 'academics'... again MIT creates something without asking whether they should. This is always the case, though. Too many people care more about 'can we do this' than 'should we do this'.

    Why isn't MIT trying to figure out how to make SMTP a secure method of communication? Or adding a better way of removing spam from mail servers?

    You know what I'd rather see is a way for an end user to set up server-side spam filters, so that one does not have to download spam email to the machine and have the email client do the filtering. This would eliminate 50% of my junk email and probably yours too.

    Why can't they create something useful to the users... guess this means that there needs to be a privacy project started on SourceForge... What's a good name for that???

  • They're going to have a tough time with me, considering I usually switch between an average of 10 browser windows at one time...
  • Doesn't the latest version of Opera support mouse gestures a la Black & White? Wouldn't this wreak havoc on any data they gather using this mouse position tracking system? I can just see hordes of Opera-using /.ers descending on the first website to employ this methodology for the sole purpose of screwing up the stats...

  • by eram ( 245251 )

    I found the Web page of project "Cheese" [mit.edu] at MIT. They don't seem to be using their own mouse tracking technique yet. The publication [mit.edu] that the researchers have produced doesn't provide much more information than the BBC article.

  • by malkavian ( 9512 ) on Monday September 10, 2001 @02:59PM (#2274630)
    Well, one thing that strikes me about this is:

    For all this data collected from all the surfers to a busy site, where on earth are they going to store it all for any length of time??

    I work for a company with sizable web traffic (250 million pageviews/month). The bane of my life is the logs. Processing them, and storing them for the length of time needed to draw meaningful trends, takes a huge amount of space. All of which needs to be on a RAID, just in case...
    Then, of course, there's the software to mine this collection of data, the amount of time required to search the disks for the relevant data, and the setting of the resolution of the data capture from the mouse (needs to be pretty fine resolution to achieve any meaningful results)...
    Just think, if they adopted this scheme, it'd be great fun to write a device driver for a pseudomouse that sat the cursor over the web browser, and randomly moved it around, generating millions of data events, all of which get logged on the web site archives...
    It's fine to do this for a small scale site, with plenty of funding, but I think there'd be huge problems with the sheer logistics of collecting and analysing this data for anyone without almost bottomless pockets as far as funding goes...
    Personally, I don't reckon this will be a big brother tech anytime in the near future...

    Cheers,

    Malk
    • For all this data collected from all the surfers to a busy site, where on earth are they going to store it all for any length of time??

      Duh. Most of the posts crying "invasion of privacy" have been far off the mark. This isn't technology for tracking individual users -- maybe it could do that, but recording every mouse movement individually would overload most servers. It's an attempt to collect stats on what parts of the page draw attention. Occasionally someone would use that to improve their web site. Mostly, advertisers are going to try to use it to find out if anyone even _looks_ at their ad. I don't think tracking mouse movements will do that too well, but in the absence of equipment to spot where your eyes are looking, they'll record the mouse movements and try to deduce something, and then some dumb suits in marketing will take this faulty data as gospel.

      And the real problem arises if this is actually accurate enough to reveal that no one looks at the ads... The first generation of spyware revealed that no one clicks on banner ads -- and millions of $ were pulled out of internet web sites and put into TV and magazine ads instead. No one looks at those either, but there is no way of showing just _how much_ we don't look at them. Improve the tools for measuring user interest in ads, and you are going to lose even more ad $...
