
Homeland Security Department Testing "Pre-Crime" Detector

Posted by timothy
from the when-dowsing-meets-voight-kampff dept.
holy_calamity writes "New Scientist reports that the Department of Homeland Security recently tested something called Future Attribute Screening Technologies (FAST) — a battery of sensors that determine whether someone is a security threat from a distance. Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance. In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."
  • sensors... (Score:5, Insightful)

    by adpsimpson (956630) on Tuesday September 23, 2008 @11:03AM (#25121825)

    Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance

    ...And most importantly, skin colour?

    Seriously, is there anything a device like this can do that's either more useful or less invasive than a human watching people walking past and profiling/screening them on what they can see?

  • Err (Score:5, Insightful)

    by InvisblePinkUnicorn (1126837) on Tuesday September 23, 2008 @11:04AM (#25121831)
    Does this sound idiotic to anyone else? Of course it's going to work for people who are told how to act in order to get the device to flag them.
  • by fprintf (82740) on Tuesday September 23, 2008 @11:04AM (#25121855) Journal

    The summary talks about the subjects being told to act suspicious. So, if you are told to act suspicious, is that any different from actually planning something nasty? I suppose it is difficult to find subjects who are unaware they are being observed, and yet also intent on doing something bad. Nevertheless, I'd hypothesize there might be significant, observable differences between the two groups.

  • Doesn't matter (Score:5, Insightful)

    by MadMidnightBomber (894759) on Tuesday September 23, 2008 @11:05AM (#25121873)

    "In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."

    None of that matters - what's important is the false positive rate, ie. the proportion of people with no malicious intent who get flagged up. If it's as high as 1% the system will be pretty much unworkable.
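The parent's point is easy to check with a back-of-the-envelope sketch in Python. The traveler volume, threat rate, and 1% false positive figure below are illustrative assumptions, not DHS numbers:

```python
# Even a 1% false positive rate swamps screening when real threats are rare.
def flagged_counts(population, threat_rate, sensitivity, false_positive_rate):
    """Return (true positives, false positives) for one day of screening."""
    threats = population * threat_rate
    innocents = population - threats
    return threats * sensitivity, innocents * false_positive_rate

# Assumed: 200,000 travelers/day, 1-in-a-million threat rate, 78% sensitivity.
tp, fp = flagged_counts(200_000, 1e-6, 0.78, 0.01)
print(tp, fp)  # roughly 0.16 real threats vs ~2,000 innocents flagged per day
```

Under those assumptions essentially every stop is a false alarm, which is why the false positive rate, not the headline accuracy, decides whether the system is workable.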

  • Really? (Score:4, Insightful)

    by gehrehmee (16338) on Tuesday September 23, 2008 @11:05AM (#25121875) Homepage

    In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,

    Isn't this a little off-base? People who are really about to commit a crime, as a rule, will be explicitly trying not to look suspicious.

  • by Anonymous Coward on Tuesday September 23, 2008 @11:10AM (#25121991)

    You will always get these sorts of results with forced actions. If I made a happiness detector (via facial expressions) and told half of the group to smile and the other half not to, I bet it would pick that up. Now, what if half the group were given a personal-responsibility toy and the other half a cuddly teddy bear? I bet it wouldn't be accurate anymore...

    A better test would be to give the group water bottles. Most of the group are given real water in bottles. A few of the group are given water bottles filled with vodka. All subjects know what they are carrying. The goal is to finish an AA meeting, drinking your drink. If you get through the meeting, you are given a reward (say $20). If you don't, you owe $20.

    Any bets that this would be much harder to figure out?

  • by Joce640k (829181) on Tuesday September 23, 2008 @11:14AM (#25122065) Homepage

    All we've got is a device which can spot normal people trying to be visibly "suspicious".

  • by dbyte (1207524) on Tuesday September 23, 2008 @11:15AM (#25122101)
    God help the nervous flier :)
  • Re:sensors... (Score:5, Insightful)

    by $RANDOMLUSER (804576) on Tuesday September 23, 2008 @11:16AM (#25122109)
    Why yes, yes there is. It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening. It can further the "show us your papers" society we've been building and seem so enamored of. It can supply the mindless thugs at security checkpoints an ironclad "the machine says so" excuse to hassle harried, irritated travelers. It can further the "security theatre" in all aspects of everyday life. In short, it can do nothing positive.
  • hmm (Score:3, Insightful)

    by lucky130 (267588) on Tuesday September 23, 2008 @11:18AM (#25122171)

    I notice both of those success rates are less than 100%. Personally, I don't want to be one of those innocent 20+% that gets harassed.

  • Re:Err (Score:5, Insightful)

    by Yvanhoe (564877) on Tuesday September 23, 2008 @11:18AM (#25122179) Journal
    If I recall correctly, the last time I traveled to the USA I had to fill out a form stating that the intent of my travel was not to kill the US president. The people who create such forms would probably fund research on a "suspicious person detector."
  • Re:sensors... (Score:5, Insightful)

    by electrictroy (912290) on Tuesday September 23, 2008 @11:21AM (#25122241)

    Good point. A real terrorist doesn't show signs of distress, because he doesn't consider his actions immoral. He thinks killing IS the moral thing to do.

  • Re:sensors... (Score:5, Insightful)

    by Otter (3800) on Tuesday September 23, 2008 @11:22AM (#25122259) Journal

    ...And most importantly, skin colour?

    That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

  • Re:sensors... (Score:5, Insightful)

    by moderatorrater (1095745) on Tuesday September 23, 2008 @11:26AM (#25122343)
    He'll still show signs of stress, though. Just because you think it's right to get into a fight doesn't mean that the adrenaline doesn't start pumping.

    The real problem with this is that the number of wrongdoers is small while the pool for false positives is huge. If 5% of people have some intent that should be picked up by this, then only about 4% of everyone screened will be correctly flagged. At that rate, they'd have to have less than a 5% false positive rate just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?

    And that's assuming that 1/20 people have some intent that would need to be picked up by this, while the actual rate is almost certainly smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.
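A quick sketch of the arithmetic in the two paragraphs above (the 5% prevalence and 78% detection rate come from the comment; everything else is derived):

```python
# With 5% prevalence and 78% detection, how low must the false positive
# rate be before half of the people flagged actually have ill intent?
prevalence = 0.05
sensitivity = 0.78

true_pos_share = prevalence * sensitivity        # share of everyone correctly flagged
# Half the flags are correct when false positives equal true positives:
# (1 - prevalence) * fpr = true_pos_share
required_fpr = true_pos_share / (1 - prevalence)
print(round(true_pos_share, 3), round(required_fpr, 3))  # 0.039 0.041
```

So even with an implausibly high 1-in-20 rate of ill intent, the detector needs a false positive rate of about 4% before its flags are right even half the time.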
  • Re:sensors... (Score:4, Insightful)

    by arth1 (260657) on Tuesday September 23, 2008 @11:29AM (#25122399) Homepage Journal

    That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

    So, who are the non-humans that calibrate the systems?

  • Re:sensors... (Score:5, Insightful)

    by Otter (3800) on Tuesday September 23, 2008 @11:29AM (#25122419) Journal
    Absolutely untrue. Suicide bombers fail as often as they do (in Israel, Iraq, Sri Lanka,...) because they're usually bug-eyed, sweating, twitching, and frequently high. Highly trained operatives might be reliably calm, but the run-of-the-mill terrorist is usually pretty obvious, although they can still often kill people before someone can stop them.
  • Re:sensors... (Score:4, Insightful)

    by Aphoxema (1088507) * on Tuesday September 23, 2008 @11:30AM (#25122423) Homepage Journal

    It's okay since only a few people will get hurt in the process.

  • by theverylastperson (1208224) on Tuesday September 23, 2008 @11:32AM (#25122469) Homepage
    Awesome, now we have a great tool to accuse people with. How can anything with an accuracy of 78% be worth using? On a grading scale it's a C+. How many innocent people (22%) will be caught up in this mess? If the government is trying to create a rebellion by the people, then this is a perfect method.

    How about hiring intelligent guards? Or people with common sense?

    If we spent 10% of what we spend on this kind of crap on actually solving the real problems we face, then we might actually get somewhere. But as long as we live in this ultra-paranoid world full of invisible terrorists, we'll never get the chance to overcome the real problems. What a shame and what a waste.
  • Re:sensors... (Score:2, Insightful)

    by TheVelvetFlamebait (986083) on Tuesday September 23, 2008 @11:40AM (#25122623) Journal

    But maybe a lot of people will be saved from a lot of hurt at the end of it. Did you consider that possibility?

  • Re:Not even close (Score:1, Insightful)

    by Anonymous Coward on Tuesday September 23, 2008 @11:40AM (#25122629)

    The other 22% weren't very good actors.

    Another way of looking at this is that it had a 78% false positive rate. They were told to *act* suspicious. None of them actually intended to do anything wrong.

  • Re:sensors... (Score:5, Insightful)

    by Chris Burke (6130) on Tuesday September 23, 2008 @11:44AM (#25122699) Homepage

    The real problem with this is that the number of wrongdoers is small while the pool for false positives is huge. If 5% of people have some intent that should be picked up by this, then only about 4% of everyone screened will be correctly flagged. At that rate, they'd have to have less than a 5% false positive rate just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?

    And that's assuming that 1/20 people have some intent that would need to be picked up by this, while the actual rate is almost certainly smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.

    You've hit it on the head. About 200,000 people go through Chicago O'Hare, just that single (though large) airport, every day. And so far, zero terrorist attacks have been launched out of O'Hare. The odds that a person this machine flags is innocent are ridiculously high, even if it has high specificity.

    Also, aside from the raw statistics of the thing, there's another compounding factor that makes this even more useless*: it's rather simple for terrorists to game the system with dry runs.

    Terrorist organizations already tend to use people not on our radar for attacks, so if they get pulled out of line on a dry-run, we won't have anything on them and it'll look like yet another false positive. Our young jihadi goes through the line with a bunch of his buddies, and everyone who gets pulled out of line doesn't go through the next time. Once you've discovered the group of people who aren't detected by the terrorist detector/profilers/crystal ball, the hot run can proceed with little fear of getting caught.

    * For the stated goal, of course, not the goal of Security Theater for which a magical terrorist detector is great.

  • Re:sensors... (Score:1, Insightful)

    by Anonymous Coward on Tuesday September 23, 2008 @11:44AM (#25122703)
    Got it - the ends justify the means. Somewhere, someplace, I think I've heard that before. Oh yeah, it was that Machiavelli dude - "The Prince" wasn't it? You're going against the groupthink here on the slash if you really think it is cool to hurt some folks short term to protect others long term...
  • As a Marine (Score:3, Insightful)

    by wernox1987 (1362105) on Tuesday September 23, 2008 @11:45AM (#25122719)
    I never doubted what I was doing was right, or justified, but that didn't stop my heart from pounding out of my chest.
  • by TheMeuge (645043) on Tuesday September 23, 2008 @11:47AM (#25122753)

    So this device was 80% successful at picking up suspicious activity from PEOPLE WHO WERE ASKED TO LOOK SUSPICIOUS.

    Wow, amazing! Something any police officer who has served a couple of years would be able to do with 100% (or nearly so) accuracy.

    What is missing is an assay of how many people it would flag if they were told to behave as if they were SCARED. You know... scared of being flagged for behaving abnormally, strip-searched, tortured, and never seeing their families again. Something tells me that the rate of false positives on this machine will overshadow the rate of false negatives by a very large margin.

  • Re:Err (Score:3, Insightful)

    by oldspewey (1303305) on Tuesday September 23, 2008 @11:48AM (#25122767)

    Does this sound idiotic to anyone else?

    Yes, it's completely idiotic. What these geniuses have done has nothing to do with security - they have created a bad-amateur-acting detector that boasts ~80% accuracy.

  • Re:sensors... (Score:4, Insightful)

    by ShieldW0lf (601553) on Tuesday September 23, 2008 @11:49AM (#25122803) Journal
    Are you questioning the judgment of the Miniluv [wikipedia.org]?

    That is a clear case of crimethink [wikipedia.org].

    I'm calling thinkpol [wikipedia.org].
  • Re:sensors... (Score:5, Insightful)

    by Kozar_The_Malignant (738483) on Tuesday September 23, 2008 @11:50AM (#25122821)

    Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.

    It isn't the idea of developing an immature technology that upsets people. It is our well-justified fear of the government deploying immature technology. I'd rather not be subjected to a public beta-test of a thoughtcrime detector.

  • Absurdities (Score:5, Insightful)

    by mlwmohawk (801821) on Tuesday September 23, 2008 @11:57AM (#25122953)

    We lose more people to premature death each and every year because we have no health care than we have to terrorism in the whole of the 21st century.

    fear, fear, fear, be afraid, fear, fear, be afraid.

    A young girl wearing a proto-board with blinking LEDs could have been shot dead because of the hysteria.

    fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid.

    You can't say we have nothing to fear, but we have a lot of real and pressing things that need to be focused upon.

    fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid. Threat level purple.

    The U.S.A. has to re-grow its spine. We have nothing to fear but fear itself. Unfortunately, the current powers that be like to rule by exploiting and amplifying the terror of terrorists.

  • Re:sensors... (Score:1, Insightful)

    by Anonymous Coward on Tuesday September 23, 2008 @11:58AM (#25122975)

    In short, it can do nothing positive.

    In other words, whether it works or not, it increases the power of the government over its subjects, and when it doesn't work, it requires more tax dollars to build ever-more-accurate scanners (that are never accurate enough to actually improve security, because that'd put everyone out of work :), and in so doing continue to feed the feedback loop that leads to ever more iron-fisted forms of totalitarianism.

    In short, it can do everything it's designed to do.

  • the end of liberty (Score:5, Insightful)

    by globaljustin (574257) <justinglobal@@@gmail...com> on Tuesday September 23, 2008 @12:00PM (#25123015) Homepage Journal

    All we've got is a device which can spot normal people trying to be visibly "suspicious".

    You are correct. From TFA:

    Some subjects were told to act shifty, be evasive, deceptive and hostile. And many were detected.

    It is absolutely ridiculous to think that they have produced any kind of test results that would indicate a functioning system. This is government and business at its absolute worst.

    Not only is DHS trying their damnedest to become big brother, they are doing it in the most incompetent way possible.

    This tech will never, ever work. All it can measure is physiological attributes. Correlation is not causation. Just because some percentage of people who are intending to commit a crime have certain physiological characteristics does not mean that anyone with those characteristics is a 'pre-criminal' and should be questioned. I weep for the future.

    And even if, in some far-flung scenario, it did become functional it would still be illegal. It is invasion of privacy. Our thoughts and intentions are private. They mean nothing until we act on them. Human thought is vast and unlimited, part of our nature is boiling down the infinite array of ideas we have into action in the physical world where there are consequences. Everyone has the right to think whatever they want. When they act on it, then that action enters the territory of having (potentially bad) consequences.

    What this evolves into is thought control and that is the end of liberty.

  • Re:sensors... (Score:5, Insightful)

    by rtb61 (674572) on Tuesday September 23, 2008 @12:01PM (#25123033) Homepage

    The biggest problem with this is that if 78% to 80% of people told to act suspiciously can fool the system into believing they intend to commit a crime, then logically those same people should be able to act in the opposite fashion and fool the system into believing they don't. I mean really, what do they think the logic of their analysis represents?

    Apparently, excuses for legal pre-emptive arrests of unsavoury people are the new focus, much like the no-fly lists: a list of politically undesirable people who will be arrested, searched, interrogated, and transferred to a prison facility whilst their identities are confirmed (which I am sure will take no longer than 24 to 48 hours). All this will be done at a range of designated choke points, like train and subway stations and maybe even toll booths.

    Adjust your political alignment, or you will find yourself, your family, and your friends subject to random humiliations, violent arrests, searches including sexual groping, and destruction of private property; of course, you will be released, and it will all be done under a masquerade of legality. I believe some journalists have already experienced exactly this type of pre-emptive arrest at the RNC convention; I don't believe they were particularly impressed with the concept.

  • Re:sensors... (Score:4, Insightful)

    by jd (1658) <{moc.oohay} {ta} {kapimi}> on Tuesday September 23, 2008 @12:02PM (#25123043) Homepage Journal
    There isn't necessarily a "fight" scenario. The individuals could very easily train themselves into regarding their acts as no different from a postman delivering a parcel. Assuming, of course, the person IS the one with the hostile intent - if the system is remotely effective, groups could be expected to migrate to unwitting "volunteers". Of course, such systems may be jammable, depending on how they work. It doesn't matter if vulnerabilities appear to be theoretical - organizations that are willing to, well, burn money (literally at extreme temperatures and pressures) are likely to find exploits because a populace deluded into thinking they are safe would logically be easier to manipulate and control by fear.

    It's easy to move heat around, so any simple thermal camera can be tricked into thinking the person looks normal. This is only useful if the camera is simple. The heat has to go somewhere, so you'd see some point being much hotter than expected, but any software designed to reject absurd anomalies would reject such a point as impossible.

    Facial expressions would logically require a course at an acting school or a few minutes with a bottle of latex and a blow-drier to create a fake facial skin. Criminals would not require the skill of Hollywood. They would only need to fool automatic face recognition and facial expression recognition software. At worst, they'd also need to fool low-res, low frame-rate CCTV operators at range. Most LARP groups have experience at producing very realistic face masks. Learning from them would produce someone who could (if they wanted to) be totally secure against CCTV systems. Many ethnic profilers could logically be fooled with similar methods.

    As for false positives - anyone who is ill will show higher-than-normal heat, as will anyone who has gone jogging or exercising. Anyone caught in a hot car on snarled-up roads will be hot and show an angry, hostile expression. Many in New England are permanently in a state of anger. So, in all probability, 90% of all city-dwellers and New Englanders will be classed as potential terrorists. Of course, I've always been somewhat suspicious of Philadelphia cheese, but that seems to be taking the complaint a bit too far.

  • by clone53421 (1310749) on Tuesday September 23, 2008 @12:06PM (#25123133) Journal

    what right does the public have to know my face?

    The same "right" that you have to ride an airplane.

    what right does the airline have to know my face?

    Well, last time I checked they still required all passengers to prove their identity before letting them on an airplane.

  • by magus_melchior (262681) on Tuesday September 23, 2008 @12:09PM (#25123161) Journal

    I don't know about you, but telling people to "act suspicious" smacks of teaching FBI recruits how to defend themselves against anyone who attacks them armed with fresh fruit. It would be hilarious if the administration (and at times, the Republican Party leadership) were not acting as if Flying Circus was a documentary.

    Notice that there's no discussion of probable cause, let alone 4th Amendment protections. Notice there's little, if any, discussion of how actual terrorists act before the critical steps of their mission. Notice there's never any discussion of the root causes of terrorism among the ones in power, just the extermination of those who practice terrorism-- and even this is largely lip service and security theater.

    I wish I were joking, but this administration is trying to lock-in some very dangerous precedents. Paulson introduced a bailout plan to Congress that included a provision that said, in essence, "don't oversee the Treasury Secretary or the Fed chairman". Not long after that, Dick Cheney himself tries to persuade Congressional Republicans to support the administration's plan. Cheney has a history of preferring to cut Congress out of the loop, legally or otherwise, ever since Nixon resigned. How much do you guys want to bet that the author of the proposed bailout plan isn't Paulson alone? How much do you want to bet that Paulson will use the bailout money and unsupervised power to nationalize even more corporations?

  • Re:sensors... (Score:5, Insightful)

    by moderatorrater (1095745) on Tuesday September 23, 2008 @12:09PM (#25123179)

    but using this to help narrow who to watch would be what this should be used for.

    I can't disagree more strongly. When the flood of false positives starts coming in, they'll quickly start dismissing them. As another poster pointed out, Chicago O'Hare alone has 200,000 people go through it every day; when several thousand of them are flagged as suspicious, you can bet that security will stop caring pretty quickly.

  • Re:sensors... (Score:3, Insightful)

    by maxume (22995) on Tuesday September 23, 2008 @12:12PM (#25123235)

    Did you read a book over the weekend? Did it hurt?

    Orwell got lots of easy stuff right (people like authority...a call for a leader starts with a desire to follow), but he missed the boat on just how easy it has become (and is becoming!) to use computers to not merely threaten to monitor anybody at any time, but to monitor everybody all the time.

    Unfortunately, sarcastic bitching is not the solution.

  • Re:sensors... (Score:5, Insightful)

    by Bandman (86149) <.bandman. .at. .gmail.com.> on Tuesday September 23, 2008 @12:17PM (#25123325) Homepage

    Exactly. Pair this up with the red light cameras, and you've got enough income to drive any city out of recession.

    "I didn't run that red light"
    "No, but you wanted to"

  • Re:Err (Score:4, Insightful)

    by CaptDeuce (84529) on Tuesday September 23, 2008 @12:17PM (#25123327) Journal

    What do those 78% and 80% mean, you ask? Let's look at The Fine Article:

    Some subjects were told to act shifty, be evasive, deceptive and hostile. And many were detected.

    Answer: it's a bad acting detector.

    Seriously, a better test would be to ask test subjects to do something relevant such as, say, defeat the detector (duh!). If the subject fails, something unpleasant, yet harmless, happens: a device emits a startling noise and perhaps belches some smelly smoke. Imagine a grown-up version of the game Operation [youtube.com] (I hate that game). Better yet, have the subject carry the device on their person. The nature of the device would be demonstrated to the subject beforehand, just as a domestic animal is allowed to experience the shock from an electric fence to establish the proper respect for the deterrent.

    I'm getting nervous just describing the damn thing.

  • by davidsyes (765062) on Tuesday September 23, 2008 @12:20PM (#25123377) Homepage Journal

    And then the DHLS agents can all be Commanders Braxton, zipping through the timelines, arresting or aborting people, or arresting and coitus-interrupting the would-be parents, all stating, like the Vidal Sassoon commercial ("And THEY'LL tell two friends, and so on, and so on, and so on..."):

    "I am Commander Braxton, of the DHLS Timeship Aeon. You are being arrested for crimes you WILL commit...", or,

    "I am Commander Braxton, of the DHLS TimePos ICOS. Your sex act is being disrupted to delay or prevent the arrival of the bastard/bitch YOU MIGHT give rise to..."

    So, what of a Kenyan sprinter who, having lost his race, is sweating profusely, dons his sweat pants, runs to a store outside the track event, and is frustrated because he doesn't know what to do? In SOME countries, he might be assumed to be en route to a criminal activity (if the cameras and human monitors don't take into account that he's a jogger...)

    But, what of a hunter who takes down the helpless elk, only to watch a ranger or warden haul it away, or, worse, has some teens or (gasp, senior citizens bored with life) come along and take that elk away? But, the shooters head to town for some beers. If caught on a camera, would THEY be suspected of prepping to kill or harm anyone?

  • Re:sensors... (Score:5, Insightful)

    by db32 (862117) on Tuesday September 23, 2008 @12:21PM (#25123413) Journal
    Actually, the real purpose is to pull out those kids who are nervous about leaving home for the first time going to college or something. That way they can scare them into not turning into one of those dirty liberal elitist intellectuals that would dare question the authority of the system.
  • Re:sensors... (Score:2, Insightful)

    by maxume (22995) on Tuesday September 23, 2008 @12:23PM (#25123459)

    If there is a 5% chance that a brown Muslim is a terrorist and a 0.00001% chance that a white Christian is a terrorist, you still can't usefully screen something like 100 people based on race and religion (because you overwhelm any sort of useful secondary screening with false positives).

    If the vast majority of Muslims were terrorists, it would be simplest to simply not allow Muslims to fly (at least not on the same planes as everybody else). Since only a tiny minority of Muslims are terrorists, "Muslim" is not a useful piece of information.

  • Re:sensors... (Score:5, Insightful)

    by cmr-denver (1315039) on Tuesday September 23, 2008 @12:25PM (#25123489)
    Actually, better yet, don't tell them it's a dry run ahead of time. Have them go through security to be inside by a specific time. Then call them, and say "It's a go" or "Nevermind, enjoy your trip." After a couple of "Nevermind" runs and not getting pulled over, you should know who to send...
  • Re:sensors... (Score:5, Insightful)

    by gstoddart (321705) on Tuesday September 23, 2008 @12:29PM (#25123575) Homepage

    but he missed the boat on just how easy it has become (and is becoming!) to use computers to not merely threaten to monitor anybody at any time, but to monitor everybody all the time.

    Given that he published it in 1949, he can be forgiven for not foreseeing modern computers.

    In terms of showing how pervasive and evil a surveillance society can be, he's still highly relevant.

    Pointing out just how eerie something like an automated "future crimes" concept is can hardly be called mere sarcastic bitching -- I'm betting an awful lot of people read that summary and thought "holy crap!!"; I sure as hell did. The sheer idea of being detained or hassled because some computer suggested you might be stressed is nuts. It's scary to think this could give them any grounds to act on anything more than a very cursory level -- I mean, talk about your unreasonable searches; people being told they need to get the rubber glove treatment because some computer program identified them as stressed is lunacy.

    Time was when one would have thought it impossible for the USA to degenerate into a place where this would be happening. Now, it's hard to think of how one would stop it. Spending billions of dollars to make all of the scary stuff in Orwell come true is frightening to some of us.

    Cheers

  • Re:sensors... (Score:5, Insightful)

    by ByOhTek (1181381) on Tuesday September 23, 2008 @12:32PM (#25123623) Journal

    If the terrorists know it's a dry run, then their responses will be different - amongst other things, if they are caught, there will be no evidence against them and they'll have plausible deniability.

    Still, I can't see this as having a low false positive rate.

    - Guy going home to his beloved but too-oft-left-alone wife is nervous over the obvious.
    - Gal had too much to drink last night and woke up with someone... unusual. Worried about a few things that could really change her life.
    - [insert various nervousness-inducing-mental-conditions-here] sufferer forgot to take his/her medicine.
    - First time flier.

    The list can go on.

  • Re:sensors... (Score:3, Insightful)

    by compro01 (777531) on Tuesday September 23, 2008 @12:36PM (#25123703)

    You assume that the immature technology in question is even based on a workable premise and isn't just a massive pit for money, time, effort, and pain with no hope of producing anything useful.

    They told all the people specifically to "act suspiciously", and the damn thing still failed at detecting them 22% of the time!

  • by Chyeld (713439) <chyeldNO@SPAMgmail.com> on Tuesday September 23, 2008 @12:42PM (#25123819)

    I've stolen this from Cory Doctorow

    Paradox of the false positive
    Statisticians speak of something called the Paradox of the False Positive. Here's how that works: imagine that you've got a disease that strikes one in a million people, and a test for the disease that's 99% accurate. You administer the test to a million people, and it will be positive for around 10,000 of them - because for every hundred people, it will be wrong once (that's what 99% accurate means). Yet, statistically, we know that there's only one infected person in the entire sample. That means that your "99% accurate" test is wrong 9,999 times out of 10,000!

    Terrorism is a lot less common than one in a million and automated "tests" for terrorism - data-mined conclusions drawn from transactions, Oyster cards, bank transfers, travel schedules, etc - are a lot less accurate than 99%. That means practically every person who is branded a terrorist by our data-mining efforts is innocent.

    In other words, in the effort to find the terrorist needles in our haystacks, we're just making much bigger haystacks.

    You don't get to understand the statistics of rare events by intuition. It's something that has to be learned, through formal and informal instruction. If there's one thing the government and our educational institutions could do to keep us safer, it's this: teach us how statistics works. They should drill it into us with the same vigor with which they approached convincing us that property values would rise forever, make it the subject of reality TV shows and infuse every corner of our news and politics with it. Without an adequate grasp of these concepts, no one can ever tell for sure if he or she is safe.
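The arithmetic behind the paradox is easy to check for yourself. A quick sketch using Doctorow's hypothetical numbers (1-in-a-million prevalence, a test that's right 99% of the time either way):

```python
# Base-rate fallacy: a 99%-accurate test for a 1-in-a-million disease,
# run on a million people (Doctorow's hypothetical numbers).
population = 1_000_000
sick = 1                                  # 1-in-a-million prevalence
healthy = population - sick               # 999,999

false_positive_rate = 0.01                # flags 1% of healthy people
true_positive_rate = 0.99                 # catches 99% of sick people

true_positives = sick * true_positive_rate        # ~1
false_positives = healthy * false_positive_rate   # ~10,000

# Of everyone the test flags, what fraction is actually sick?
ppv = true_positives / (true_positives + false_positives)
print(f"people flagged: {true_positives + false_positives:.0f}")
print(f"chance a flagged person is actually sick: {ppv:.4%}")
```

The rarer the thing you're hunting for, the more the false positives swamp the true ones, no matter how "accurate" the test sounds.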

  • by kent_eh (543303) on Tuesday September 23, 2008 @01:21PM (#25124473)
    When I am bored (standing in an endless lineup, waiting for a delayed flight, etc) I often look at my surroundings.
    I used to install video equipment, so I look at the installed video monitors and cameras.
    Is noticing security cameras (and the quality of their installation) in an area suspicious?

    I am a model railroader.
    Is it suspicious that I take pictures of trains and their environment so that I can build more accurate models?

    I studied architecture for a time.
    Is it suspicious that I spend a lot of time looking at (and sometimes photographing) interesting buildings?

    Am I acting suspicious when I notice a guard of some sort watching me doing the above, and that I am curious as to how he might react to my perfectly harmless activities in these highly paranoid times?
  • Re:sensors... (Score:3, Insightful)

    by poetmatt (793785) on Tuesday September 23, 2008 @01:22PM (#25124489) Journal

    It's about as accurate as a lie detector. You know, because we all know lie detectors are so perfect. It's not like people know how to game them or anything. The "you can't hide your true intentions; your body will know" part is a 100% fallacy and guaranteed to not be accurate.

    I'm disappointed; gov't spending on some stupid shit here really.

  • by geekgirlandrea (1148779) <andrea+slashdot@persephoneslair.org> on Tuesday September 23, 2008 @01:30PM (#25124679) Homepage
    Nothing. Just like there's nothing to stop the TSA from arresting someone with a phobia of flying (or crowded airports, or fascism...) on the grounds that they "look nervous". You didn't seriously think this had anything to do with catching terrorists, did you?
  • Re:sensors... (Score:5, Insightful)

    by mazarin5 (309432) on Tuesday September 23, 2008 @01:30PM (#25124683) Journal

    Actually, the real purpose is to pull out those kids who are nervous about leaving home for the first time going to college or something. That way they can scare them into not turning into one of those dirty liberal elitist intellectuals that would dare question the authority of the system.

    Because nothing turns a kid into a conservative like a bad run-in with the cops, right?

  • Re:sensors... (Score:2, Insightful)

    by LaskoVortex (1153471) on Tuesday September 23, 2008 @02:27PM (#25125605)

    magical terrorist detector

    Since the test subjects were told to "act", I'd say it's a "magical actor detector". These guys need to learn to do science.

  • by Chyeld (713439) <chyeldNO@SPAMgmail.com> on Tuesday September 23, 2008 @02:44PM (#25125875)

    Out of the 10,000 people indicated as having the disease, only one did. If the purpose of the test is to find those with the disease, then it's wrong 9,999 times out of 10,000 when it reports someone has it.

    Our lovely machine that is currently 78% accurate on 'mal-intent' (sic) detection is going to incorrectly tag 22 people out of every 100 as having mal-intent. With the gp's quoted figure of 200,000 people traveling through O'Hare every day, that means potentially 44,000 people a day incorrectly tagged as terrorists. Not one of them actually a terrorist, just someone caught as a false positive.

    One airport. One day. 44,000 people whose lives have just been screwed over in some manner. And no guarantee that the one terrorist who might show up in every billion people is going to be caught by the machine.

  • by ichimunki (194887) on Tuesday September 23, 2008 @02:49PM (#25125973)
    That means that your "99% accurate" test is wrong 9,999 times out of 10,000! [snip] If there's one thing the government and our educational institutions could do to keep us safer, it's this: teach us how statistics works

    They can start by teaching Cory Doctorow how to count. The hypothetical test is wrong about 10,000 times out of 1,000,000. Assuming, of course, that the test only produces false positives, and not also false negatives. That's what 99% accurate means.
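For what it's worth, the two counts aren't actually in conflict; they answer different questions. Doctorow is counting how often a *positive result* is wrong, while the parent is counting how often the test is wrong overall. A quick sketch with the same hypothetical numbers:

```python
# Same hypothetical: 1,000,000 people, 1 sick, a 99%-accurate test.
population = 1_000_000
sick = 1
healthy = population - sick

false_positives = healthy * 0.01      # healthy people wrongly flagged: ~10,000
false_negatives = sick * 0.01         # sick people wrongly cleared: ~0
wrong_overall = false_positives + false_negatives

positives = sick * 0.99 + false_positives  # everyone the test flags

# Two different "wrong" rates:
print(f"wrong answers overall: {wrong_overall:.0f} per {population}")
print(f"wrong among positives: {false_positives:.0f} of {positives:.0f} flags")
```

Overall the test errs about 1% of the time; among the people it flags, it's wrong nearly every time. Both statements are true, and the second one is what matters if a flag gets you detained.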

  • Re:sensors... (Score:3, Insightful)

    by ckaminski (82854) <ckaminski@poboxUUU.com minus threevowels> on Tuesday September 23, 2008 @02:59PM (#25126171) Homepage
    Oh he saw it perfectly. Orwell's protagonist was caught by a complicit human agent, the shopkeeper. Orwell's message wasn't about fearing machines and their overwatching, but fearing the culture that their use necessarily created. Who watches the watchers? Who prevents abuse?
  • by Chris Burke (6130) on Tuesday September 23, 2008 @04:17PM (#25127549) Homepage

    With the gp's quoted figure of 200,000 people traveling through O'Hare every day, that means potentially 44,000 people a day incorrectly tagged as terrorists. Not one of them actually a terrorist, just someone caught as a false positive.

    One airport. One day. 44,000 people whose lives have just been screwed over in some manner. And no guarantee that the one terrorist who might show up in every billion people is going to be caught by the machine.

    Right, and since it's obviously impossible to do a truly thorough security screening (interview, background check, etc, not that it's likely to even show anything) on 44,000 people a day, the "second-tier" check will be cursory at best, and thus even if you did pick the terrorist out of the line, they are still more than likely going to be considered to be one of the other 43,999 false positives and allowed to go on their merry way.

    The reason this works with medical tests is because the person receiving the tests is not taking them under duress, they want to get diagnosed, and the thing they are looking for is physical and not nearly as nebulous as "the intent to do harm". The tests are just a method of initial screening. E.g. they can use a cheap and simple blood test to rule out some kinds of cancer, but if it gives a positive result they will do a biopsy and know for sure. It's having a "gold standard" diagnosis that makes screens for rare diseases possible and false positives a minor problem.

    There is no "gold standard" for suspected terrorist. Hell we've held people for years outside the reach of the law and still not been able to come to solid conclusions about some people. This idea can never work.

  • by gobbo (567674) <wrewrite AT gmail DOT com> on Tuesday September 23, 2008 @04:50PM (#25127975) Journal

    Sociopolitical fear is a strategy to push the population to the political right.

    The old saw about a conservative being a liberal who's been mugged holds true; all you have to do is mug their minds and they'll cave in.

    It's a sleight of mind in risk assessment: the real risks are automobiles, heart disease (i.e. a botched food system), botched health care, botched education, natural disasters, and crime/poverty. Well, everyday accidents too, but that's just natural selection. Terrorism is about as much of a risk as wayward lightning strikes.

    An increasing real risk in the world is war. Guess who is promoting this risk?

  • by daemonburrito (1026186) on Tuesday September 23, 2008 @05:48PM (#25128755) Journal

    Isn't that a bit circular?

    FWIW, I'm glad she exists. We need more voices that are not afraid to point out that the Friedman meme that laissez-faire capitalism spreads human freedom may not be accurate. It was heresy up until just a few years ago to question that popular opinion. Anything that upsets the true believers is fine with me (btw, I'm not a fan of Klein's).
