
Homeland Security Department Testing "Pre-Crime" Detector

holy_calamity writes "New Scientist reports that the Department of Homeland Security recently tested something called Future Attribute Screening Technologies (FAST) — a battery of sensors that determines whether someone is a security threat from a distance. Sensors look at facial expressions and body heat, and can measure pulse and breathing rate from a distance. In trials using 140 volunteers, those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."
  • sensors... (Score:5, Insightful)

    by adpsimpson ( 956630 ) on Tuesday September 23, 2008 @11:03AM (#25121825)

    Sensors look at facial expressions and body heat, and can measure pulse and breathing rate from a distance

    ...And most importantly, skin colour?

    Seriously, is there anything a device like this can do that's either more useful or less invasive than a human watching people walking past and profiling/screening them on what they can see?

    • Re:sensors... (Score:5, Insightful)

      by $RANDOMLUSER ( 804576 ) on Tuesday September 23, 2008 @11:16AM (#25122109)
      Why yes, yes there is. It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening. It can further the "show us your papers" society we've been building and seem so enamored of. It can supply the mindless thugs at security checkpoints with an ironclad "the machine says so" excuse to hassle harried, irritated travelers. It can further the "security theatre" in all aspects of everyday life. In short, it can do nothing positive.
      • Re:sensors... (Score:5, Insightful)

        by electrictroy ( 912290 ) on Tuesday September 23, 2008 @11:21AM (#25122241)

        Good point. A real terrorist doesn't show signs of distress, because he doesn't consider his actions immoral. He thinks killing IS the moral thing to do.

        • Re:sensors... (Score:5, Insightful)

          by moderatorrater ( 1095745 ) on Tuesday September 23, 2008 @11:26AM (#25122343)
          He'll still show signs of stress, though. Just because you think it's right to get into a fight doesn't mean that the adrenaline doesn't start pumping.

          The real problem with this is that the number of wrongdoers is small while the pool of potential false positives is huge. If 5% of people have some intent that should be picked up by this, then at the reported 80% detection rate about 4% of all people screened will be correctly picked up. At that rate, they'd have to have less than a 5% rate of false positives just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?

          And that's assuming that 1 in 20 people has some intent that would need to be picked up by this, while the actual rate is almost certainly far smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.
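          A quick sanity check of that arithmetic, as a Python sketch (the 5% prevalence is the parent's hypothetical, the 80% detection rate is the trial figure from the summary; the variable names are mine):

          ```python
          # Bayes arithmetic for the parent's hypothetical numbers.
          prevalence = 0.05   # fraction of screened people with ill intent (parent's hypothetical)
          sensitivity = 0.80  # fraction of actual wrongdoers the machine flags (trial figure)

          true_pos = prevalence * sensitivity  # correctly flagged, as a fraction of everyone screened
          innocents = 1 - prevalence           # everyone else

          # For half of all flagged people to be actual wrongdoers,
          # false positives must not outnumber true positives:
          max_fpr = true_pos / innocents

          print(f"True positives: {true_pos:.1%} of all people screened")   # 4.0%
          print(f"Max false-positive rate for 50/50 flags: {max_fpr:.1%}")  # ~4.2%, i.e. "less than 5%"
          ```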
          • Re:sensors... (Score:5, Insightful)

            by Chris Burke ( 6130 ) on Tuesday September 23, 2008 @11:44AM (#25122699) Homepage

            The real problem with this is that the number of wrongdoers is small while the pool of potential false positives is huge. If 5% of people have some intent that should be picked up by this, then at the reported 80% detection rate about 4% of all people screened will be correctly picked up. At that rate, they'd have to have less than a 5% rate of false positives just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?

            And that's assuming that 1 in 20 people has some intent that would need to be picked up by this, while the actual rate is almost certainly far smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.

            You've hit that on the head. About 200,000 people go through Chicago O'Hare, just that single (though large) airport, every day. And so far, zero terrorist attacks have been launched out of O'Hare. The odds that a person this machine flags is an innocent are ridiculously high, even if it has high specificity.

            Also, aside from the raw statistics of the thing, there's another compounding factor that makes this even more useless*, which is that it's rather simple for terrorists to game the system with dry runs.

            Terrorist organizations already tend to use people not on our radar for attacks, so if they get pulled out of line on a dry-run, we won't have anything on them and it'll look like yet another false positive. Our young jihadi goes through the line with a bunch of his buddies, and everyone who gets pulled out of line doesn't go through the next time. Once you've discovered the group of people who aren't detected by the terrorist detector/profilers/crystal ball, the hot run can proceed with little fear of getting caught.

            * For the stated goal, of course, not the goal of Security Theater for which a magical terrorist detector is great.

            • by Bandman ( 86149 ) <bandman@gm a i l .com> on Tuesday September 23, 2008 @12:11PM (#25123217) Homepage

              Fair warning, you should go trademark the phrase "magical terrorist detector" before I do.

            • And then the DHLS agents can all be Commander Braxtons, zipping through the timelines, arresting or aborting people, or arresting and coitus-interrupting the would-be parents, all stating, like the Vidal Sassoon "and they'll tell two friends, and so on, and so on, and so on..." commercial:

              "I am Commander Braxton, of the DHLS Timeship Aeon. You are being arrested for crimes you WILL commit...", or,

              "I am Commander Braxton, of the DHLS TimePos ICOS. You sex act is being disrupted to delay or prevent arrival of

              • Re: (Score:3, Funny)

                by Idiomatick ( 976696 )

                WTF kind of teenagers STEAL a dead elk from a bunch of guys with guns, no less? I mean, an elk weighs, what, 800lbs? These are some well-prepared kids if they can run off with fresh kills like that. Were they waiting in the woods in camo or something?
                 
                (aside from that I totally agree)

            • Re:sensors... (Score:5, Insightful)

              by db32 ( 862117 ) on Tuesday September 23, 2008 @12:21PM (#25123413) Journal
              Actually, the real purpose is to pull out those kids who are nervous about leaving home for the first time going to college or something. That way they can scare them into not turning into one of those dirty liberal elitist intellectuals that would dare question the authority of the system.
              • Re:sensors... (Score:5, Insightful)

                by mazarin5 ( 309432 ) on Tuesday September 23, 2008 @01:30PM (#25124683) Journal

                Actually, the real purpose is to pull out those kids who are nervous about leaving home for the first time going to college or something. That way they can scare them into not turning into one of those dirty liberal elitist intellectuals that would dare question the authority of the system.

                Because nothing turns a kid into a conservative like a bad run-in with the cops, right?

            • Re:sensors... (Score:5, Insightful)

              by cmr-denver ( 1315039 ) on Tuesday September 23, 2008 @12:25PM (#25123489)
              Actually, better yet, don't tell them it's a dry run ahead of time. Have them go through security to be inside by a specific time. Then call them, and say "It's a go" or "Nevermind, enjoy your trip." After a couple of "Nevermind" runs and not getting pulled over, you should know who to send...
            • Re:sensors... (Score:5, Insightful)

              by ByOhTek ( 1181381 ) on Tuesday September 23, 2008 @12:32PM (#25123623) Journal

              If the terrorists know it's a dry run, then their responses will be different - amongst other things, if they are caught, there will be no evidence, and they'll have deniability.

              Still, I can't see this as having a low false positive rate.

              - Guy going home to his beloved but too-often-left-alone wife is nervous over the obvious.
              - Gal had too much to drink last night and woke up with someone... unusual. Worried about a few things that could really change her life.
              - [insert various nervousness-inducing-mental-conditions-here] sufferer forgot to take his/her medicine.
              - First time flier.

              The list can go on.

              • Re: (Score:3, Insightful)

                by poetmatt ( 793785 )

                It's about as accurate as a lie detector. You know, because we all know lie detectors are so perfect. It's not like people know how to game them or anything. The "you can't hide your true intentions; your body will know" premise is a 100% fallacy and guaranteed not to be accurate.

                I'm disappointed; the gov't is really spending on some stupid shit here.

            • by Chyeld ( 713439 ) <{moc.liamg} {ta} {dleyhc}> on Tuesday September 23, 2008 @12:42PM (#25123819)

              I've stolen this from Cory Doctorow

              Paradox of the false positive
              Statisticians speak of something called the Paradox of the False Positive. Here's how that works: imagine that you've got a disease that strikes one in a million people, and a test for the disease that's 99% accurate. You administer the test to a million people, and it will be positive for around 10,000 of them - because for every hundred people, it will be wrong once (that's what 99% accurate means). Yet, statistically, we know that there's only one infected person in the entire sample. That means that your "99% accurate" test is wrong 9,999 times out of 10,000!

              Terrorism is a lot less common than one in a million and automated "tests" for terrorism - data-mined conclusions drawn from transactions, Oyster cards, bank transfers, travel schedules, etc - are a lot less accurate than 99%. That means practically every person who is branded a terrorist by our data-mining efforts is innocent.

              In other words, in the effort to find the terrorist needles in our haystacks, we're just making much bigger haystacks.

              You don't get to understand the statistics of rare events by intuition. It's something that has to be learned, through formal and informal instruction. If there's one thing the government and our educational institutions could do to keep us safer, it's this: teach us how statistics works. They should drill it into us with the same vigor with which they approached convincing us that property values would rise forever, make it the subject of reality TV shows and infuse every corner of our news and politics with it. Without an adequate grasp of these concepts, no one can ever tell for sure if he or she is safe.

              • Re: (Score:3, Insightful)

                by ichimunki ( 194887 )
                That means that your "99% accurate" test is wrong 9,999 times out of 10,000! [snip] If there's one thing the government and our educational institutions could do to keep us safer, it's this: teach us how statistics works

                They can start by teaching Cory Doctorow how to count. The hypothetical test is wrong 9,999 times out of 1,000,000. Assuming, of course, that the test only produces false positives, and not also false negatives. That's what 99% accurate means.
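                Both counts are easy to verify; the disagreement is only over the denominator (all tests run vs. positive results). A minimal Python sketch of Doctorow's hypothetical one-in-a-million disease and 99%-accurate test:

                ```python
                population = 1_000_000
                infected = 1                # one-in-a-million disease
                false_positive_rate = 0.01  # the 1% error of a "99% accurate" test

                false_positives = (population - infected) * false_positive_rate  # ~9,999 of the 999,999 healthy
                true_positives = infected   # assume the one sick person is caught
                total_positives = false_positives + true_positives

                print(f"Wrong results per test run:  {false_positives / population:.2%}")       # ~1.00% (ichimunki's denominator)
                print(f"Wrong results per positive:  {false_positives / total_positives:.2%}")  # ~99.99% (Doctorow's denominator)
                ```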

            • Re: (Score:3, Interesting)

              by ericfitz ( 59316 )

              A dry run would not work.

              If the attackers knew it was a dry run, then they would not exhibit the signs of stress that the machine detects, therefore all would test negative.

              If the attackers did NOT know it was a dry run, then they must also carry attack devices with them through the screening process, and be at risk of detection of the devices or by an observant screener or secondary screening.

              Plus, they must either carry out the attack, making their future use moot, or have the attack called off at the las

          • Re:sensors... (Score:4, Insightful)

            by ShieldW0lf ( 601553 ) on Tuesday September 23, 2008 @11:49AM (#25122803) Journal
            Are you questioning the judgment of the Miniluv [wikipedia.org]?

            That is a clear case of crimethink [wikipedia.org].

            I'm calling thinkpol [wikipedia.org].
            • Re: (Score:3, Insightful)

              by maxume ( 22995 )

              Did you read a book over the weekend? Did it hurt?

              Orwell got lots of easy stuff right (people like authority...a call for a leader starts with a desire to follow), but he missed the boat on just how easy it has become (and is becoming!) to use computers to not merely threaten to monitor anybody at any time, but to monitor everybody all the time.

              Unfortunately, sarcastic bitching is not the solution.

              • by interstellar_donkey ( 200782 ) <<pathighgate> <at> <hotmail.com>> on Tuesday September 23, 2008 @12:22PM (#25123417) Homepage Journal

                Unfortunately, sarcastic bitching is not the solution.

                No, but it does make it a little easier to handle as the problem gets worse.

              • Re:sensors... (Score:5, Insightful)

                by gstoddart ( 321705 ) on Tuesday September 23, 2008 @12:29PM (#25123575) Homepage

                but he missed the boat on just how easy it has become (and is becoming!) to use computers to not merely threaten to monitor anybody at any time, but to monitor everybody all the time.

                Given that he published it in 1949, he can be forgiven for not foreseeing modern computers.

                In terms of showing how pervasive and evil a surveillance society can be, he's still highly relevant.

                Pointing out just how eerie something like an automated "future crimes" concept is can hardly be called sarcastic bitching -- I'm betting an awful lot of people read that summary and thought "holy crap!!"; I sure as hell did. The sheer idea of being detained or hassled because some computer suggested you might be stressed is nuts. It's scary to think this could give them grounds to act on more than a very cursory level -- I mean, talk about your unreasonable searches. People being told they need to get the rubber glove treatment because some computer program identified them as stressed is lunacy.

                Time was when one would have thought it impossible for the USA to degenerate into a place where this would be happening. Now, it's hard to think of how one would stop it. Spending billions of dollars to make all of the scary stuff in Orwell come true is frightening to some of us.

                Cheers

          • Re:sensors... (Score:5, Insightful)

            by rtb61 ( 674572 ) on Tuesday September 23, 2008 @12:01PM (#25123033) Homepage

            The biggest problem with this: if between 78% and 80% of people told to act suspiciously can fool the system into believing they intend to commit a crime, then logically those same people should be able to act in the opposite fashion and fool the system into believing they don't. I mean really, what do they think the logic of their analysis represents?

            Apparently, excuses for legal pre-emptive arrests of unsavoury people are the new focus, much like the no-fly lists. A list of politically undesirable people who will be arrested, searched, interrogated, and transferred to a prison facility whilst their identities are confirmed (which I am sure will take no longer than 24 to 48 hours). All this will be done at a range of designated choke points, like train and subway stations, and maybe even toll booths.

            Adjust your political alignment, or you will find yourself, your family, and your friends subject to random humiliations, violent arrests, searches including sexual groping, and destruction of private property. Of course you will be released, and it will all be done with a masquerade of legality. I believe some journalists have already experienced exactly this type of pre-emptive arrest at the RNC convention; I don't believe they were particularly impressed with the concept.

          • Re:sensors... (Score:4, Insightful)

            by jd ( 1658 ) <imipak AT yahoo DOT com> on Tuesday September 23, 2008 @12:02PM (#25123043) Homepage Journal
            There isn't necessarily a "fight" scenario. The individuals could very easily train themselves into regarding their acts as no different from a postman delivering a parcel. Assuming, of course, the person IS the one with the hostile intent - if the system is remotely effective, groups could be expected to migrate to unwitting "volunteers". Of course, such systems may be jammable, depending on how they work. It doesn't matter if vulnerabilities appear to be theoretical - organizations that are willing to, well, burn money (literally at extreme temperatures and pressures) are likely to find exploits because a populace deluded into thinking they are safe would logically be easier to manipulate and control by fear.

            It's easy to move heat around, so any simple thermal camera can be tricked into thinking the person looks normal. This is only useful if the camera is simple. The heat has to go somewhere, so you'd see some point being much hotter than expected, but any software designed to reject absurd anomalies would reject such a point as impossible.

            Fooling the facial expression analysis would logically require a course at an acting school, or a few minutes with a bottle of latex and a blow-dryer to create a fake facial skin. Criminals would not require the skill of Hollywood. They would only need to fool automatic face recognition and facial expression recognition software. At worst, they'd also need to fool low-res, low frame-rate CCTV operators at range. Most LARP groups have experience at producing very realistic face masks. Learning from them would produce someone who could (if they wanted to) be totally secure against CCTV systems. Many ethnic profilers could logically be fooled with similar methods.

            As for false positives - anyone who is ill will show higher-than-normal heat, as will anyone who has gone jogging or exercising. Anyone caught in a hot car due to snarled-up roads will be hot and show an angry, hostile expression. Many in New England are permanently in a state of anger. So, in all probability, 90% of all city-dwellers and New Englanders will be classed as potential terrorists. Of course, I've always been somewhat suspicious of Philadelphia cheese, but that seems to be taking the complaint a bit too far.

            • Re: (Score:3, Funny)

              by amRadioHed ( 463061 )

              Many in New England are permanently in a state of anger.

              Seriously? Have you ever been to New England, or do you just read too much Stephen King?

        • Re:sensors... (Score:5, Insightful)

          by Otter ( 3800 ) on Tuesday September 23, 2008 @11:29AM (#25122419) Journal
          Absolutely untrue. Suicide bombers fail as often as they do (in Israel, Iraq, Sri Lanka,...) because they're usually bug-eyed, sweating, twitching, and frequently high. Highly trained operatives might be reliably calm, but the run-of-the-mill terrorist is usually pretty obvious, although they can still often kill people before someone can stop them.
        • As a Marine (Score:3, Insightful)

          by wernox1987 ( 1362105 )
          I never doubted what I was doing was right, or justified, but that didn't stop my heart from pounding out of my chest.
      • It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening.

        Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.

        • Re:sensors... (Score:4, Insightful)

          by Aphoxema ( 1088507 ) * on Tuesday September 23, 2008 @11:30AM (#25122423) Journal

          It's okay since only a few people will get hurt in the process.

        • Re:sensors... (Score:4, Interesting)

          by gnick ( 1211984 ) on Tuesday September 23, 2008 @11:38AM (#25122609) Homepage

          Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.

          It's not that they oppose the development of the technology. It's that they're fed up with privacy invasions and random harassment and see this device as a means of propagating both. Even if this thing threw up 50% correct red-flags, you'd see objections.

          Besides, Big Brother paranoia plays very well here - Especially when it's accurate.

        • Re:sensors... (Score:5, Insightful)

          by Kozar_The_Malignant ( 738483 ) on Tuesday September 23, 2008 @11:50AM (#25122821)

          Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.

          It isn't the idea of developing an immature technology that upsets people. It is our well-justified fear of the government deploying immature technology. I'd rather not be subjected to a public beta-test of a thoughtcrime detector.

        • Re: (Score:3, Insightful)

          by compro01 ( 777531 )

          You assume that the immature technology in question is even based on a workable premise and isn't just a massive pit for money, time, effort, and pain with no hope of producing anything useful.

          They told all the people specifically to "act suspiciously", and the damn thing still failed at detecting them 22% of the time!

      • by TheMeuge ( 645043 ) on Tuesday September 23, 2008 @11:47AM (#25122753)

        So this device was 80% successful at picking up suspicious activity from PEOPLE WHO WERE ASKED TO LOOK SUSPICIOUS.

        Wow, amazing! Something any police officer who has served a couple of years would be able to do with 100% (or nearly so) accuracy.

        What is missing is an assay of how many people it would flag if they were told to behave as if they were SCARED. You know... scared of being flagged for behaving abnormally, strip-searched, tortured, and never seeing their families again. Something tells me that the rate of false positives on this machine will overshadow the rate of false negatives by a very large margin.

      • Re: (Score:3, Informative)

        by Atrox666 ( 957601 )
        As long as you act exactly like everyone else and never have a troubling thought I don't see what the problem is.
    • Re: (Score:3, Funny)

      by nedlohs ( 1335013 )

      It can't be sued for being racist...

    • Re:sensors... (Score:5, Insightful)

      by Otter ( 3800 ) on Tuesday September 23, 2008 @11:22AM (#25122259) Journal

      ...And most importantly, skin colour?

      That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

      • Re:sensors... (Score:4, Insightful)

        by arth1 ( 260657 ) on Tuesday September 23, 2008 @11:29AM (#25122399) Homepage Journal

        That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

        So, who are the non-humans that calibrate the systems?

    • by arth1 ( 260657 )

      Quite frankly, I don't think the DHS is interested in less invasive procedures.

      Anyhow, if this thing can be refined so it accurately detects people intent on deception, it will mean that few politicians or lawyers will ever be able to fly. It'll get nixed, no worries.

  • Err (Score:5, Insightful)

    by InvisblePinkUnicorn ( 1126837 ) on Tuesday September 23, 2008 @11:04AM (#25121831)
    Does this sound idiotic to anyone else? Of course it's going to work for people who are told how to act in order to get the device to flag them.
    • by Joce640k ( 829181 ) on Tuesday September 23, 2008 @11:14AM (#25122065) Homepage

      All we've got is a device which can spot normal people trying to be visibly "suspicious".

      • by HTH NE1 ( 675604 ) on Tuesday September 23, 2008 @11:37AM (#25122569)

        All we've got is a device which can spot normal people trying to be visibly "suspicious".

        Doc Brown: Get yourself some fifties clothes.
        Marty McFly: Check, Doc.
        Doc Brown: Something inconspicuous!

      • the end of liberty (Score:5, Insightful)

        by globaljustin ( 574257 ) on Tuesday September 23, 2008 @12:00PM (#25123015) Journal

        All we've got is a device which can spot normal people trying to be visibly "suspicious".

        You are correct. From TFA:

        Some subjects were told to act shifty, be evasive, deceptive and hostile. And many were detected.

        It is absolutely ridiculous to think that they have produced any kind of test results that would indicate a functioning system. This is government and business at its absolute worst.

        Not only is DHS trying their damnedest to become big brother, they are doing it in the most incompetent way possible.

        This tech will never, ever work. All it can measure is physiological attributes. Correlation is not causation. Just because some percentage of people who are intending to commit a crime have certain physiological characteristics does not mean that anyone with those characteristics is a 'pre-criminal' and should be questioned. I weep for the future.

        And even if, in some far-flung scenario, it did become functional, it would still be illegal. It is an invasion of privacy. Our thoughts and intentions are private. They mean nothing until we act on them. Human thought is vast and unlimited; part of our nature is boiling down the infinite array of ideas we have into action in the physical world, where there are consequences. Everyone has the right to think whatever they want. When they act on it, then that action enters the territory of having (potentially bad) consequences.

        What this evolves into is thought control and that is the end of liberty.

    • Re:Err (Score:5, Insightful)

      by Yvanhoe ( 564877 ) on Tuesday September 23, 2008 @11:18AM (#25122179) Journal
      If I recall correctly, the last time I traveled to the USA, I had to fill in a form stating that the intent of my travel was not to kill the US president. People who create such forms would probably fund research on a "suspicious person detector"
    • Re: (Score:3, Interesting)

      Does this sound idiotic to anyone else?

      Yep. But this is slashdot. To the powers that be it probably shows "great promise" and, since it is a machine, would be "unbiased."

      All the things it is tagging as "suspicious" could also be explained by a bad phone call just before you come in range. Maybe your wife just called to say she's leaving you for your sister. Again.
    • Re: (Score:3, Funny)

      by gsslay ( 807818 )

      Does this sound idiotic to anyone else?

      Yes indeed it does.

      Testing on my new device starts tomorrow. It has a remarkable 98% accuracy in identifying people told to dress completely in purple and sing "I Love You, You Love Me". Even at a distance. As long as the terrorists play along (and who wouldn't?), we'll win this war on terror in no time. And even if they don't, think of all the Barney impersonators we'll get off the streets. It's an everybody-wins scenario.

    • Re: (Score:3, Insightful)

      by oldspewey ( 1303305 )

      Does this sound idiotic to anyone else?

      Yes, it's completely idiotic. What these geniuses have done has nothing to do with security - they have created a detector of bad amateur acting that boasts ~80% accuracy.

    • Re:Err (Score:5, Interesting)

      by DriedClexler ( 814907 ) on Tuesday September 23, 2008 @12:06PM (#25123129)

      Yes, it does sound idiotic. My reaction was: ROFLcopter at the idea that you can successfully "tell people to act suspicious". Um, if it were possible in the first place for people to notice and control the aspects of themselves that make them look suspicious, others wouldn't be suspicious of those aspects in the first place!

      Think about it: people become suspicious of others based on criteria X,Y,Z because meeting X,Y,Z reveals a higher probability of intent to cause harm. But anybody trying to cause harm will suppress any *controllable* sign that they are trying to cause harm before it's too late to stop. So the only remaining criteria people use in determining whether they'll be suspicious of someone are those that are very difficult if not impossible to control. As a crude example: someone will only look around to see if he's being watched (which looks suspicious) if he's about to do something objectionable (like picking a lock). But he can't suppress that, because then he takes the chance of someone noticing him picking the lock.

      A better test would be to set up a scenario like a line at the airport where the screeners have to keep out dangerous items. Then, have a few of the participants try to smuggle items through, and get a huge reward if they succeed, while the screeners get the reward if smugglers don't succeed. Then, put a time limit on, so the screeners have to be judicious about who they check, so they only check the most suspicious. Oh, and make it double-blind as much as possible. Then, the people trying to smuggle will have the same incentive structure that real smugglers have, and thus will give off all the real-world signs of planning something objectionable.

      But then, that would be too much work.

    • Re:Err (Score:4, Funny)

      by Dahamma ( 304068 ) on Tuesday September 23, 2008 @02:08PM (#25125283)

      Especially since their suggestion for acting suspicious was to wear a top hat, fake moustache, and black cape.

  • by fprintf ( 82740 ) on Tuesday September 23, 2008 @11:04AM (#25121855) Journal

    The summary talks about the subjects being told to act suspicious. So, if you are told to act suspicious, is there really any difference from someone who is actually planning something nasty? I suppose it is difficult to find subjects who are unaware they are being observed, and yet also intent on doing something bad. Nevertheless, I'd hypothesize there might be significant, observable differences between the two groups.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      You will always get these sorts of results with forced actions. If I made a happiness detector (via facial expressions), and told half of the group to smile, and the other half not to, I bet it would pick that up. Now, what if half the group were given personal responsibility toy, and the other half were given a cuddly teddy bear? I bet it wouldn't be accurate anymore...

      A better test would be to give the group water bottles. Most of the group are given real water in bottles. A few of the group are give

    • Re: (Score:3, Interesting)

      Wouldn't "suspicious" also be highly subjective? Many times that's more reflective on the prejudices of the observer. So let's take a programmer who's been up all night trying to solve a problem. He's disheveled, unshaven, and probably unkempt. He's deep in thought and in his own world. He starts talking to himself about the problem. Is he suspicious?

      • Re: (Score:3, Funny)

        by HTH NE1 ( 675604 )

        Wouldn't "suspicious" also be highly subjective? Many times that's more reflective on the prejudices of the observer. So let's take a programmer who's been up all night trying to solve a problem. He's disheveled, unshaven, and probably unkempt. He's deep in thought and in his own world. He starts talking to himself about the problem. Is he suspicious?

        Is he sitting on a park bench? Snot running down his nose, greasy fingers smearing shabby clothes?

  • Not even close (Score:5, Interesting)

    by ShawnCplus ( 1083617 ) <shawncplus@gmail.com> on Tuesday September 23, 2008 @11:05AM (#25121865) Homepage
    Sorry, but 78% is not even REMOTELY accurate enough to consider someone dangerous. There is already a high enough false accusation rate.
    • by pizzach ( 1011925 ) <pizzach@@@gmail...com> on Tuesday September 23, 2008 @11:20AM (#25122209) Homepage

      In other words, 22% of the time it is wrong. Saying it's right 78% of the time is pure and simple market speak.

      The interesting thing about this is that if people started acting suspicious across the board, the numbers would become fudged and mostly meaningless. One way this could be accomplished is by standing around handing out complimentary eye patches, telling people it is Talk Like a Pirate Day.

    • by patio11 ( 857072 ) on Tuesday September 23, 2008 @11:37AM (#25122583)

      Most AIDS tests are 99%+ accurate at telling you that a person with HIV actually has HIV. They're also 99% accurate at saying that a person who doesn't have HIV doesn't have HIV. It's the combination of those two facts plus "very few people in the general population have HIV" which makes mass one-time AIDS screenings a bad idea -- you successfully pull out the one guy in a hundred who has HIV, but you also throw in one negative bystander, and you end up combining 99% accurate with 99% accurate to get 50% accurate.

      There are a heck of a lot fewer terrorists than 1% of the flying public.

      There is a countermeasure, of course -- you use the magic machine not as a definitive test but as a screening mechanism. Know why we aggressively screen high-risk groups for AIDS? Because they're high risk -- if 1 out of every 4 screenees is known to be positive (not hard to reach with some populations), then the 99%/99% math adds up to better than 95%. Better news. (You then independently run a second test before you tell anyone they're positive. Just like you wouldn't immediately shoot anybody the machine said is a terrorist -- you'd just escalate the search, like subjecting them to a patdown or asking for permission to search their bags or what have you.)

      So you could use the magic machine to, say, eliminate 75, 90, 99%, whatever of the search space before you go on to whatever your next level of screening is -- the whole flying rigamarole, for example. Concentrate the same amount of resources on searching 20 people a plane instead of 400. Less hassle for the vast majority of passengers, and less cursory examinations for everyone who does get searched.

      The quick here will notice that this is exactly the mechanism racial profiling works by -- we know a priori that the 3-year-old black kid and the 68-year-old white grandmother are not holding a bomb, ergo we move on to the 20-year-old Saudi, for whom it is merely extraordinarily improbable that he is holding a bomb. That would also let you lop a huge section of the search space off the top.

      The difference between the magic machine and racial profiling is that racial profiling is politically radioactive, but the magic machine might be perceived as neutral. Whether you consider that a good or a bad thing is up to you. Hypothetically assuming that the machine achieves, oh, 80% negative readings for true negatives, many people might consider it an awfully nice thing to have 80% of the plane not have to take off their shoes or get patted down -- they could possibly get screened as non-invasively as having to answer two of those silly, routine questions.

      (Of course, regardless of what we do, people will claim we're racially profiling. But that is a different issue.)
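      A Python sketch of that screening-cascade arithmetic, with made-up numbers (the 80% sensitivity echoes the trial figure; the 80% specificity and the one-in-a-million prior are purely illustrative assumptions):

      ```python
      # How a pre-filter shrinks the search space without improving the posterior much.
      travelers = 400      # one plane's worth of passengers
      p_threat = 1e-6      # prior probability a given traveler is a threat (assumed)
      sensitivity = 0.80   # machine flags a real threat this often (trial figure)
      specificity = 0.80   # machine clears an innocent this often (assumed)

      flagged_innocent = travelers * (1 - p_threat) * (1 - specificity)
      flagged_threat = travelers * p_threat * sensitivity
      flagged = flagged_innocent + flagged_threat

      print(f"Sent to secondary screening: ~{flagged:.0f} of {travelers}")  # ~80 of 400
      print(f"P(threat | flagged): {flagged_threat / flagged:.6%}")         # ~0.000400%
      ```

      Resources get concentrated on a fifth of the plane, but essentially everyone pulled aside is still innocent -- which is why the independent second stage matters, rather than treating a flag as proof.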

    • Re:Not even close (Score:5, Interesting)

      by JustinOpinion ( 1246824 ) on Tuesday September 23, 2008 @11:46AM (#25122739)
      Cory Doctorow described it nicely in his recent book "Little Brother" (free download available [craphound.com]):

      If you ever decide to do something as stupid as build an automatic terrorism detector, here's a math lesson you need to learn first. It's called "the paradox of the false positive," and it's a doozy.

      Say you have a new disease, called Super-AIDS. Only one in a million people gets Super-AIDS. You develop a test for Super-AIDS that's 99 percent accurate. I mean, 99 percent of the time, it gives the correct result -- true if the subject is infected, and false if the subject is healthy. You give the test to a million people.

      One in a million people have Super-AIDS. One in a hundred people that you test will generate a "false positive" -- the test will say he has Super-AIDS even though he doesn't. That's what "99 percent accurate" means: one percent wrong.

      What's one percent of one million?

      1,000,000/100 = 10,000

      One in a million people has Super-AIDS. If you test a million random people, you'll probably only find one case of real Super-AIDS. But your test won't identify one person as having Super-AIDS. It will identify 10,000 people as having it.

      Your 99 percent accurate test will perform with 99.99 percent inaccuracy.

      That's the paradox of the false positive. When you try to find something really rare, your test's accuracy has to match the rarity of the thing you're looking for. If you're trying to point at a single pixel on your screen, a sharp pencil is a good pointer: the pencil-tip is a lot smaller (more accurate) than the pixels. But a pencil-tip is no good at pointing at a single atom in your screen. For that, you need a pointer -- a test -- that's one atom wide or less at the tip.

      This is the paradox of the false positive, and here's how it applies to terrorism:

      Terrorists are really rare. In a city of twenty million like New York, there might be one or two terrorists. Maybe ten of them at the outside. 10/20,000,000 = 0.00005 percent. One twenty-thousandth of a percent.

      That's pretty rare all right. Now, say you've got some software that can sift through all the bank-records, or toll-pass records, or public transit records, or phone-call records in the city and catch terrorists 99 percent of the time.

      In a pool of twenty million people, a 99 percent accurate test will identify two hundred thousand people as being terrorists. But only ten of them are terrorists. To catch ten bad guys, you have to haul in and investigate two hundred thousand innocent people.

      Guess what? Terrorism tests aren't anywhere close to 99 percent accurate. More like 60 percent accurate. Even 40 percent accurate, sometimes.

      What this all meant was that the Department of Homeland Security had set itself up to fail badly. They were trying to spot incredibly rare events -- a person is a terrorist -- with inaccurate systems.

  • by AioKits ( 1235070 ) on Tuesday September 23, 2008 @11:05AM (#25121867)
    In other news today, Homeland Security has detained the entire Chili Cook-off Carnival event after their new FAST software registered positive hits on EVERYTHING there, including some domesticated animals and a squirrel with three legs.
  • Doesn't matter (Score:5, Insightful)

    by MadMidnightBomber ( 894759 ) on Tuesday September 23, 2008 @11:05AM (#25121873)

    "In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."

    None of that matters - what's important is the false positive rate, i.e. the proportion of people with no malicious intent who get flagged up. If it's as high as 1%, the system will be pretty much unworkable.

  • Really? (Score:4, Insightful)

    by gehrehmee ( 16338 ) on Tuesday September 23, 2008 @11:05AM (#25121875) Homepage

    In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,

    Isn't this a little off-base? People who are really about to commit a crime, as a rule, will be explicitly trying not to look suspicious.

  • Additional Locations (Score:3, Interesting)

    by UncleWilly ( 1128141 ) * <<UncleWilly07> <at> <gmail.com>> on Tuesday September 23, 2008 @11:05AM (#25121885)

    I propose the House, Senate and White House also.

  • I think someone has been watching Minority Report just a bit too closely. I can just see it now... the 'Pre-Crime' Division of the DHS.
  • Things like these are only as good as their success rate. If they get a whole lot of false positives, then they're going to be worth squat when it actually comes down to hard evidence.

    Then again, perhaps they might be useful as a general indicator of "mal-intent". Not as a method of proof, but just a way of optimising the job of certain DHS officials.

  • So if I'm running and about to lie to my trainer or doctor about how far I ran today, my pulse rate, breathing rate, and body temperature are up. I'm thinking about deceiving someone. So I guess that means it's now a crime to lie to your trainer according to the DHS?

  • Were the 'positive' participants in the test told to "act suspicious" by carrying a radio transponder on their person?

    • Were the 'positive' participants in the test told to "act suspicious" by carrying a radio transponder on their person?

      Nope, only 78% of them were told to carry a radio transponder. Didn't you RTFS?

  • by bigtallmofo ( 695287 ) * on Tuesday September 23, 2008 @11:06AM (#25121911)
    I was just about to finish up my patent application for a device that could accurately detect a human pretending to be a monkey 80% of the time when a human test subject is asked in advance to pretend to be a monkey.

    Why do I even bother?
  • Those told to act suspicious? WTF, did they give them Groucho Marx glasses? And a 20% false negative rate on that.

    IMHO, every person involved with this project should be summarily fired, up to and including the Department Head.

    • by Chris Burke ( 6130 ) on Tuesday September 23, 2008 @11:32AM (#25122475) Homepage

      Just an FYI: the accuracy number doesn't directly tell you the ratio of false negatives. It's a measure not just of how many true positives it gets (that's the sensitivity), but also of true negatives (that's the specificity), in that it should both identify the "suspicious" correctly and correctly identify the non-"suspicious".

      You can't go from the accuracy directly to the specificity and sensitivity, since it's a combination of several measurements. The result, though, will be highly dependent on the prevalence of "suspicious" people in their test, i.e. how often the thing you're trying to detect actually occurs.

      I'm willing to bet that the prevalence they used in their testing is way, way higher than it would be in real life (like 1/4 to 1/2 of the test subjects were "suspicious", while in real life the odds of a random person in an airport being a terrorist are more like 1/1e6 on a bad day). So this would skew the accuracy measurement towards detecting the suspicious and understate the importance of correctly figuring out that someone is not suspicious. The problem is that when you're dealing with something very rare, even if your specificity is very high, the odds that someone you pull out of line because the machine flagged them is in fact innocent are extremely high (it's going to be over a 99% chance unless this machine is -very- specific), and if your test methodology doesn't worry as much about specificity, then it's going to be even worse.
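      To make that concrete, here's a Python sketch of how the same machine can post ~80% accuracy in a lab trial and still be useless in the field (the 78% sensitivity and 80% specificity are assumptions loosely matched to the reported figures; the one-in-a-million field prevalence is illustrative):

      ```python
      def accuracy(sens, spec, prevalence):
          """Overall accuracy is a prevalence-weighted mix of sensitivity and specificity."""
          return sens * prevalence + spec * (1 - prevalence)

      sens, spec = 0.78, 0.80  # assumed, loosely matching the reported ~78-80%

      # Lab trial: half the subjects told to "act suspicious"
      print(f"Lab accuracy:   {accuracy(sens, spec, 0.5):.1%}")   # 79.0%

      # Field: ~1 in a million travelers is a real threat
      prev = 1e-6
      print(f"Field accuracy: {accuracy(sens, spec, prev):.1%}")  # ~80.0%, dominated by specificity

      # The number that matters to the person pulled out of line:
      ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
      print(f"P(actual threat | flagged): {ppv:.6%}")             # ~0.00039% -- nearly every flag is innocent
      ```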

  • by Minwee ( 522556 ) <dcr@neverwhen.org> on Tuesday September 23, 2008 @11:09AM (#25121971) Homepage

    All you need to do now is post signs reminding any potential evil-doers to "act suspicious" and the system will work perfectly.

  • by tjstork ( 137384 ) <todd@bandrowsky.gmail@com> on Tuesday September 23, 2008 @11:12AM (#25122027) Homepage Journal

    If everyone were wearing a burka, there's no way this system could actually work. It may seem strange, but what right does the public have to know my face?

    • Re: (Score:3, Insightful)

      by clone53421 ( 1310749 )

      what right does the public have to know my face?

      The same "right" that you have to ride an airplane.

      what right does the airline have to know my face?

      Well, last time I checked they still required all passengers to prove their identity before letting them on an airplane.

  • by mveloso ( 325617 ) on Tuesday September 23, 2008 @11:14AM (#25122075)

    The device relies on the assumption that the physiology of people up to no good may be different from that of normal people.

    And that may be true.

    However, this'll be much more useful somewhere like an embassy or checkpoint than in an airport. In a sea of potentially hostile people, it's harder to pick out the ones who may actually do something. In a sea of basically docile people, it should be relatively simple to visually pick out the nervous ones.

    • Re: (Score:3, Interesting)

      by zotz ( 3951 )

      HEY! You have something here...

      So they are going to have to make flying a pleasant experience again if they hope to have this system work! Wow. Now that is going to be a tall order.

      all the best,

      drew

  • by JamesP ( 688957 ) on Tuesday September 23, 2008 @11:15AM (#25122091)

    If it helps nail Tom Cruise...

  • by dbyte ( 1207524 ) on Tuesday September 23, 2008 @11:15AM (#25122101)
    God help the nervous flier :)
  • hmm (Score:3, Insightful)

    by lucky130 ( 267588 ) on Tuesday September 23, 2008 @11:18AM (#25122171)

    I notice both of those success rates are less than 100%. Personally, I don't want to be one of those innocent 20+% that gets harassed.

  • by theverylastperson ( 1208224 ) on Tuesday September 23, 2008 @11:32AM (#25122469) Homepage
    Awesome, now we have a great tool to accuse people with. How can anything with an accuracy of 78% be worth using? On a grading scale it's a C+. How many innocent people (22%) will be caught up in this mess? If the government is trying to create a rebellion by the people, then this is a perfect method.

    How about hiring intelligent guards? Or people with common sense?

    If we spent 10% of what we spend on this kind of crap on actually solving the real problems we face, then we might actually get somewhere. But as long as we live in this ultra-paranoid world filled with invisible terrorists, we'll never get the chance to overcome the real problems. What a shame and what a waste.
  • Absurdities (Score:5, Insightful)

    by mlwmohawk ( 801821 ) on Tuesday September 23, 2008 @11:57AM (#25122953)

    We lose more people to premature death each and every year because we have no health care than we have lost to terrorism in the whole of the 21st century.

    fear, fear, fear, be afraid, fear, fear, be afraid.

    A young girl wearing a proto-board with blinking LEDs could have been shot dead because of the hysteria.

    fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid.

    You can't say we have nothing to fear, but we have a lot of real and pressing things that need to be focused upon.

    fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid. Threat level purple.

    The U.S.A. has to re-grow its spine. We have nothing to fear but fear itself. Unfortunately, the current powers that be like to rule by exploiting and enhancing the terror of terrorists.

    • Sociopolitical fear is a strategy to push the population to the political right.

      The old saw about a conservative being a liberal who's been mugged holds true; all you have to do is mug their minds and they'll cave in.

      It's a sleight of mind in risk assessment: the real risks are automobiles, heart disease (i.e. a botched food system), botched health care, botched education, natural disasters, and crime/poverty. Well, everyday accidents too, but that's just natural selection. Terrorism is about as much of a r

  • by kent_eh ( 543303 ) on Tuesday September 23, 2008 @01:21PM (#25124473)
    When I am bored (standing in an endless lineup, waiting for a delayed flight, etc) I often look at my surroundings.
    I used to install video equipment, so I look at the installed video monitors and cameras.
    Is noticing security cameras (and the quality of their installation) in an area suspicious?

    I am a model railroader.
    Is it suspicious that I take pictures of trains and their environment so that I can build more accurate models?

    I studied architecture for a time.
    Is it suspicious that I spend a lot of time looking at (and sometimes photographing) interesting buildings?

    Am I acting suspicious when I notice a guard of some sort watching me doing the above, and that I am curious as to how he might react to my perfectly harmless activities in these highly paranoid times?
