Homeland Security Department Testing "Pre-Crime" Detector 580
holy_calamity writes "New Scientist reports that the Department of Homeland Security recently tested something called Future Attribute Screening Technologies (FAST) — a battery of sensors that determine whether someone is a security threat from a distance. Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance. In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."
sensors... (Score:5, Insightful)
Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance
...And most importantly, skin colour?
Seriously, is there anything a device like this can do that's either more useful or less invasive than a human watching people walking past and profiling/screening them on what they can see?
Re:sensors... (Score:5, Insightful)
Re:sensors... (Score:5, Insightful)
Good point. A real terrorist doesn't show signs of distress, because he doesn't consider his actions immoral. He thinks killing IS the moral thing to do.
Re:sensors... (Score:5, Insightful)
The real problem with this is that the number of wrongdoers is small while the pool for false positives is huge. If 5% of people have some intent that should be picked up by this, the ones it catches amount to about 4% of everyone screened. At that rate, they'd have to have less than a 5% rate of false positives just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?
And that's assuming that 1/20 people have some intent that would need to be picked up by this, while the actual rate is almost certainly smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.
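The arithmetic above can be sketched in a few lines (the 5% prevalence, the 78% detection rate, and the false-positive rate are all illustrative numbers for the argument, not anything from the DHS trial):

```python
# Back-of-envelope: what fraction of the people the detector flags
# actually have ill intent?  All inputs are illustrative.
def precision(prevalence, sensitivity, false_positive_rate):
    """Fraction of flagged people who are true positives."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# 5% of people have ill intent and the machine catches 78% of them;
# even at a (generous) 5% false-positive rate, fewer than half of
# the people it flags are actually guilty.
p = precision(prevalence=0.05, sensitivity=0.78, false_positive_rate=0.05)
```

Drop the prevalence to anything like a realistic figure and the fraction of correct flags collapses toward zero.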
Re:sensors... (Score:5, Insightful)
The real problem with this is that the number of wrongdoers is small while the pool for false positives is huge. If 5% of people have some intent that should be picked up by this, the ones it catches amount to about 4% of everyone screened. At that rate, they'd have to have less than a 5% rate of false positives just to reach the point where half the people it says have ill intent actually do. What are the chances that it's going to have a false positive rate less than 5%?
And that's assuming that 1/20 people have some intent that would need to be picked up by this, while the actual rate is almost certainly smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.
You've hit that on the head. About 200,000 people go through Chicago O'Hare, just that single (though large) airport, every day. And so far, zero terrorist attacks launched out of O'Hare. The odds of a person this machine flags being innocent are ridiculously high, even if it has high specificity.
Also, aside from the raw statistics of the thing, there's another compounding factor that makes this even more useless*: it's rather simple for terrorists to game the system with dry runs.
Terrorist organizations already tend to use people not on our radar for attacks, so if they get pulled out of line on a dry run, we won't have anything on them and it'll look like yet another false positive. Our young jihadi goes through the line with a bunch of his buddies, and anyone who gets pulled out of line sits out the next trip. Once you've discovered the group of people who aren't detected by the terrorist detector/profilers/crystal ball, the hot run can proceed with little fear of getting caught.
* For the stated goal, of course, not the goal of Security Theater for which a magical terrorist detector is great.
Re:sensors... (Score:5, Funny)
Fair warning, you should go trademark the phrase "magical terrorist detector" before I do.
Re:sensors... All they need now is a timeship (Score:3, Insightful)
And then the DHLS agents can all be Commander Braxton, zipping through the timelines, arresting or aborting people, or arresting and coitus-interrupting the would-be parents, all stating, like the Vidal Sassoon commercial ("And THEY'LL tell two friends, and so on, and so on, and so on..."):
"I am Commander Braxton, of the DHLS Timeship Aeon. You are being arrested for crimes you WILL commit...", or,
"I am Commander Braxton, of the DHLS TimePos ICOS. Your sex act is being disrupted to delay or prevent arrival of
Re: (Score:3, Funny)
What the hell kind of teenagers STEAL a dead elk, from a bunch of guys with guns no less? I mean, an elk weighs what, 800 lbs? These are some well-prepared kids if they can run off with fresh kills like that. Were they waiting in the woods in camo or something?
(Aside from that, I totally agree.)
Re:sensors... (Score:5, Insightful)
Re:sensors... (Score:5, Insightful)
Actually, the real purpose is to pull out those kids who are nervous about leaving home for the first time going to college or something. That way they can scare them into not turning into one of those dirty liberal elitist intellectuals that would dare question the authority of the system.
Because nothing turns a kid into a conservative like a bad run-in with the cops, right?
Re:sensors... (Score:5, Insightful)
Re:sensors... (Score:5, Insightful)
If the terrorists know it's a dry run, then their responses will be different - amongst other things, if they are caught, there will be no evidence or deniability.
Still, I can't see this as having a low false positive rate.
- Guy going home to his beloved but too-oft-left-alone wife is nervous over the obvious.
- Gal had too much to drink last night and woke up with someone... unusual. Worried about a few things that could really change her life.
- [insert various nervousness-inducing-mental-conditions-here] sufferer forgot to take his/her medicine.
- First time flier.
The list can go on.
Re: (Score:3, Insightful)
It's about as accurate as a lie detector. You know, because we all know lie detectors are so perfect. It's not like people know how to game them or anything. The "you can't hide your true intentions; your body will know" part is a 100% fallacy and guaranteed to not be accurate.
I'm disappointed; that's gov't money being spent on some really stupid shit right there.
The Paradox of the False Positive (Score:5, Insightful)
I've stolen this from Cory Doctorow
Re: (Score:3, Insightful)
They can start by teaching Cory Doctorow how to count. The hypothetical test is wrong 9,999 times out of 1,000,000. Assuming, of course, that the test only produces false positives, and not also false negatives. That's what 99% accurate means.
Re:The Paradox of the False Positive (Score:5, Insightful)
Out of the 10,000 people indicated as having the disease, only one did. If the purpose of the test is to find those with the disease, then it's wrong 9,999 times out of 10,000 when it reports someone has it.
Our lovely machine that is currently 78% accurate on 'mal-intent' (sic) detection is going to incorrectly tag as many as 22 people out of every 100 as having mal-intent. With the gp's quoted figure of 200,000 people traveling through O'Hare every day, that means potentially 44,000 people a day incorrectly tagged as terrorists. Not one of them actually a terrorist, just someone caught as a false positive.
One airport. One day. 44,000 people whose lives have just been screwed over in some manner. And no guarantee that the one terrorist who might show up in every billion people is going to be caught by the machine.
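The disease arithmetic in this subthread is worth spelling out (one case per million, and a test that is right 99% of the time on both the sick and the healthy; these are Doctorow's hypothetical figures, not real ones):

```python
# The rare-disease hypothetical: 1 person in 1,000,000 has the disease,
# and the test is 99% accurate in both directions.
population = 1_000_000
sick = 1
healthy = population - sick

true_positives = round(sick * 0.99)       # the one sick person, caught
false_positives = round(healthy * 0.01)   # ~10,000 healthy people flagged
flagged = true_positives + false_positives

# Fraction of positive results that are wrong:
wrong_fraction = false_positives / flagged
```

Of the roughly 10,001 people the test accuses, all but one are healthy, which is where the "wrong 9,999 times out of 10,000 positives" figure comes from.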
Re: (Score:3, Insightful)
With the gp's quoted figure of 200,000 people traveling through O'Hare every day, that means potentially 44,000 people a day incorrectly tagged as terrorists. Not one of them actually a terrorist, just someone caught as a false positive.
One airport. One day. 44,000 people whose lives have just been screwed over in some manner. And no guarantee that the one terrorist who might show up in every billion people is going to be caught by the machine.
Right, and since it's obviously impossible to do a truly t
Re: (Score:3, Interesting)
A dry run would not work.
If the attackers knew it was a dry run, then they would not exhibit the signs of stress that the machine detects, therefore all would test negative.
If the attackers did NOT know it was a dry run, then they must also carry attack devices with them through the screening process, and be at risk of detection of the devices or by an observant screener or secondary screening.
Plus, they must either carry out the attack, making their future use moot, or have the attack called off at the las
Re:sensors... (Score:4, Insightful)
That is a clear case of crimethink [wikipedia.org].
I'm calling thinkpol [wikipedia.org].
Re: (Score:3, Insightful)
Did you read a book over the weekend? Did it hurt?
Orwell got lots of easy stuff right (people like authority...a call for a leader starts with a desire to follow), but he missed the boat on just how easy it has become (and is becoming!) to use computers to not merely threaten to monitor anybody at any time, but to monitor everybody all the time.
Unfortunately, sarcastic bitching is not the solution.
Re:sensors... (Score:5, Funny)
Unfortunately, sarcastic bitching is not the solution.
No, but it does make it a little easier to handle as the problem gets worse.
Re:sensors... (Score:5, Insightful)
Given that he published it in 1949, he can be forgiven for not foreseeing modern computers.
In terms of showing how pervasive and evil a surveillance society can be, he's still highly relevant.
Pointing out just how eerie something like an automated "future crimes" concept is, is hardly just sarcastic bitching -- I'm betting an awful lot of people read that summary and thought "holy crap!!". I sure as hell did. The sheer idea of being detained or hassled because some computer suggested you might be stressed is nuts. It's scary to think this could give them any grounds to act beyond a very cursory level -- I mean, talk about your unreasonable searches. Telling people they need to get the rubber glove treatment because some computer program identified them as stressed is lunacy.
Time was when one would have thought it impossible for the USA to degenerate into a place where this would be happening. Now, it's hard to think of how one would stop it. Spending billions of dollars to make all of the scary stuff in Orwell come true is frightening to some of us.
Cheers
Re: (Score:3, Insightful)
Isn't that a bit circular?
FWIW, I'm glad she exists. We need more voices that are not afraid of pointing out that the Friedman meme -- that laissez-faire capitalism spreads human freedom -- may not be accurate. It was heresy up until just a few years ago to question that popular opinion. Anything that upsets the true believers is fine with me (btw, I'm not a fan of Klein's).
Re: (Score:3, Insightful)
Re:sensors... (Score:5, Insightful)
The biggest problem with this is that if between 78% and 80% of people told to act suspiciously can fool the system into believing they intend to commit a crime, then logically those same people should be able to act in the opposite fashion to fool the system into believing they don't. I mean, really, what do they think the logic of their analysis represents?
Apparently, excuses for legal pre-emptive arrests of unsavoury people are the new focus, much like the no-fly lists. A list of politically undesirable people who will be arrested, searched, interrogated, and transferred to a prison facility whilst their identities are confirmed (which I am sure will take no longer than 24 to 48 hours). All this will be done at a range of designated choke points, like train and subway stations and maybe even toll booths.
Adjust your political alignment or you will find you, your family, and your friends subject to random humiliations, violent arrests, searches including sexual groping, and destruction of private property. Of course you will be released, and it will all be done with a masquerade of legality. I believe some journalists have already experienced exactly this type of pre-emptive arrest at the RNC convention; I don't believe they were particularly impressed with the concept.
Re:sensors... (Score:4, Funny)
Great. So now every time I return from a business trip to Thailand where I had relations with young men of questionable age, and I call my wife from the customs line the machine will catch my guilty face and my increased heart rate from trying to pass a lie off to her. And I'll be stuck in the airport for a good six hours under arrest.
Welp, those "Business Trips" to Thailand are over.
Re:sensors... (Score:4, Insightful)
It's easy to move heat around, so a simple thermal camera can be tricked into thinking the person looks normal -- but this only works if the camera is simple. The heat has to go somewhere, so you'd see some point being much hotter than expected, but any software designed to reject absurd anomalies would dismiss such a point as impossible.
Fooling facial-expression analysis would logically require a course at an acting school, or a few minutes with a bottle of latex and a blow-drier to create fake facial skin. Criminals would not require the skill of Hollywood: they would only need to fool automatic face recognition and facial-expression recognition software. At worst, they'd also need to fool low-res, low-frame-rate CCTV operators at range. Most LARP groups have experience producing very realistic face masks; learning from them would produce someone who could (if they wanted to) be totally secure against CCTV systems. Many ethnic profilers could logically be fooled with similar methods.
As for false positives -- anyone who is ill will show higher-than-normal heat, as will anyone who has gone jogging or exercising. Anyone caught in a hot car due to snarled-up roads will be hot and show an angry, hostile expression. Many in New England are permanently in a state of anger. So, in all probability, 90% of all city-dwellers and New Englanders will be classed as potential terrorists. Of course, I've always been somewhat suspicious of Philadelphia cheese, but that seems to be taking the complaint a bit too far.
Re: (Score:3, Funny)
Seriously? Have you ever been to New England, or do you just read too much Stephen King?
Re: (Score:3, Interesting)
Re:sensors... (Score:5, Insightful)
but using this to help narrow who to watch would be what this should be used for.
I can't disagree more strongly. When the flood of false positives starts coming in, they'll quickly start dismissing them. As another poster pointed out, Chicago O'Hare alone has 200,000 people go through it every day; when several thousand of them are flagged as suspicious, you can bet that security will stop caring pretty quickly.
Re:sensors... (Score:5, Funny)
I'll give that a shot...
Re:sensors... (Score:5, Insightful)
Re:sensors... (Score:5, Funny)
because they're usually bug-eyed, sweating, twitching, and frequently high
Based on that alone they would be catching a lot of nerds out on the first date too.
As a Marine (Score:3, Insightful)
Re: (Score:3, Funny)
Uhm, he could have been serving coffee for senior officers his entire tour of duty, what do you know...
Or does assisting "murderers" make one a murderer as well? By that definition I think all of us are murderers.
Re: (Score:3, Interesting)
Particularly, a religious fanatic will be in a state of peace and righteousness-filled euphoria because he is finally "fulfilling his destiny" in life and just hours away from being rewarded by his God for being a faithful "Holy Warrior".
I've got to disagree there. I don't want to praise the machine - This thing is nuts. And I agree that, just before detonation, a fanatic may experience a sense of euphoric peace. But, when going through security, it's a toss up between beautiful martyrdom and failure resulting in a good long stretch in Guantanamo Bay being questioned unmercifully by the infidels. A good lot of training may help them deal with that stress. And their faith may provide them with confidence that their gods wouldn't allow t
Re: (Score:3, Funny)
Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.
Re:sensors... (Score:4, Insightful)
It's okay since only a few people will get hurt in the process.
Re: (Score:3, Informative)
Hey, you should register, you keep talking like that and you'll have karma out the ass!
Re:sensors... (Score:4, Interesting)
Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.
It's not that they oppose the development of the technology. It's that they're fed up with privacy invasions and random harassment and see this device as a means of propagating both. Even if this thing threw up 50% correct red-flags, you'd see objections.
Besides, Big Brother paranoia plays very well here - Especially when it's accurate.
Re:sensors... (Score:5, Insightful)
Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.
It isn't the idea of developing an immature technology that upsets people. It is our well-justified fear of the government deploying immature technology. I'd rather not be subjected to a public beta-test of a thoughtcrime detector.
Re:sensors... (Score:5, Insightful)
Exactly. Pair this up with the red light cameras, and you've got enough income to drive any city out of recession.
"I didn't run that red light"
"No, but you wanted to"
Re: (Score:3, Insightful)
You assume that the immature technology in question is even based on a workable premise and isn't just a massive pit for money, time, effort, and pain with no hope of producing anything useful.
They told all the people specifically to "act suspiciously", and the damn thing still failed at detecting them 22% of the time!
More bad statistics (Score:5, Insightful)
So this device was 80% successful at picking up suspicious activity from PEOPLE WHO WERE ASKED TO LOOK SUSPICIOUS.
Wow, amazing! Something any police officer who has served a couple of years would be able to do with 100% (or nearly so) accuracy.
What is missing is an assay of how many people it would flag if they were told to behave as if they were SCARED. You know... scared of being flagged for behaving abnormally, strip-searched, tortured, and never seeing their families again. Something tells me that the rate of false positives on this machine will overshadow the rate of false negatives by a very large margin.
Re: (Score:3, Informative)
Re: (Score:3)
Re: (Score:3, Funny)
It can't be sued for being racist...
Re:sensors... (Score:5, Insightful)
That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.
Re:sensors... (Score:4, Insightful)
So, who are the non-humans that calibrate the systems?
Re: (Score:3, Interesting)
Even known terrorist groups are now using "non-traditional" people as attackers, so either positive (i.e., "you look like a terrorist") or negative ("you don't look like a terrorist") profiling will cause too many false positives and negatives.
Second, it wouldn't be surprising to see people who aren't part of the "traditional" terrorist groups performing acts of terror for reasons unrelated to the political goals of groups like al-Qaida. In the US, it might be one of the "militias", while in Germany it mig
Re: (Score:3)
Quite frankly, I don't think that DHS are for less invasive procedures.
Anyhow, if this thing can be refined so it accurately detects people intent on deception, it will mean that few politicians or lawyers ever will be able to fly. It'll get nixed, no worries.
Re: (Score:3, Interesting)
A better name than 'FAST' would be 'cattle-control'.
Project Sheepdog.
Err (Score:5, Insightful)
My first thought, too... (Score:5, Insightful)
All we've got is a device which can spot normal people trying to be visibly "suspicious".
Re:My first thought, too... (Score:4, Funny)
All we've got is a device which can spot normal people trying to be visibly "suspicious".
Doc Brown: Get yourself some fifties clothes.
Marty McFly: Check, Doc.
Doc Brown: Something inconspicuous!
the end of liberty (Score:5, Insightful)
You are correct. From TFA:
It is absolutely ridiculous to think that they have produced any kind of test results that would indicate a functioning system. This is government and business at its absolute worst.
Not only is DHS trying their damnedest to become big brother, they are doing it in the most incompetent way possible.
This tech will never, ever work. All it can measure is physiological attributes. Correlation is not causation. Just because some percentage of people who are intending to commit a crime have certain physiological characteristics does not mean that anyone with those characteristics is a 'pre-criminal' and should be questioned. I weep for the future.
And even if, in some far-flung scenario, it did become functional it would still be illegal. It is invasion of privacy. Our thoughts and intentions are private. They mean nothing until we act on them. Human thought is vast and unlimited, part of our nature is boiling down the infinite array of ideas we have into action in the physical world where there are consequences. Everyone has the right to think whatever they want. When they act on it, then that action enters the territory of having (potentially bad) consequences.
What this evolves into is thought control and that is the end of liberty.
Re:Err (Score:5, Insightful)
Re: (Score:3, Interesting)
Yep. But this is slashdot. To the powers that be it probably shows "great promise" and, since it is a machine, would be "unbiased."
All the things it tags as "suspicious" could also be explained by a bad phone call just before you come in range. Maybe your wife just called to say she's leaving you for your sister. Again.
Re: (Score:3, Funny)
Does this sound idiotic to anyone else?
Yes indeed it does.
Testing on my new device starts tomorrow. It has a remarkable 98% accuracy in identifying people told to dress completely in purple and sing "I Love You, You Love Me". Even at a distance. As long as the terrorists play along (and who wouldn't?) we'll win this war on terror in no time. And even if they don't, think of all the Barney impersonators we'll get off the streets. It's an everybody-wins scenario.
Re: (Score:3, Insightful)
Does this sound idiotic to anyone else?
Yes, it's completely idiotic. What these geniuses have done has nothing to do with security - they have created a bad, amateur acting detector that boasts ~80% accuracy.
Re:Err (Score:5, Interesting)
Yes, it does sound idiotic. My reaction was: ROFLcopter at the idea that you can successfully "tell people to act suspicious". Um, if it were possible in the first place for people to notice and control the aspects of themselves that make them look suspicious, others wouldn't be suspicious of those aspects in the first place!
Think about it: people become suspicious of others based on criteria X,Y,Z because meeting X,Y,Z reveals a higher probability of intent to cause harm. But anybody trying to cause harm will suppress any *controllable* sign that they are trying to cause harm before it's too late to stop. So the only remaining criteria people use in determining whether they'll be suspicious of someone are those that are very difficult if not impossible to control. As a bad example: someone will only look around to see if he's being watched (which looks suspicious) if he's about to do something objectionable (like picking a lock). But he can't suppress that, because then he takes the chance of someone noticing him picking the lock.
A better test would be to set up a scenario like a line at the airport where the screeners have to keep out dangerous items. Then, have a few of the participants try to smuggle items through, and get a huge reward if they succeed, while the screeners get the reward if smugglers don't succeed. Then, put a time limit on, so the screeners have to be judicious about who they check, so they only check the most suspicious. Oh, and make it double-blind as much as possible. Then, the people trying to smuggle will have the same incentive structure that real smugglers have, and thus will give off all the real-world signs of planning something objectionable.
But then, that would be too much work.
Re:Err (Score:4, Funny)
Especially since their suggestion for acting suspicious was to wear a top hat, fake moustache, and black cape.
Re:Err (Score:4, Insightful)
What do those 78% and 80% mean, you ask? Let's look at The Fine Article:
Answer: it's a bad acting detector.
Seriously, a better test would be to ask test subjects to do something relevant such as, say, defeat the detector (duh!). If the subject fails, something unpleasant, yet harmless, will happen; a device that emits a startling noise and perhaps belches some smelly smoke. Imagine a grown-up version of the game Operation [youtube.com] (I hate that game). Better yet, have the subject carry the device on their person. The nature of the device would be demonstrated to the subject beforehand, just as a domestic animal is allowed to experience the shock from an electric fence to establish the proper respect for the deterrent.
I'm getting nervous just describing the damn thing.
"Told to act suspicious"? (Score:5, Insightful)
The summary talks about the subjects being told to act suspicious. So, if you are told to act suspicious, does this make any difference from someone who is actually planning something nasty? I suppose it is difficult to find subjects who are unaware they are being observed and yet also intent on doing something bad. Nevertheless, I'd hypothesize there might be significant, observable differences between the two groups.
Re: (Score:3, Insightful)
You will always get these sorts of results with forced actions. If I made a happiness detector (via facial expressions), and told half of the group to smile, and the other half not to, I bet it would pick that up. Now, what if half the group were given personal responsibility toy, and the other half were given a cuddly teddy bear? I bet it wouldn't be accurate anymore...
A better test would be to give the group water bottles. Most of the group are given real water in bottles. A few of the group are give
Re: (Score:3, Interesting)
Wouldn't "suspicious" also be highly subjective? Many times that's more reflective on the prejudices of the observer. So let's take a programmer who's been up all night trying to solve a problem. He's disheveled, unshaven, and probably unkempt. He's deep in thought and in his own world. He starts talking to himself about the problem. Is he suspicious?
Re: (Score:3, Funny)
Wouldn't "suspicious" also be highly subjective? Many times that's more reflective on the prejudices of the observer. So let's take a programmer who's been up all night trying to solve a problem. He's disheveled, unshaven, and probably unkempt. He's deep in thought and in his own world. He starts talking to himself about the problem. Is he suspicious?
Is he sitting on a park bench? Snot running down his nose, greasy fingers smearing shabby clothes?
Not even close (Score:5, Interesting)
Re:Not even close (Score:4, Funny)
In other words, 22% of the time it is wrong. Saying it's right 78% of the time is pure and simple market speak.
The interesting thing about this is that if people started acting suspicious en masse, the numbers would become fudged and mostly meaningless. One way this could be accomplished is by standing around handing out complimentary eye patches, telling people it's Talk Like a Pirate Day.
78% isn't the number you care about (Score:4, Interesting)
Most AIDS tests are 99%+ accurate at telling you that a person with HIV actually has HIV. They're also 99% accurate at saying a person who doesn't have HIV doesn't have HIV. It's the combination of those two facts plus "very few people in the general population have HIV" which makes mass one-time AIDS screenings a bad idea -- you successfully pull out the one guy in 100 who has HIV, then you throw in one negative bystander, and you end up combining 99% accurate with 99% accurate to get 50% accurate.
There are a heck of a lot less terrorists than 1% of the flying public.
There is a countermeasure, of course -- you use the magic machine not as a definitive test but as a screening mechanism. Know why we aggressively screen high risk groups for AIDS? Because they're high risk -- if 1 out of every 4 screenies is known to be positive (not hard to reach with some populations) then the 99%/99% math adds up to better than 95%. Better news. (You then independently run a second test before you tell anyone they're positive. Just like you wouldn't immediately shoot anybody the machine said is a terrorist -- you'd just escalate the search, like subjecting them to a patdown or asking for permission to search their bags or what have you.)
So you could use the magic machine to, say, eliminate 75, 90, 99%, whatever of the search space before you go on to whatever your next level of screening is -- the whole flying rigamarole, for example. Concentrate the same amount of resources on searching 20 people a plane instead of 400. Less hassle for the vast majority of passengers, less cursoriness in all of the examinations.
The quick here will notice that this is exactly the mechanism racial profiling works by -- we know a priori that the 3-year-old black kid and the 68-year-old white grandmother are not holding bombs, ergo we move on to the 20-year-old Saudi for whom it is merely extraordinarily improbable to be holding a bomb. That would also let you lop a huge section of the search space off the top.
The difference between the magic machine and racial profiling is that racial profiling is politically radioactive, but the magic machine might be perceived as neutral. Whether you consider that a good or a bad thing is up to you. Hypothetically assuming that the machine achieves, oh, 80% negative readings for true negatives, many people might consider it an awfully nice thing to have 80% of the plane not have to take off their shoes or get pat down -- they could possibly get screened as non-invasively as having to answer two of those silly, routine questions.
(Of course, regardless of what we do, people will claim we're racially profiling. But that is a different issue.)
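The two base rates in this post can be compared directly (using the post's own 99%/99% test and its 1% and 25% prevalences; these are the post's illustrative numbers, not epidemiological data):

```python
# Positive predictive value: of the people who test positive, what
# fraction really are positive?  99% sensitivity/specificity as in
# the parent post.
def ppv(prevalence, sensitivity=0.99, specificity=0.99):
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

general_population = ppv(0.01)  # ~1% base rate: half the positives are wrong
high_risk_group = ppv(0.25)     # 25% base rate: about 97% of positives are real
```

Same test, and the only thing that changed is the base rate of the screened population -- which is the whole argument for using the machine as a coarse pre-filter rather than a verdict.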
Re:Not even close (Score:5, Interesting)
Facial experessions? (Score:4, Funny)
Doesn't matter (Score:5, Insightful)
None of that matters -- what's important is the false positive rate, i.e. the proportion of people with no malicious intent who get flagged up. If it's as high as 1%, the system will be pretty much unworkable.
Really? (Score:4, Insightful)
Isn't this a little off-base? People who are really about to commit a crime, as a rule, will be explicitly trying not to look suspicious.
Additional Locations (Score:3, Interesting)
I propose the House, Senate and White House also.
Re:Additional Locations (Score:5, Funny)
Minority Report (Score:2, Funny)
Only as good as its success rate (Score:2)
Things like these are only as good as their success rate. If they get a whole lot of false positives, then they're going to be worth squat when it actually comes down to hard evidence.
Then again, perhaps they might be useful as a general indicator of "mal-intent". Not as a method of proof, but just a way of optimising the job of certain DHS officials.
So a jogger who's lying to his trainer... (Score:2)
So if I'm running and about to lie to my trainer or doctor about how far I ran today, my pulse rate, breathing rate, and body temperature are up. I'm thinking about deceiving someone. So I guess that means it's now a crime to lie to your trainer according to the DHS?
Was this like the Missile Defense Shield tests? (Score:2)
Were the 'positive' participants in the test told to "act suspicious" by carrying a radio transponder on their person?
Re: (Score:2)
Nope, only 78% of them were told to carry a radio transponder. Didn't you RTFS?
Government screws private sector again. (Score:5, Funny)
Why do I even bother?
What a bunch of BS (Score:2)
Those told to act suspicious? WTF, did they give them Groucho Marx glasses? And a 20% false negative rate even on that.
IMHO, every person involved with this project should be summarily fired, up to and including the Department Head.
Re:What a bunch of BS (Score:4, Informative)
Just an FYI, the accuracy number doesn't directly tell you the ratio of false negatives. It's a measure not just of how many true positives it gets (that's the sensitivity) but also of true negatives (that's the specificity), in that it should both identify the "suspicious" correctly and correctly identify the non-"suspicious".
You can't go from the accuracy directly to the specificity and sensitivity, since it's a combination of several measurements. The result, though, will be highly dependent on the prevalence of "suspicious" people in their test, which is the ratio of how often what you're trying to detect actually occurs.
I'm willing to bet that the prevalence they used in their testing is way, way higher than it would be in real life (like 1/4 to 1/2 of the test subjects were "suspicious", while in real life the odds of a random person in an airport being a terrorist is more like 1/1e6 on a bad day). So this would skew the accuracy measurement towards detecting the suspicious and understate the importance of figuring out correctly that someone is not suspicious. The problem is that when you're dealing with something very rare, even if your specificity is very high, the odds that someone you pull out of line because the machine flagged them is in fact innocent is extremely high (it's going to be over 99% chance unless this machine is -very- specific), and if your test methodology doesn't worry as much about specificity, then it's going to be even worse.
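The parent's prevalence argument can be made concrete with Bayes' theorem. The sketch below reuses the 78%/80% figures from the summary as stand-ins for sensitivity and specificity (an assumption -- the article doesn't break the numbers down that way) and compares a lab-style 50% prevalence against a field prevalence of one in a million:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: probability that a flagged
    person is a true positive, via Bayes' theorem."""
    tp = sensitivity * prevalence                 # true-positive mass
    fp = (1 - specificity) * (1 - prevalence)     # false-positive mass
    return tp / (tp + fp)

# In the trial: roughly half the subjects were "suspicious"
print(ppv(0.78, 0.80, 0.5))     # ~0.80: looks respectable in the lab

# In the field: same machine, terrorist prevalence ~1 in a million
print(ppv(0.78, 0.80, 1e-6))    # ~4 in a million: vanishingly small
```

Same sensitivity and specificity, wildly different usefulness -- which is exactly why an accuracy figure measured at lab prevalence tells you almost nothing about airport deployment.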
That's brilliant! (Score:5, Funny)
All you need to do now is post signs reminding any potential evil-doers to "act suspicious" and the system will work perfectly.
Fancy that, burkas protect civil rights. (Score:5, Interesting)
If everyone were wearing a burka, there's no way this system would work. It may seem strange, but what right does the public have to know my face?
Re: (Score:3, Insightful)
what right does the public have to know my face?
The same "right" that you have to ride an airplane.
what right does the airline have to know my face?
Well, last time I checked they still required all passengers to prove their identity before letting them on an airplane.
The new polygraph? Maybe not... (Score:3, Interesting)
The device relies on the assumption that the physiology of people up to no good may differ from that of normal people.
And that may be true.
However, this'll be much more useful somewhere like an embassy or checkpoint than in an airport. In a sea of potentially hostile people, it's harder to pick out the ones who may actually do something. In a sea of basically docile people, it should be relatively simple to visually pick the nervous ones.
Re: (Score:3, Interesting)
HEY! You have something here...
So they are going to have to make flying a pleasant experience again if they hope to have this system work! Wow. Now that is going to be a tall order.
all the best,
drew
I'm all for it! (Score:3, Funny)
If it helps nailing Tom Cruise
Will be fun at the airport (Score:4, Insightful)
hmm (Score:3, Insightful)
I notice both of those success rates are less than 100%. Personally, I don't want to be one of those innocent 20+% that gets harassed.
Guilty until innocent (Score:3, Insightful)
How about hiring intelligent guards? Or people with common sense?
If we spent 10% of what we spend on this kind of crap on actually solving the real problems we face, then we might actually get somewhere. But as long as we live in this ultra-paranoid world full of invisible terrorists, we'll never get the chance to overcome the real problems. What a shame and what a waste.
Absurdities (Score:5, Insightful)
We lose more people to premature death each and every year because we have no health care than we have lost to terrorism in the whole of the 21st century.
fear, fear, fear, be afraid, fear, fear, be afraid.
A young girl wearing a proto-board with blinking LEDs could have been shot dead because of the hysteria.
fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid.
You can't say we have nothing to fear, but we have a lot of real and pressing things that need to be focused upon.
fear, fear, fear, be afraid, fear, fear, be afraid. fear, fear, fear, be afraid, fear, fear, be afraid. Threat level purple.
The U.S.A. has to re-grow our spine. We have nothing to fear but fear itself. Unfortunately, the current powers that be like to rule by exploiting and enhancing the terror of terrorists.
It's obvious but bears repeating: (Score:3, Insightful)
Sociopolitical fear is a strategy to push the population to the political right.
The old saw about a conservative being a liberal who's been mugged holds true; all you have to do is mug their minds and they'll cave in.
It's a sleight of mind in risk assessment: the real risks are automobiles, heart disease (i.e. a botched food system), botched health care, botched education, natural disasters, and crime/poverty. Well, everyday accidents too, but that's just natural selection. Terrorism is about as much of a r
What is suspicious looking anyway? (Score:3, Insightful)
I used to install video equipment, so I look at the installed video monitors and cameras.
Is noticing security cameras (and the quality of their installation) in an area suspicious?
I am a model railroader.
Is it suspicious that I take pictures of trains and their environment so that I can build more accurate models?
I studied architecture for a time.
Is it suspicious that I spend a lot of time looking at (and sometimes photographing) interesting buildings?
Am I acting suspicious when I notice a guard of some sort watching me doing the above, and that I am curious as to how he might react to my perfectly harmless activities in these highly paranoid times?
Re: (Score:3, Funny)
Re:Suspcious People? (Score:4, Funny)
Re: (Score:3, Funny)
What about people wearing Baclava or some other sort of head covering?
Well let's be completely honest here, anybody wearing a delicious Greek pastry [wikipedia.org] on their head while trying to fly under the radar has already blown it in a big way.
As far as other head coverings [hatsofmeat.com] go, I still think you want to stick with the ones that aren't food-related ... you know, the idea is to blend in.
Re: (Score:3, Insightful)