Shadow Analysis Could Spot Terrorists 245
Hugh Pickens writes "An engineer at Jet Propulsion Labs says it should be possible to identify people from the way they walk — a technique called gait analysis, whose power lies in the fact that a person's walking style is very hard to disguise. Adrian Stoica has written software that recognizes human movement in aerial and satellite video footage by isolating moving shadows and using data on the time of day and the camera angle to correct shadows that are elongated or foreshortened. In tests on footage shot from the sixth floor of a building, Stoica says his software was indeed able to extract useful gait data. Extending the idea to satellites could prove trickier, though. Space imaging expert Bhupendra Jasani at King's College London says geostationary satellites simply don't have the resolution to provide useful detail. 'I find it hard to believe they could apply this technique from space,' says Jasani." Comments on the article speculate on the maximum resolution possible from KH-11 and KH-12 spy satellites.
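The shadow-correction step the summary describes is, at its core, trigonometry: on a flat ground plane with a known sun elevation, a shadow's length relates to the height of whatever casts it by the tangent of the elevation angle. A minimal sketch of that idea (the function name and the flat-ground assumption are mine, not from the article):

```python
import math

def shadow_to_height(shadow_len_m, sun_elev_deg):
    """Recover the height of the object casting a shadow, assuming
    flat ground and a known sun elevation angle.

    At 45 degrees the shadow length equals the object height; at lower
    sun angles the shadow is elongated, so tan(elevation) < 1 shrinks
    it back to true scale -- the same correction applies to the moving
    shadow of a walker before extracting gait data.
    """
    return shadow_len_m * math.tan(math.radians(sun_elev_deg))
```

For example, a 1.8 m shadow at 45° sun elevation implies a person about 1.8 m tall, while the same person at 30° elevation casts a shadow over 3 m long.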
Upon deployment.... (Score:5, Insightful)
Go ahead, develop more technology; there's always a way around it.
Geostationary? (Score:5, Insightful)
Who puts a spysat in geostationary orbit? It's way too high, you'd need a telescope that dwarfs Hubble to get a decent view. You put spysats in the lowest orbit you can get away with, and you make sure that you have enough of them that any target of interest will be covered frequently enough for your purposes.
Defeated (Score:5, Insightful)
Defeated by a simple 2-inch lift in one shoe.
ok (Score:2, Insightful)
Wonderful euphemism (Score:5, Insightful)
I like how yet again ``spotting terrorists'' is a euphemism for ``spotting everyone else, too''. I habitually substitute ``spotting YOU'' and honestly think about all the good it would do.
Yes, there's useful stuff in there, but again only if those watching you can be trusted. This has been said often enough before and still people score cheap headlines with the same fallacy.
Anybody spot the shadow of a flying pig yet?
Solution in search of problem (Score:5, Insightful)
So... um, you've got a "terrorist" under tight enough surveillance that you can build a "gait profile", but you're not arresting or just outright executing them?
Admittedly, I support this effort. Once complete, the DHS can take its rightful place as the Ministry of Funny Names and Walks.
c.
Comment removed (Score:3, Insightful)
How about wheelchairs? (Score:3, Insightful)
If I ever do something where I'm concerned about Big Brother watching me with their eyes in the sky, I'll just use one of the following:
Bike
Wheelchair
Skateboard
Electric wheelchair
Segway
Re:Well, that's easy. (Score:3, Insightful)
Re:What could possibly go wrong? (Score:3, Insightful)
That's terrorist talk. The correct way to say it is 'Our Predator drone blew up a terrorist cell and 25 collaborators'.
Re:Defeated (Score:4, Insightful)
A simple long skirt or coat would also do a good job of keeping your gait out of your shadow.
I am thinking there is more to this gait analysis than watching the shadow of your legs. Think of your whole body and how it moves. Think about your arm movements, your head movement and your overall body movement, things that don't change easily with bulky clothes. Does your gait sway you slightly from side to side, do you keep one foot in front of the other... there are a lot of factors in this, and I imagine the very smart people who came up with this have already thought about the very things Slashdotters are suggesting here.
And sexologists can... (Score:2, Insightful)
Yes, but that still doesn't answer his questions (Score:5, Insightful)
Yes, but that still doesn't answer his objections.
Let's say you've got nothing to hide and are in the database, and then you get an abscess in your foot. I had one, for example, thanks to some badly designed shoes which did that much damage, and it got infected. Next thing I knew, my walking style could have belonged in a "ministry of funny walks" sketch, except for me it was more painful than funny.
Would I suddenly be outside the database, and thus a suspect, in that scenario? Or what if they entered a criminal in the database when he had a similar injury, and then I have a similar injury two years later and suddenly I look like the re-appearance of Abdallah ibn Jihad, wanted for arson, genocide and jaywalking in East Bumfuckistan and Elbonia? (Made-up name, btw. It means, IIRC, slave of Allah, son of jihad; enough to get your average anti-terrorist spook's panties in a knot by itself.)
It's not like you can choose when you'll have such an injury.
What is the degree of confidence in such an identification, anyway? How finely can you slice a gait and still leave room for normal daily variation? (E.g., account for stuff like: today I'm feeling chipper and walk a lot livelier, while yesterday was a shitty day and my walk probably reflected that. Or: today I walk on grass in the woods, yesterday I was walking on wet concrete, and a month ago I was walking on sand at the beach.) As they say, "if you're one in a million, there are 6,000 exactly like you." Will it be able to positively identify said Abdallah ibn Jihad, even when he's walking uphill through the snow with a pebble in his boot, or will it be more like "it's one of 6,000 people, one of whom is Abdallah ibn Jihad"? And that's the number even if it could positively and unerringly distinguish between one million different gaits.
Bullshit and strawmen? (Score:5, Insightful)
Bullshit and strawmen, whether intentional or not. The objections to false positives have more to do with statistics than with slippery slopes or anything else.
E.g., let's say I have a system which can look at photos from the security cameras and tell you if a face or gait in the crowd matches a terrorist profile with 99% accuracy. (Which is actually a lot higher than what most of these snake-oil systems achieve.) The problem in that case isn't that it lets 1% of the terrorists go. It's that it also creates 1% false positives out of people who aren't terrorists, for just one terrorist's photo. Apply that to just one airport, say JFK, with its almost 60,000 passengers per day. If you get exactly one photo of each passenger, that's 600 false positives per day, in just one airport, for just one terrorist. But more likely everyone will be caught by several cameras during their trip through the airport, so the number multiplies accordingly.
Now feed it a database of several tens of thousands of known criminals, suspects, etc, and watch the number of false positives explode. Given that accuracy, just 100 photos are enough to match a majority of the passengers at one point or another.
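The arithmetic above holds up; here is a quick check, assuming a flat 1% false-positive rate per passenger per watchlist profile and independence between profiles (both simplifications):

```python
passengers_per_day = 60_000   # rough JFK figure from the parent comment
fp_rate = 0.01                # "99% accuracy" -> 1% false positives

# One watchlist profile: expected false alarms per day at one airport.
fp_per_day = passengers_per_day * fp_rate          # 600

# 100 profiles: chance a given innocent passenger trips at least one.
p_flagged = 1 - (1 - fp_rate) ** 100               # ~0.63, i.e. a majority
```

So with just 100 profiles, roughly 63% of innocent passengers would match something, which is the "majority of the passengers" claim above.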
At some point you can simply swamp the security with false positives, to the point where it's worse than useless.
And it's not just a hypothetical scenario, it's what airport security people themselves have said about previous trials with face recognition system. That they're crap and worse than useless. Would you accuse those too of being paranoid and slippery-slope types, or just accept that they probably know their job enough to know when a gizmo isn't helping it?
So basically spare me the bullshit about "nirvana fallacies" and "paranoid liberals". Learn what the real problem is before talking out of your arse about it.
Re:Upon deployment.... (Score:5, Insightful)
Actually, there is no shadow analysis. This is just one of dozens of counter-intelligence announcements meant to cause a response by the terrorists. DHS figures if terrorists are out taking dancing lessons, learning the bagpipes, practicing synchronized swimming, and growing herbs then they'll have far less time to make bombs and blow stuff up.
-Loyal
Re:Trying is the first step toward failure (Score:3, Insightful)
Fund research and development, yes. By all means. But also do some simple maths before actually doing it. If the maths says it doesn't work yet, don't be a dumbass, basically.
If you ended up paying settlements over something that you could have calculated on a napkin that it doesn't work... well... you have to ask yourself why.
Again, it has nothing to do with the Nirvana fallacy. Most things just have a minimum threshold below which they're useless. They don't have to be _perfect_, they just have to be above that threshold. E.g., you wouldn't trust a statistic that has a, say, 40% degree of confidence, because the chances of it being bogus are higher than those of it actually having a point. E.g., you wouldn't fly on a two-engine plane if each engine had a 95% chance of working, because then 1 in every 400 flights would have both engines die. And no airline would want it either, because, frankly, they don't even recoup the cost of the airplane in 400 flights. E.g., you wouldn't drive your car if it had only a 99% chance of getting you to the destination in one piece. Etc.
And sometimes "if at first you don't succeed, try and try again" is just a dumb idea. E.g., in skydiving ;)
And at any rate, heck, is there a problem with discussing the potential shortcomings here? I mean, best case scenario, the researchers have already thought of that and are working on a solution. Worst case scenario, someone goes "duh" and starts working on a solution. Sounds like win-win to me.
Theoretically everything is possible, but it depends. Sometimes using 5 flawed systems is actually worse than using just one.
At any rate, I have nothing against that idea. But, again, I'd expect someone to first prove that the composite approach delivers a usable degree of accuracy, and to run a small controlled trial, before it's used as more than a cute tech demo.
I don't think many of them are also savvy enough about these techniques to make a meaningful suggestion. Different domains, really. They complained about the hideous number of false positives, but I don't think I've read any actual suggestion from them as to how to improve the algorithm.
The rant about Obama and paranoia must have confused me.
Re:Upon deployment.... (Score:3, Insightful)
The mod probably misread, thinking that he modded it unsightly.