ACLU Examines Face-Recognition System 165
nate_drake and others wrote in about an ACLU report on face-recognition (PDF) (see also their press release and an MSNBC article). We've posted several previous stories about the Tampa police using face-recognition systems at the Super Bowl and on the streets of Ybor City.
ok... (Score:2)
all this means is the companies developing this stuff will have to improve their face-matching algorithm and then we'll all be back at square one.
given the demand for this stuff a la Sept 11, I don't think the companies will have a problem with motivation
Re:ok... (Score:5, Insightful)
all this means is the companies developing this stuff will have to improve their face-matching algorithm and then we'll all be back at square one.
It's worse than that:
How long until these companies start lobbying the gov't for mandatory inclusion of, say, license photos in the pool of database data so that people can be picked up as soon as they do something?
Re:ok... (Score:2)
I do agree, however, that the photo db is part of the problem. The system probably demands a very high-res pic in order to work properly. Most store/bank security cameras aren't high-res enough to be useful for FaceIt.
Building the db straight from a photo db of licensed drivers, visas, etc. won't happen, as even politicians are smart enough to realize that putting everybody in that db would create far more false positives than true positives.
OTOH, if someone is picked up by police without ID on them, that's where something like that would be helpful. But then again, such instances are rare, so justifying the costs of doing so would be difficult.
Re:ok... (Score:2, Funny)
Umm... That'd be great. I never look as bad as I do on my DL. I'm not even sure it's me.
Re:ok... (Score:2)
But in Tampa apparently they were after ordinary criminals. Most of these have been arrested and mug-shots taken multiple times. If that doesn't give them good enough pictures to work with, there's something seriously wrong in the mug shots, face-matching software, or both. I suspect that it's really the false-positives problem; the cops got tired of running over to look at the latest "bingo" when the pictures don't look at all alike to humans, and missed the few times when it really was the same guy.
Algorithm is the problem. (Score:2)
There is evidence that the face-matching is the problem, and that evidence is in fact the false positives, of which there were quite a number (fourteen over three days). Personally, I see false positives as a more damning failure than a lack of true positives.
There's a fundamental fact that is why both the lack of true positives isn't the problem, and why false positives are nigh-inevitable. That fact is that criminals are ridiculously less common than non-criminals. Even an extremely accurate and specific face recognition system is going to make more false positives than true ones. It's just an artifact of probability, which is why I'm against using these automated systems in the first place. That this particular system made so many false positives is indeed an indication that it sucks, however.
But humans don't make very many false positives, you might say. And that's true (though it does happen). We do have a very advanced image-recognition algorithm in our meat-computers. But that's not all. Think about whose faces you recognize. Have you only seen them in a single photograph? Or have you talked with them, seen their face from many angles and with many facial expressions? How many of those people would you recognize having only seen their driver's license photo? Even we, the world-renowned best human face recognizers ever, need some help to get the job done. Having a photo of everyone on earth won't fix this problem.
Basically, you are doomed from the start not only by probability, but also by the handicaps you're starting your system with.
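The base-rate argument above can be made concrete with a quick back-of-the-envelope calculation. The numbers below are invented purely for illustration, and are deliberately generous to the matcher:

```python
# Rough base-rate sketch with made-up numbers: even a very accurate
# matcher produces mostly false alarms when the targets are rare.
def alarm_breakdown(population, criminals, sensitivity, false_positive_rate):
    """Return (expected true positives, expected false positives) per pass."""
    true_pos = criminals * sensitivity
    false_pos = (population - criminals) * false_positive_rate
    return true_pos, false_pos

# 100,000 faces scanned, 10 wanted people in the crowd,
# 99% sensitivity and a 1% false-positive rate (both generous).
tp, fp = alarm_breakdown(100_000, 10, 0.99, 0.01)
precision = tp / (tp + fp)
print(f"true hits: {tp:.0f}, false alarms: {fp:.0f}, precision: {precision:.1%}")
```

Under these assumptions roughly ten real hits are buried in about a thousand false alarms, so fewer than 1% of the system's "bingos" are real even though the matcher itself is 99% accurate.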
Re: NOT OK (Score:2)
Re:ok... (Score:2)
Right now the best facial identifiers we have are people. Think about how many people you know and can identify by their face. Even people whose names you've forgotten but still recognize on sight. (The problems of mug shots and line-ups notwithstanding) under good conditions, human beings only very rarely make false positives indicating that a stranger is someone they actually know.
We have evolved a specific part of our brains specialized for identifying human faces, and I see no fundamental reason that computers couldn't someday obtain the same level of skill. Problem being that the computer will be infinitely better at remembering names and other personal details which we forget, which leads to a whole lot of problems.
I can't tell who I'm looking at . . . . (Score:3, Interesting)
But take this for an example:
Soon after Sept 11th one of my colleagues flew east on a business assignment. He was concerned because, as he said, "Did I forget to mention, I look like one of those terrorists?"
He did, a little. Darker complexion, goatee, curly hair, glasses. He looked more than a little like one of the suspects.
He's since shaved his goatee, so he no longer looks like that . . . but would this system have been able to differentiate?
I also seem to remember that one of the guys they arrested and incarcerated in Tampa turned out to be the wrong guy . . . .
Re:I can't tell who I'm looking at . . . . (Score:3, Interesting)
given what some people said in the MSNBC article, your friend would be fine
glasses, mustache, beards, longer hair, lighting, even turning 15 degrees were enough to foil the system.
Re:I can't tell who I'm looking at . . . . (Score:1)
Re:I can't tell who I'm looking at . . . . (Score:1)
Lie detectors. (Score:3, Interesting)
Soon we will have a camera on the market which is *AS* reliable as a polygraph test: wrong 25% of the time.
All of these things are a Bad Thing. These cameras will be used next to track you at the game [like the Bears? get calls at dinner about season ticket prices], at the bar [you like to go to bar!], at the strip club [the cops watch you now because they know you're a perv!]...
Just wait till this stuff goes all private!
Reminds me of Demolition Man
The upside of all this is: (Score:1, Insightful)
Pretty damn quick the powers that be will learn that people believe one thing Sunday morning and a completely different thing Friday night.
There's a reason Playboy and High Times stay in business -- although they don't freely admit it, people like sex and drugs. Just imagine our moral leaders' dismay at having their noses shoved in that nasty little fact with corroborating statistics. And the realization that you can't legislate human nature.
DMCA.... (Score:1, Funny)
Hang on, that's my face they are capturing, imaging, processing and then storing. Clearly I own broadcast rights to my own face and all of its characteristics.
If they match my face then they've reverse engineered my face into another format to circumvent its current storage mechanism.
Totally silly application of the DMCA... but then again it's a pretty silly bill.
you should totally start a lawsuit (Score:1)
I wonder if we could attack the fingerprint database too, since it would fall under the same category of reverse engineering your prints into another format and circumventing the current storage mechanism.
Heh, then imagine if this lawsuit were a class action lawsuit. To claim shares of the payout, criminals would have to come and get them. (Much like the classic tales of criminals being suckered out of hiding by letting them know they've won some sort of prize.)
salient points (Score:4, Informative)
Re:salient points (Score:2)
As for straight surveillance, the fact that abuses can be and are documented shows that the system CAN be monitored and the watchers CAN be watched.
Good auditing practices conducted by 3rd parties WILL keep the system (fairly) honest.
Re:salient points (Score:1)
No Problem...
Just change the Camera Lens [yahoo.com] and ask the guy.
;-)
Careful... (Score:2)
There must be a broader argument against what face-recognition technology is trying to do -- keep the populace under surveillance.
Re:salient points (Score:1)
Re:salient points (Score:2)
Hmm (Score:2)
-Pete
Re:Hmm (Score:2)
That protection probably only applies to those specific images, though, and not others, including any taken in public but certainly any taken while in police custody.
Still, it would be funny to see the look on the face recognition vendor's, well, face, when you hit him with copyright and DMCA violation lawsuits.
sPh
Hurt the ones you love (Score:2)
Re:Hurt the ones you love (Score:1)
I am a true believer in the right to privacy when we are on our own property, but when we are on someone else's property, private or public (because public property is technically owned by a government), the rights of the property owner should come before those of the visitor.
Re:Hurt the ones you love (Score:2)
(because public property is technically owned by a government)
A common misconception, and untrue (granted, it's true enough in a practical sense). The government administers public lands and holds them in trust for the public. Always remember, the government is YOUR servant, not the other way around. Public land is exactly that: public. You've heard of the Boston Common? So called because it's held in common by the residents of Boston.
Re:Hurt the ones you love (Score:2)
Legal rulings change. Our privacy is worth a rethinking of the situation.
Cameras are often useless (Score:3, Insightful)
One day, I came to the rack to find my lock cut and my bike stolen. The attendant refused to talk to the police. But that's OK, I had the perpetrator on camera.
After getting the tape from the building security people, I took it to a camera shop. We sped through it to find the point where, sure enough, you could see a guy walk up, try on my bike helmet, and ride off with the bike. Due to the camera's position, you could not see him cut the lock.
I say "him" because I am pretty sure that it was a male. That was all I could tell from the poor quality of the tape. I could not tell skin color, clothes, hair color, or enough facial features to recognize him.
I don't think the best AI applied to this image would have been able to do anything as far as facial recognition goes.
I wouldn't want to be the attendant working that booth. After they find his mangled corpse (a la Fargo), the police will tell his widow, "Sorry Ma'am, all we can say for certain is that they were in some sort of automobile. We think a sedan, but we don't know for sure."
Facial recognition is going to be even harder than this. As a programmer, you have two choices: go with an algorithm or try to use a neural network.
Most of the weaknesses in the algorithmic approach are what the ACLU document was complaining about.
A neural network may work if you are looking for a specific person. The problem is that to identify two things as being different, they need to be as orthogonal as possible. To separate the sea of faces into two groups, those we are looking for and those we aren't, based on a series of images, is going to be nigh unto impossible. Certainly not with the amount of computing power per camera that they would put into it.
Re: Cameras are often useless (Score:2)
A new generation of surveillance cams is starting to appear. There's the X10 junk at the low end, and megapixel camera / DVD-RW combos at the high end. Another year or two, and the midrange stuff will be replacing analog video.
Big Brother's vision is about to improve substantially.
Hmmm... I don't see the problem here (Score:5, Insightful)
But from the ACLU's press release, there was always a human step in the process, where a real live human being would examine each purported match before anybody got dragged off anywhere.
As such, all the face recognition software is is a _filter_, cutting down on the number of people a human agent must examine. Where's the problem?
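The "filter" framing amounts to a simple thresholding step. A sketch, with the capture IDs, scores, and cutoff all invented for illustration (real systems report some kind of match confidence):

```python
# Hypothetical sketch: the software scores each camera capture against the
# watch list and forwards only the high-scoring candidates to a human reviewer.
def prefilter(match_scores, cutoff=0.9):
    """match_scores: {capture_id: best_watchlist_score}. Returns IDs for human review."""
    return sorted(cid for cid, score in match_scores.items() if score >= cutoff)

# Thousands of faces get scanned; only a couple ever reach a human's eyes.
to_review = prefilter({"cam_017": 0.95, "cam_018": 0.42, "cam_019": 0.91})
```

The human agent then only has to eyeball the handful of captures in `to_review` instead of the whole crowd.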
After all, law enforcement officers have placed themselves in public places, looking for people they knew, for probably as long as there have been law enforcement officers.
A friend of mine was a sergeant in the British Army, and he did a few tours in Northern Ireland. Part of his training was memorizing the faces of a large number of known IRA "players" (and apparently the IRA did the same thing with British soldiers' faces).
How is this any different?
I guess I don't understand the ACLU's beef here.
DG
Re:Hmmm... I don't see the problem here (Score:2)
There are other ways this system may go easy on people.
It has to be worth the officer's time to go get the guy if it is indeed a match. Someone wanted for petty theft would be better off than someone wanted for first-degree murder.
It also depends on what else is currently happening on the cop's beat. If there's a hostage situation around the corner, the cop will have to wait another day to catch the guy wanted for murder one.
I had a rather long and drawn-out discussion with someone who just couldn't be convinced of this concept the last time this topic came up. Good to see other open minds out there.
Re:Hmmm... I don't see the problem here (Score:2)
If the computer has given you two potential matches regarding similar crimes, then do you pursue the black guy or the white guy? In the Tampa system, the operator had to zoom in on particular faces to run the software; do you believe that there was no profiling in whom he chose to look at? Will the operator be as good at eliminating false positives when the person is from a different racial group than he is?
Sure, there is a human layer that offers some protection. The machine will never decide guilt or innocence, but the human layer also diminishes many of the bias-neutral advantages that this technology is purported to have.
Re:Hmmm... I don't see the problem here (Score:2)
The bias I see, however, is much different from yours. To my knowledge, the system in Tampa scans/zooms/pans through crowds in whatever fashion it does, then notifies an operator of a possible match. Control of the camera gets switched over to the operator so they can have a look for themselves to verify the match and initiate further action if they see fit. Chances are good these operators are going to be minimum-wagers, not police officers, so the chance of bias is high (much like modern airport security), whether it be race/sex/whatever.
I guess a solution would be to have multiple operators in multiple locations. When a match comes up, present it to more than one person. Differences of judgment go to a supervisor or something for a decision. While still not perfect, it does introduce some level of checks and balances.
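The multiple-operator idea boils down to a majority vote with escalation. A sketch, where the workflow and return values are entirely hypothetical:

```python
# Hypothetical sketch of the proposal: several operators review each match
# independently, and a split decision escalates to a supervisor.
def review_match(votes, escalate):
    """votes: list of booleans from independent operators."""
    yes = sum(votes)
    no = len(votes) - yes
    if yes > no:
        return "confirm"
    if no > yes:
        return "reject"
    return escalate(votes)  # a tie goes to a supervisor

# Two of three operators agree, so the match stands without escalation.
decision = review_match([True, True, False], escalate=lambda v: "escalated")
```

Using an odd number of reviewers keeps escalations rare; with an even panel, every split decision lands on the supervisor's desk.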
Aside from all that, it has been theorized (not yet proven) by psychologists that people have difficulty identifying someone of another race. An example I can think of off the top of my head is an old 20/20 (or maybe Dateline?) story where a white woman was raped by a black man. In a police line-up, the woman picked out the suspect. DNA and other evidence set the man free without being charged. Several months later, that DNA evidence (stored in a DB of some sort) was used against someone caught in another rape case. Turns out that guy was present in the original line-up: the woman picked out the wrong guy. The only similarities the two men shared were height and skin color. Their faces were not especially similar.
Re:Hmmm... I don't see the problem here (Score:2)
Which would be the basis for selection bias.
I agree that bias may exist, but it would exist anyhow, and there's not much that can be done to prevent it other than firing the person if they're caught.
Do you fire the person for being biased even if they don't realize that they are being so? There is a lot of research showing how people can have preferences that they aren't consciously aware of.
Re:Hmmm... I don't see the problem here (Score:2)
FaceIt® will automatically detect human presence, locate and track faces, extract face images, perform identification
I don't see "The operator will manually control the camera's positioning and focus." Seems to me the ACLU got it wrong in a big way. From what I've seen on TV about this technology, this is how it works in Tampa, and has for years in the UK.
As for firing people over bias, naturally there would be a "corrective action period" where the person is told they're being biased, how that was determined, and to shape up.
Re:Hmmm... I don't see the problem here (Score:2)
Also, the ACLU found that they identified a total of zero (0) criminals in Tampa with this system, so it's also a waste of money....
Re:Hmmm... I don't see the problem here (Score:2)
An excellent point.
Consider that there's a reason why the INS requires "immigration-style" (3/4 portrait, ear showing) photographs, as opposed to "driver's-license-style" (full-face, straight-on) photographs.
While the technique of using video cameras (shitty NTSC resolution, possibly at long range) in Ybor City doesn't work when your database is full of crappy photos, the technique of using high-resolution still cameras, at close range, at centralized locations such as airport security checkpoints, and cross-referencing those photographs with photographs taken at a known distance and angle, is probably eminently workable.
Indeed, I'd say a good first step would be to change the procedures for a driver's license photo to include both shots. The traditional face-on one could remain in use by the cop on the street, and the non-traditional 3/4 portrait could go into a faceprint database for automated pattern-matching software.
Finally, consider that replacing the Ybor City model of omnipresent (yet inaccurate to the point of uselessness) cameras ogling the world with a system in which only a few (yet far more accurate) cameras are placed at points where identification is required in the first place (such as aircraft jetways or border crossings) is likely to be both less invasive to privacy and far more effective at nabbing Bad Guys.
Re:Hmmm... I don't see the problem here (Score:2)
Re:Hmmm... I don't see the problem here (Score:3, Insightful)
A similar argument was once used in regards to lie detectors: Since there is a failsafe mechanism (a human) involved, the chance of abuse is lessened considerably. However, history has shown this not to be the case, and law enforcement frequently trusts the machine over the man. I doubt that this technology will be much different, at least in this respect. It is much easier to trust the machine and haul someone in for questioning.
Further, the use of these machines at airports means that people who potentially don't even have a high school diploma will be operating them. From me, at least, that fails to inspire much confidence.
Re:Hmmm... I don't see the problem here (Score:1)
But humans can't determine with their natural senses whether or not someone is lying, so if the machine catches an otherwise undetectable signal (i.e., blood pressure rising, pulse quickening), that's when we take its word over a human's. However, I heartily agree with the person you're responding to... these things are filters that reduce the number of candidates from 200,000 to 20. A person then comes in, looks at the picture and compares it to the real person. Heck, I don't care if security escorts me out into the hall briefly... I'm at a public event where they suspect someone may be endangering lives; I'll take 2 minutes out of my life to show them I'm not Arab or carrying a bomb.
I am a liberal. This means I read books and have opposable thumbs.
Frankly, I think this one sentence speaks volumes more than the rest of your post...
--trb
Re:Hmmm... I don't see the problem here (Score:5, Insightful)
Humans are not very good at detecting lies, and can in fact get very good at certain specialized forms of deception, even in the face of equally specialized deception-detectors. (I'm thinking here of professional poker players, who make their living deceiving and detecting deception in a very specialized manner.)
So a lie detector is an attempt to augment an ability.
But humans are VERY good at facial recognition, much better than any machine is. The limitation here that the machine is trying to address is one of storage capacity (memory) and speed, not ability per se.
I make the assumption here that the interface would display the picture of the person that it thinks the person in question matches alongside the picture from its camera (or print one for the officer to carry with him for an in-person comparison) so that the vastly superior human facial recognition abilities could be brought into play.
So, really, I don't see any reason to get upset. It's no different than if an officer saw your photo on a "wanted" poster in the post office, and then made the match from memory, only now the "memory" is much larger.
When an arrest warrant is cut, your driver's license is flagged in the DMV computers. If you get pulled over (for any reason) and your license is checked against this database, they get you. How is this any different?
DG
Re:Hmmm... I don't see the problem here (Score:2)
foiling the cameras (Score:1)
"Osama's been spotted. Take him out, over."
"I guess he didn't stay in that cave very long, over."
-G
Freedom vs. Security (Score:1)
"Clean underwear? Why?!?"
Re:Freedom vs. Security (Score:1)
As far as I am concerned this would be fine if you were not talking about American citizens. You want to track foreign nationals when they come into our country to make sure they are here doing as they said they would when they crossed our border.
The bill of rights is based on the concept of the rights of an individual human being.
The concept is that a bunch of equal human beings banded together to provide for their common defense. If we feel these are inalienable human rights, then anyone who falls under the umbrella of the US Constitution, and by extension the US government, should be guaranteed those rights.
Everyone.
The right to a fair trial is to ensure that, as best we as fallible human beings can determine, such and such a crime was committed at such and such a location. Remove that, and we'll have witch hunts.
Re:Freedom vs. Security (Score:2)
Who said anything about that?
Indeed, the more accurate the technology gets, the less likely the software is to make a false positive.
And supposing you are flagged by a false positive -- the cop who walks up to you and eyeballs you is likely to say "No, the software's wrong", and let you move on.
And finally, suppose the cop also makes the same mistake. (Suppose, in the worst case, that you have a twin brother...) Well, sucks to be you. But then your lawyer subpoenas the various agencies to check the audit trail of your movements (security camera footage, face-idents, cell phone logs, IP addresses from Carnivore's logs of your /. postings), which demonstrate pretty conclusively that you were in East Bumfuck, Wisconsin at the time the crime took place in Los Angeles, California, and you're off the hook, with charges dropped.
Police love it, everyone else hates it (Score:1)
The reason that I point this out is that I have to wonder what is going on behind the scenes. We have a technology that is unproven at best and most likely unworkable. Yet we see police chiefs risking their jobs to get it installed. Is it the kickbacks or something more sinister? Probably kickbacks, but I would really like to know for sure.
It's biggERING (Score:2, Insightful)
How OOP Can Solve Face-Recognition Problems (Score:1)
{
face.is_terrorist = TRUE;
police.arrest(face.parent);
police.beat(face.parent);
police.eatDonuts();
}
Re:How OOP Can Solve Face-Recognition Problems (Score:2)
> face.is_terrorist = TRUE;
> police.arrest(face.parent);
> police.beat(face.parent);
> police.eatDonuts();
Wait a minute, if they beat your face in well enough the first time you get falsely-identified, you won't be recognized the second time.
I'd say this tech will stop police face-beatings, not encourage 'em ;-)
Where do you draw the line? (Score:1)
Why do we allow the local convenience store to film us, yet we seem to have a problem with our government doing the same thing? Is it the database of faces that scares us? What if K-Mart used facial recognition to associate our purchases with our face? Would it be wrong for them to have televisions throughout the store interactively suggest purchases?
Don't get me wrong, I like retaining my rights,
but in this case, I feel that the rights of law enforcement agencies are being restricted.
Re:Where do you draw the line? (Score:1)
There isn't nearly as much of a chance for someone to be wrongfully accused, and even if they are accused of something like shoplifting, it's a simple process to clear them of the charge in the store. Sure it's embarrassing, but not nearly as big a deal as if the police take you away for verification. If you are picked up on a camera by the government as a criminal, I am willing to bet that when they track you down it won't be nearly as simple a process as what you'd experience at K-Mart.
As for K-Mart using facial recognition to suggest purchases: I think it's an awful idea, but for reasons other than those associated with the loss of privacy.
Re:Where do you draw the line? (Score:1)
As for K-Mart, I really don't think I'd care too much for them tracking my face for personalized ads. But then, I don't like it when companies keep track of that much information on me in general.
Re:Where do you draw the line? (Score:4, Insightful)
Well, technically they can. But the public, including that rent-a-cop in the convenience store, can only detain a person 1) if they witnessed a crime and 2) to turn them over to a sworn police officer at the earliest possible time. If either piece is missing, you can nail them for "false arrest." This is an important thing to remember if you're ever (wrongly) accused of shoplifting - demand a real cop, *now*, to either arrest you or release you. If they refuse to call the cops... life will soon get *very* interesting.
Even those bounty hunters have limited rights. They can detain someone who signed the bond papers, but there are some well-documented cases where the bounty hunters were prosecuted for kidnapping after detaining the wrong person and failing to exercise due diligence in verifying the identity of that person.
But sworn police officers can detain people even if the officer didn't witness a crime. They can detain people even if there's no witnesses at hand, e.g., if they reasonably believe that the person is the subject of an arrest warrant issued by another jurisdiction.
The cost of a false positive in a convenience store is minimal. They think you're a shoplifter because of their face recognition software? Fine, you walk away and shop at another store where they're more careful with their accusations.
But a false positive with a police officer may have you arrested, at gunpoint, and detained for hours or days until you can prove that you aren't the escaped mass murderer you resemble.
(IANAL, but this is stuff that should be required knowledge for anyone walking around on the street!)
The difference (Score:1)
Face scanning as performed by the police then differs in two ways: (1) It is being performed by public servants. (2) It is potentially done in any public spaces.
If you don't want to be on a store security camera, don't go to the store. If you don't want to be in the police face-scan database, don't ever go out of your house.
See the difference?
face recognition has better uses (Score:2, Interesting)
I think a much better use would be in an access-control situation-- flag someone as suspicious if their face is NOT in the database.
It isn't too difficult to socially engineer your way into a building if you don't work there. But if employees were required to look at a camera for a split second before passing by the security desk, it would be much easier to identify those who don't belong (especially in large companies).
No, I don't think face recognition is accurate enough yet, but there are legit and non-privacy-invading uses for this technology.
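The inverted, access-control use described above (flag faces that are NOT enrolled) could be sketched like this. The similarity function and threshold are placeholders for whatever score a real matcher provides:

```python
# Hypothetical sketch: compare each capture at the security desk against
# enrolled employee templates and flag anyone with no sufficiently close match.
def flag_strangers(captures, enrolled, similarity, threshold=0.8):
    """Return the captures that match no enrolled template above threshold."""
    strangers = []
    for face in captures:
        best = max((similarity(face, tmpl) for tmpl in enrolled), default=0.0)
        if best < threshold:
            strangers.append(face)
    return strangers

# Toy similarity for illustration only: exact string match.
exact = lambda a, b: 1.0 if a == b else 0.0
unknown = flag_strangers(["alice", "mallory"], ["alice", "bob"], exact)
# "mallory" matches nobody on the employee list and gets flagged
```

Note that this use inverts the base-rate problem: nearly everyone walking past the desk IS enrolled, so a false alarm (a legitimate employee flagged) is a minor annoyance handled by the guard, not an arrest.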
Interesting result, odd conclusion (Score:2, Insightful)
But I'm not sure how that leads to their conclusion, "the technology does not deliver security benefits sufficient to justify the Orwellian dangers that they present." If the stuff doesn't work, then it doesn't present any dangers.
At least, not any more dangers than the police officer standing out there looking for people. I'm not sure that the ACLU would grant police that right, actually, since it is a violation of what they seem to consider an absolute right of privacy.
In my opinion, a society will always exchange some amount of liberty for security. For example, I don't have the liberty to shoot people. The boundaries will always be contentious, but it seems to me that this sort of absolutism does not make for a reasonable discussion of where the boundaries are most appropriately set.
Re:Interesting result, odd conclusion (Score:3, Insightful)
The ACLU is saying that the system doesn't work, so there is no good reason for deploying lots of cameras. (Actually they are saying more than that, but that's the part that's easy to agree with.) In any case, it's clear that the surveillance capacity presents a danger even if the intended purpose for the system doesn't work.
Re:Interesting result, odd conclusion (Score:1)
While I appreciate and commend the ACLU's work protecting me from overzealous police, the antagonistic tone may be counterproductive. I'd like to believe that the police would say, "Gee, this doesn't work, let's wait a few years and try again" rather than try to use it as cover for a genuinely Orwellian plot.
Perhaps I'm naive, but as long as this is framed in terms of the ACLU wanting absolute privacy, and the police depicted as the agents of Big Brother, it's unlikely that we'll actually find any sort of useful compromise.
Mind you, I live in Prince George's County, Maryland, where the police _are_ actively engaged in covering up abuses, as documented in The Washington Post. So perhaps I'm deliberately ignoring ugly truths rather than being simply naive.
One-sided argument (Score:2, Insightful)
Just because the technology failed in their "one month study" doesn't mean it's not a success. How many criminals could be walking down the streets of NYC right now without anyone even noticing them? I'm sure there are plenty! Even if you have the person's face plastered on every telephone post, could you, with 100% accuracy, point your finger and say "THAT IS THE GUY IN THE PHOTO!"? I know I couldn't. How many times have people misidentified criminals... seeing someone and saying it looked like someone else... then when the police did look into the suspect, found he or she was not guilty? This happens all the time. So to say that the technology is flawed because it picked up innocent people is just dumb.
The other point made in the MSNBC article was that the system failed to flag someone when there was a 15-degree variance between the stored image and the person being compared... or when the lighting situation was different. Unless you know someone and see them daily, can YOU tell a person by their profile view when you only have a picture of their face? No, you can't. If it was dark, could you tell? No.
Another point, that the system was gender-biased... well, I know several people who could go either way, and if you were to look at someone with long hair and lipstick, wouldn't you consider them female? All the guys who accidentally pick up drag queens in NYC every day surely couldn't tell the difference between male and female!
The ACLU is making lame points and has no real evidence. Then again, I haven't seen any university studies or other non-biased studies on this situation. The ACLU claims Tampa police have abandoned the technology, which seems to be an all-out lie. It's all a bunch of bullshit, and that single item is the proof.
Re:One-sided arguement (Score:4, Insightful)
So I would say it is definitely worth it to fight very hard against bad technology right from the start.
sPh
Re:One-sided arguement (Score:2)
The fact the polygraph has been used for 70 years shows that it is an effective means of collecting evidence. If it wasn't, they wouldn't still use it.
Re:One-sided arguement (Score:3, Insightful)
If you read up on the history of the polygraph, this is clearly what is going on. Faced with a choice of admitting that they don't know who the heck committed a crime, or using a demonstrably useless machine, investigative agencies (particularly the FBI) have been using the useless machine for 70 years. And have now made such a culture of using it that they dare not admit to anyone, particularly themselves, that it just doesn't work.
sPh
Re:One-sided arguement (Score:2)
sPh
Re:One-sided arguement (Score:1)
What, exactly, does this mean? If it's wrong 60% of the time, couldn't you just use the opposite answer from the polygraph test and have it be wrong 40% of the time instead?
Some clarification would be helpful.
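(A quick sanity check on that arithmetic, as a toy simulation with made-up numbers. Caveat: a real polygraph's "wrong 60% of the time" mixes false positives and false negatives with very different base rates, so simply flipping the answer wouldn't actually rescue the instrument.)

```python
import random

random.seed(1)

# Hypothetical yes/no test that answers correctly only 40% of the time.
truth = [random.choice([True, False]) for _ in range(10_000)]
test = [t if random.random() < 0.40 else not t for t in truth]

# Agreement rate of the test, and of the test with its answer inverted.
accuracy = sum(a == b for a, b in zip(truth, test)) / len(truth)
inverted = sum(a != b for a, b in zip(truth, test)) / len(truth)
print(accuracy)  # roughly 0.40
print(inverted)  # roughly 0.60 -- just use the opposite answer
```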
--Tom
Re:One-sided arguement (Score:2)
sPh
Maybe a compromise... (Score:2)
Re:Maybe a compromise... (Score:1)
sousveillance [wearcam.org]
the guy has been at it a long time, and has come up with some interesting approaches from both a technical and a social perspective. the 25 MB video file shot inside Sears had me both laughing on the floor and scared shitless. "Call security" takes on a whole new meaning...
taking action isn't difficult, but the powers that be want you to think so.
hope that helps.
Boon to Ellison & Co. (Score:4, Interesting)
This facial recognition has a near-zero Hit rate and a high false-positive (Type III error???) rate. The false-positive rate is a killer because it may cause system operators to miss a Hit (true positive). So what do we end up with? An authoritarian tool that is completely worthless.
Meanwhile, the failure of this project can be a selling point for Larry Ellison's proposed National ID card system. Perhaps the streetlamp cameras in Ybor City will soon be replaced by turnstiles manned by undereducated, undermotivated, understimulated, minimum-wage-earning Security Engineers (read: displaced airport security screeners) checking each person's National ID card. These people probably won't be able to grasp the concept of Type II/III errors; thus the implementation of the National ID Card will suffer from the same problems as the facial recognition system.
In summary the two vendors will profit substantially from their products--which won't make the public any safer--and we will be eased into acceptance of the mercantilist authoritarian police state.
Wanna see something funny? (Score:3, Interesting)
For more on McDonald's over-hyping of the emperor's new security blanket, see this article from the Fresno Bee [fresnobee.com].
Sample quote: "This breakthrough technology makes us the safest airport in America," Mayor Alan Autry said.
Re:Wanna see something funny? (Score:2)
Ha! Funny pic. Now I wonder what is the difference--to a camera--between a 2-D rendering and a 3-D face? The image of ibn Laden on the photo paper is a reflection (bad pun...) of the lighting, f-stop, and other conditions determined by when/where/how the picture was taken. An image taken of a person walking through the airport security checkpoint will be subjected to the same conditions, unique to the location. There will be some difference--to the camera--between a real-life ibn Laden walking through the checkpoint and a man of different stature walking through with an image of ibn Laden in front of his face.
The Pelco system must use some very loose logic to indicate a probable Hit. This will lead to higher Type II/III errors and greater mistrust of the system. Not before Pelco profits outrageously, though.
Excerpt from ACLU report (Score:1)
Re:Excerpt from ACLU report (Score:2)
Texas Sex Offender Database.
Revocation of right to vote for (some?) felons.
Re:Excerpt from ACLU report (Score:1)
The system is only "part" of the solution... (Score:2)
To me this is just like predicting weather. All weather stations (SW & HW) out there do is "predict" the weather; humans then jump in to validate the "prediction". To me, this system is no different.
Granted, it still needs a lot more work, but wasn't the first weather station in the same state when it was first introduced? Let's give this thing some time before killing it off.
The ACLU is looking for a reason to discredit the system; they need to do a better job than spending two months reviewing one unit which was used only 2 days a week.
This is not great news... (Score:4, Insightful)
Remember the good ol' days of th'Internet, when no one could possibly track you and anonymity was technologically "guaranteed"?
My point is, that arguing the TECHNICAL weaknesses of this, or any other privacy-infringing item/product/software/etc. will only result in TECHNICAL innovations that make it more effective.
We must argue the LEGAL weaknesses - the 4th amendment. We need to argue that no person waives their constitutional rights simply by the virtue of entering a commercial, travel, or other legal relationship with any other entity. (unfortunately, I fear we lost this one a long time ago)
We need to argue against clickwrap agreements, and their cousins:
Our legal rights are important. The details of whatever technology the FBI, CIA, or any other no - such - agency uses in an attempt to violate those rights, are less so.
Don't just frustrate their efforts. Fight them head-on!
Run the Slashdot Reaction Subroutine (Score:2, Funny)
What method will they use... (Score:2)
Why does everyone think I'm a troll? (Score:2)
ACLU is fighting the wrong fight. (Score:2, Insightful)
Pardon the bad grammar, and most likely the spelling; I just woke up and I'm hung over.
Re:ACLU is fighting the wrong fight. (Score:2)
Re:ACLU is fighting the wrong fight. (Score:2, Interesting)
Slashdotters are weird in that they have double standards. If strong crypto can be misused by criminals, that's OK, because there is a legit reason for the existence of such a technology. But if this camera system were misused by the government, that's bad, even though there is a legit reason for the technology. And in the event it was misused by storing your picture permanently (WHICH IT DOES NOT), there would be a very, very large public outcry, and it would be political suicide to support such implementations.
Leftists always mean well but the cold hard facts of reality and the real world have yet to be realized by them.
Re:ACLU is fighting the wrong fight. (Score:3, Interesting)
We should hold public servants to a (much) higher moral standard than we hold private individuals, and there should be significant oversight. I also don't agree that there IS a legitimate use for widescale monitoring of innocent, private people - one of the (theoretical) keystones of our legal system is that people are innocent until they are PROVED guilty, and that it's better to let 100 guilty people go than convict 1 innocent person. The modern corporate/government culture of assuming people will be criminals unless you stop them cuts at the foundations of our society.
Re:ACLU is fighting the wrong fight. (Score:2)
So the worst case scenario is more like (i) false positive on facial recognition system (ii) put in jail over a 3-day weekend, where you are beaten and raped a few times (iii) prosecutor decides you look like someone wanted for 3 outstanding child molestation warrants (iv) fast railroad to hell.
If you don't believe me, make friends with any big-city cop. After about 15 years, when he starts to trust you, he will tell you about some of the things that really go on in the stationhouse.
sPh
Re: ACLU is fighting the wrong fight. (Score:1)
And you're also ignoring the problem of false positives, i.e., innocent bystanders being recognized. Do you want to be denied access to some place, or have an hour or two of your time wasted by the police, because the computer thought you looked like someone in its database?
Dramatic changes (Score:3, Interesting)
How will this software be able to truly establish that it's me, if my photo in the database shows me 40 pounds lighter and wearing contact lenses, while I walk around with my glasses on now?
I also wear false teeth so I can change my jawline at will.
Come and get me!
Camera worked for me! (Score:5, Interesting)
I put a visible non-working video camera pointed at the street and our local crack-house, and their business collapsed. The house in question was rented by a slum-lord to the lowest bidder and the drug traffic was driving me nuts. I made a good show of the camera and suddenly, the 'customers' were a bit leery. The druggies soon moved out and were replaced by a rather nice poor family.
I was set to get the camera working, but its presence alone was enough. Highly recommended. PS: I removed the camera once it became obvious that the new tenants were cool.
Face Recognition vs. Cameras (Score:3, Insightful)
According to the report, it almost doesn't work at all. Other installations using face recognition have degenerated into checking out the girls. I understand; it's gotta be boring as can be after the first 50,000 false alarms.
It seems to me that this software isn't really an invasion of our privacy. The cameras themselves may be, but if we accept the cameras, we can't really quibble about the face recognition software.
It's time to reconsider our concepts of privacy, anyway. Read David Brin's The Transparent Society [kithrup.com] and see if you don't come up with a new view.
I think the cameras should be everywhere... especially the police station. And we should all be allowed to watch them. It would certainly make everyone think twice about their biases before taking drastic action.
Doublemint (Score:1)
[ot] i didn't imagine a beowulf cluster of them... (Score:2)
the one question they didn't ask, however (Score:2, Interesting)
But the one question that wasn't asked or answered in this piece is very important -- did the crime rate in that area drop during that period? Because if it did, the face recognition system may actually be working, and not seeing criminals may be a GOOD thing!
If the system actually deters crime, and criminals know about the system being in place, it wouldn't be unexpected that criminals would choose to actively avoid that area, leading to a lower detection rate! Did they consider that?
Easy Test (Score:2, Insightful)
Or, it might find that some people believed to be different people are actually the same person.
A Little Reminder On Stats (Score:3, Interesting)
Assume 90% accuracy and a watch list of 10 criminals. If 100,000 people walk past the cameras, a 10% false-positive rate means roughly 10,000 alarms that mean nothing. Of the 10 actual criminals (assuming they all walk by!), you'll get 9 true alarms and 1 gets away. So you have to fend off about a thousand errors for every real suspect. At that point, human error could easily step in and operators will assume every alarm is just another false one.
This is the same objective reason racial profiling is wrong. You spend all your time and resources harassing innocent people.
Of course, a cheap solution to this problem is multiple camera angles.
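(The base-rate arithmetic above, spelled out; the 90% / 10-criminal / 100,000-person numbers are the hypothetical ones from the post.)

```python
def alarm_breakdown(accuracy, watchlist_size, crowd_size):
    """True and false alarms for a naive screening model."""
    true_alarms = accuracy * watchlist_size                      # criminals flagged
    false_alarms = (1 - accuracy) * (crowd_size - watchlist_size)  # innocents flagged
    return true_alarms, false_alarms

hits, noise = alarm_breakdown(0.90, 10, 100_000)
print(hits)                 # 9.0 criminals caught, 1 slips through
print(round(noise))         # ~10,000 innocent people flagged
print(round(noise / hits))  # ~1,100 false alarms per real suspect
```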
Iron fists with velvet gloves (Score:3, Interesting)
Re:Iron fists with velvet gloves (Score:2)
sPh
Face Rec @ 100% no false positives. (Score:2, Interesting)
Face recognition is the least intrusive of the Big Brother security technologies, especially when you take into consideration that scanning a crowd for a certain face will not work. Let's be real here, kids: some companies are simply lying about what their programs can do.
What about my right to my likeness? (Score:2, Insightful)
In theory, under time-tested requirements for media recording, they cannot use my likeness without my permission.
Yet they are.
I did not grant usage to them, when I walked on the sidewalk. I did not sign a contract granting them usage in a film or TV show.
They stole my copyright!
-
Heard of soundex? (Score:3)
FaceIt, it appears, is an overhyped technology that isn't anywhere near ready for prime time. And knowing what it's up against, I don't think it ever will be.
Re:Why Slashdot Sucks (Score:1)
kudos