81% of 'Suspects' Flagged By Met's Police Facial Recognition Technology Innocent, Independent Report Says (sky.com)
Four out of five people identified by the Metropolitan Police's facial recognition technology as possible suspects are innocent, according to an independent report. From a report: Researchers found that the controversial system is 81% inaccurate -- meaning that, in the vast majority of cases, it flagged up faces to police when they were not on a wanted list. The force maintains its technology only makes a mistake in one in 1,000 cases -- but it uses a different measurement to arrive at this conclusion. The report, exclusively revealed by Sky News and The Guardian, raises "significant concerns" about Scotland Yard's use of the technology, and calls for the facial recognition programme to be halted. Citing a range of technical, operational and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court.
Re: (Score:2)
As someone from the US I knew exactly what Met Police meant. It's not a term that is used in the US to describe anything but UK police.
Re: (Score:2)
As someone from the US I knew exactly what Met Police meant. It's not a term that is used in the US to describe anything but UK police.
"Scotland Yard" is also infrequently used to describe American cops.
Re: (Score:2)
As someone from the US I knew exactly what Met Police meant. It's not a term that is used in the US to describe anything but UK police.
Untrue, the Met Police are the a-holes who harass you if your Mets' ball cap brim is improperly bent. ;-)
Re: (Score:2)
Those were probably Yankees hazing you.
Re: (Score:2)
As someone from the US I knew exactly what Met Police meant. It's not a term that is used in the US to describe anything but UK police.
The D.C. Metro Police would beg to differ with you.
This is just the base rate fallacy. (Score:2)
It doesn't matter how low the false positive rate of the *test* is; if the true positive rate in the *population* you're screening is lower than that, your hits are going to have more false positives than true positives.
So you can do a really impressive-looking demo for a system; in fact the technology may be legitimately impressive without it being good enough for screening *everybody*. That's why drug tests are a crock.
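To make this concrete, here is the standard Bayes calculation (a sketch using the usual textbook symbols, not figures from the Met report). The share of hits that are genuine, the positive predictive value, depends on the base rate p as much as on the test itself:

$$P(\text{suspect} \mid \text{flagged}) = \frac{\text{TPR} \cdot p}{\text{TPR} \cdot p + \text{FPR} \cdot (1 - p)}$$

When p is far below the FPR, the denominator is dominated by the false-positive term and most flags are innocents, no matter how good the TPR looks in a demo.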
Re: (Score:2)
It doesn't matter how low the false positive rate of the *test* is; if the true positive rate in the *population* you're screening is lower than that, your hits are going to have more false positives than true positives.
Yep, and apparently no-one ever explained it to them. They should be forced to read Data Mining for Terrorists [schneier.com]
Re: (Score:2, Insightful)
The alarmist headline is targeted at the statistically incompetent, which are 93.71% of the population.
81% false positive is VERY GOOD. If this system narrows the suspect pool in a population of a million down to five people, one of whom is the perp, then that is good enough. Because four of those people will have no connection to the victim, while the fifth guy will be her ex-boyfriend with a restraining order. The police should be able to figure it out from there.
Bayesian statistics [wikipedia.org] should be taught in school.
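For anyone who wants to see the million-to-five arithmetic run, here's a minimal sketch (the sensitivity and false positive rate are illustrative assumptions, not numbers from the report):

    population = 1_000_000
    true_suspects = 1                      # the perp is somewhere in the crowd
    sensitivity = 1.0                      # assume the system never misses (optimistic)
    false_positive_rate = 4 / 1_000_000    # four-in-a-million lookalike rate

    true_hits = true_suspects * sensitivity
    false_hits = (population - true_suspects) * false_positive_rate
    flagged = true_hits + false_hits

    print(f"people flagged: {flagged:.0f}")                      # ~5
    print(f"innocent share: {false_hits / flagged:.0%}")         # ~80%

Five people flagged, four of them innocent: exactly the 80% "failure" rate in the headline, produced by a system with a one-in-250,000 false positive rate.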
Re: (Score:2)
If only policing worked that way. But in practice, the first suspect found is often the only suspect investigated.
Re: This is just the base rate fallacy. (Score:1)
One thing I might caution is that a Chicken Little device placed in the hands of police can still be an effective way of gathering information. "Oh, our camera alerted on you... we'll need a DNA sample to clear this up. Spit here." In this case the cops wouldn't care if the device was 100% incorrect and alerted on everything (including fire hydrants). The goal isn't correctness; it's an excuse to violate yet more civil liberties. It can be just as effective as the drug dog that conveniently alerts at every traffic stop.
I concur (Score:5, Insightful)
Came here to say this same thing. The results are phenomenally good. It means that instead of one in ten thousand people the cops look at being a criminal, now one in five is. Stunningly useful, and a boon to society, not an intrusion.
Ironically, it's good that the false positive rate is higher than the true positive rate. Why? Well, imagine there was only one false positive in 100 true positives. The cops would assume you are guilty no matter how much you protested; that innocent person would have a hard time. But when it's the reverse, and 4/5ths of the people flagged are innocent, the cops know that it's more likely they are dealing with an innocent person than a guilty one. And because their workload is orders of magnitude less with such a good screening system, they might even have the time to suss out the guilty party.
Not bad.
Re: (Score:3)
The cops would assume you are guilty no matter how much you protested.
I think that's going to happen anyway. There's no way that cops or prosecutors will understand the base rate fallacy; even doctors seldom understand how truly unreliable even a very good test can be.
Re: (Score:2)
imagine there was only one false positive in 100 true positives. The cops would assume you are guilty no matter how much you protested.
Good point.
Re: (Score:2)
Re:I concur (Score:5, Insightful)
IF the cops keep in mind that there is an 80% chance the flagged person is innocent, it can be quite helpful. But that's a BIG if. There is a long and sordid history of police deciding they 'like someone' and becoming blind to anything exculpatory.
Re: (Score:2)
Re: (Score:2)
The more excuse you give them to 'like' someone, the more they will do it.
Re: (Score:2)
I'm not talking about framing someone. I'm talking about shoddy investigation and failing to let go of non-viable theories. Unlike tinfoil-hat theories, this is known to have happened on multiple occasions.
Re: (Score:2)
Not when the tool is 'scientific' and has 80% false positives.
Re: (Score:2)
IF the cops keep in mind that there is an 80% chance the flagged person is innocent, it can be quite helpful. But that's a BIG if. There is a long and sordid history of police deciding they 'like someone' and becoming blind to anything exculpatory.
80% of the flagged people being innocent is even more useful. These are the law-abiding, who will not violently resist violations of their rights, and who are prime targets for civil asset forfeiture.
Re: (Score:2)
Not all cops are American.
Re: (Score:3)
What actually happens is that the cops use "the computer says it was you" as an excuse to harass innocent people, maybe even pin crimes on them. The Metropolitan Police have form on that.
When it's your innocent ass in jail for 3 months.. (Score:2)
...based on a false positive, [nytimes.com] will you still think the same? That woman nearly gave in and made a false confession to get out of lock up.
Re: (Score:2)
Came here to say this same thing. The results are phenomenally good. It means that instead of one in ten thousand people the cops look at being a criminal, now one in five is. Stunningly useful, and a boon to society, not an intrusion.
It will sure be a boon to civil asset forfeiture.
Ironically, it's good that the false positive rate is higher than the true positive rate. Why? Well, imagine there was only one false positive in 100 true positives. The cops would assume you are guilty no matter how much you protested; that innocent person would have a hard time. But when it's the reverse, and 4/5ths of the people flagged are innocent, the cops know that it's more likely they are dealing with an innocent person than a guilty one. And because their workload is orders of magnitude less with such a good screening system, they might even have the time to suss out the guilty party.
That does not stop civil asset forfeiture.
Not bad.
No, it will be all bad, at least in the US.
Not a fan but I get it (Score:5, Interesting)
Re: (Score:3)
Re: (Score:1)
The poorly trained and overly empowered jackboots responsible for running the system will not be as reasonable as you propose.
Re: (Score:2)
To understand the issue with this you have to understand a bit of the history of the Metropolitan Police. The short version is that they were described in an official investigation into their failing to properly investigate the murder of a black man as "institutionally racist".
What that means is that they have systemic issues that cause bias against non-white people. Not just individually racist officers, but problems with their procedures and the systems and techniques they have in place to investigate crime.
We have a live version of a well-known conundrum. (Score:4, Informative)
There is a new testing method X, which is able to find a certain illness Y with 99 percent confidence.
The same testing method will yield a false positive result in one of 1000 cases.
The illness Y you are screening for appears once within 1,000,000 people.
If everyone gets screened and you are tested positive using X, how probable is it that you are really sick with Y?
No matter how impressively exact the face recognition method is for a single case, because of the sheer number of people screened, your false positives will always swamp your true positives.
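Working the stated numbers through Bayes' theorem answers the question directly:

$$P(Y \mid +) = \frac{0.99 \times 10^{-6}}{0.99 \times 10^{-6} + 0.001 \times (1 - 10^{-6})} \approx 0.001$$

So even with a 99% sensitive test and a one-in-a-thousand false positive rate, a positive result means only about a 0.1% chance of actually having Y.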
That... doesn't make any sense (Score:1, Flamebait)
If the system works at all the false positive rate would go down as more data is input into the system.
OTOH the system is just as likely to be one of these [wikipedia.org]. An excuse for cops to establish probable cause and search your belongings.
Since drugs are illegal at some level in most jurisdictions and 1 in 5 smoke weed, it's pretty easy to get to a 20% "accuracy" rate.
Re: (Score:2)
Re: (Score:1)
Undoing fat fingered bad mod
Re: (Score:2)
Sadly I think the solution is going to be more data. The UK has a population of 66 million, so if there's a 0.1% error rate, that's 66,000 people who look kinda like you; even if you're looking for a local criminal in a small town, those odds are just bad. However, through cell phone records etc. we probably have a good idea of who was actually in the vicinity of this camera; if you're the police, maybe you can set up an IMSI catcher for this specific purpose, and otherwise 5G cells are real tiny by default. Paired with the face match, that could narrow the list down to a handful of people who were actually there.
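As a back-of-the-envelope sketch of that intersection idea (all numbers are illustrative assumptions, not real Met or carrier data):

    import random

    population = 66_000_000
    lookalike_rate = 0.001     # 0.1% of the country trips the face matcher
    phones_nearby = 1_000      # devices a 5G cell / IMSI catcher saw at the scene

    people = range(population)
    lookalikes = set(random.sample(people, int(population * lookalike_rate)))
    present = set(random.sample(people, phones_nearby))

    candidates = lookalikes & present
    # Expected overlap: 66,000 * 1,000 / 66,000,000 = 1 person on average
    print(f"candidates to investigate: {len(candidates)}")

Two weak filters that each leave tens of thousands of false positives combine, if they're independent, into a shortlist of about one.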
Re: (Score:2)
Would not these be immensely beneficial for law enforcement and the common good of the general public?
No. It would destroy society and lead to inevitable genocide.
Jury rigged? (Score:3)
it flagged up faces to police when they were not on a wanted list.
Presumably the false positives are people who "looked like" a criminal.
Which seems to be quite a prevalent view for juries, when deciding the fate of an accused person.
But as a first-level filter, it seems to me that this thing is scanning thousands - possibly millions - of faces and just reporting the "probables". Since this is then fed to a second line of actual people, who validate or reject the computer's flagging, there are both the necessary safeguards and a huge saving in police time, from not having to scan all those millions of faces manually. I don't think anyone is suggesting the computer gets all these false positives arrested.
Re: (Score:2)
Re: (Score:2)
Given someone was fined for hiding their face from a scanner, I think there's cause to be concerned.
Did you not spot the Slashdot post earlier today about the challenges someone faced trying to opt out of facial recognition at an airport?
However, I did try to hunt down some actual statistics, and it does look like very few people are inconvenienced by being falsely flagged:
https://bigbrotherwatch.org.uk... [bigbrotherwatch.org.uk]
Re: (Score:2)
They could miss a flight at the airport, they could be detained for questioning at a workplace, embarrassed in front of their families; this could be used as a common excuse for profiling and detaining anyone without other reason, etc.
Yes, but fewer people will be subject to such inconveniences compared to the current manual process.
Re: (Score:2)
Now you are stuck having to go through the courts to get all of the images and the records of your questioning removed from the databases.
In the UK? No fucking chance. The police repeatedly refuse to obey the law on PII.
Re: (Score:1)
Indeed, consider a system that was 90% accurate. Which policeman would waste time checking to see if an identification was correct? No, it would be "You're nicked! The computer says so." A system that's wrong 80% of the time _requires_ manual intervention, and that makes it safer.
Probable cause (Score:3)
With marijuana still illegal at some level in most places, and with 1 in 5 using it, getting to a 20% "accuracy" rate isn't exactly hard.
Re: (Score:1)
This is why I've advocated that police must log their probable cause BEFORE inspection, and that must be the ONLY thing they can detain/ticket/arrest a subject for. I.e., if it's a broken taillight, then that is the only infraction that can be ticketed, not a seatbelt violation or a DUI. If they log a probable cause that cannot be seen from a distance, then it should get thrown out without the defendant having to raise the issue.
Re: (Score:2)
I pointed this out elsewhere on the thread, but there is a good chance this is just an excuse to establish probable cause for searches.
I hope you got modded down elsewhere on the internet for this as well since a computer identifying your face when a police officer can see that you're a different person isn't probable cause. Not even in the USA where laws about this are far laxer.
Oh, I guess you were too busy ranting to notice we're not talking about the USA, right?
Re: (Score:2)
Do you know Google Images still mistakes black people for apes?
Well, you have to click through to the second page of the image search result for "apes" to see some black people. But yes, they still do show up!
As They Should Be (Score:3)
I'm sure most people that come to the police's attention the old fashioned way are also innocent.
After all the vast vast majority of people are innocent of serious wrongdoing so identifying the few bad eggs means sifting through a haystack for a few needles. Anyone who remembers their statistics should know that even super accurate tests produce mostly false positives if the base rate of true positives is low enough.
This isn't any reason to distrust the technology. Merely a reason to understand that it isn't proof of guilt, just a good way to generate leads. The real danger here is misunderstanding by the public in criminal trials. People need to understand that being selected as a bad guy by a system, even one with very high accuracy, can still mean you're most likely innocent.
Re: (Score:2)
I'm sure most people that come to the police's attention the old fashioned way are also innocent.
On Monday a man called 999 (the emergency services number in the UK) and asked for the police, requesting that they come and.. actually, I'm not sure. I don't know if he wanted them to arrest me or just stop me from doing what I was doing.
I kept doing what I was doing. They must've told him to fuck off, no police came. But if they had shown up then I would indeed have come to their attention while innocent.
Mayfield, Shwartz, Tankleff & Cotton Candy (Score:2)
You seem to be urging calm when freaking out is actually a better reaction.
And people need to understand that innocent people are crushed by the system every day, which means staying out of the system is a priority, which means opposing bad practices like faulty facial recognition and the pseudoscience of bite marks. [criminallegalnews.org] In 2004 Brandon Mayfield [wikipedia.org] had his fingerprints falsely matched to the Madrid train bombing and spent two weeks in custody before the FBI admitted the error.
Being innocent ... (Score:2)
That fact that the suspect turns out to be innocent doesn't prove the identification is in error. It seems to me that both pro and anti are using misleading statistics here. I have concerns that this technology should be subject to proper controls and scrutiny but I don't agree with using dishonesty or misrepresentation to get to that position.
The article is certainly wrong in its claim "A recent BBC documentary captured an incident where a man was fined after refusing to take part in a facial recognition trial." He was fined for a public order offence after swearing at the officers, not for declining the scan.
Re: (Score:2)
He was an idiot for swearing at the police.
He was an even bigger idiot for accepting the fine.
https://www.telegraph.co.uk/ne... [telegraph.co.uk]
Innocent? Of everything? (Score:2)
Haul them in anyway and we'll make something stick.
"Show me the man and I'll find you the crime." - Lavrentiy Beria
Alert: Statistical Innumeracy in new report (Score:2)
These results could actually be quite impressive. The question should be: how many images/people were scanned, and what is the base rate of genuine suspects in the scanned population? If, for example, 1% of the people scanned are truly suspects, then the software is definitely doing much better than random.
Basically, TFA doesn't say how much better the software does compared to random selection, so one can't really assess the effectiveness of the software based on the sensationalist article.
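For what it's worth, the lift over random is easy to compute from the figures that were reported, assuming (hypothetically, as above) a 1% base rate:

$$\text{lift} = \frac{\text{precision}}{\text{base rate}} = \frac{8/42}{0.01} \approx 19$$

i.e., a flagged person would be about 19 times more likely to be a genuine suspect than someone picked at random; whether that justifies the intrusion is a separate question.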
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Depends on how it is used (Score:2)
Being questioned has a large cost - in time, in risk of exposing real or imagined crimes etc.
If there were a way to add this information to other information to reduce the number of innocent people who are questioned, it could be a win. OTOH if it becomes relied on, or leads to more biased questioning ("sure, we questioned every black guy in the shopping mall - we're not biased, the computer just told us to"), then it could be very negative.
Re: (Score:2)
Re: (Score:2)
Your honor, even though I am on camera in the area of the crime, have a history of committing crimes, was convicted of a similar crime, and they have iron clad proof I am guilty; I challenge their use of facial recognition to obtain suspects for the crime!
Will you challenge human facial recognition using the same camera data?
Just Think... (Score:2)
Facial recognition is not, and won't be for some time yet to come, ready for real world conditions, especially ones that put people at risk, despite what the marketing weasels will say to try and sell stuff.
I know at one large company I worked at, we were always going over and yelling at them for lying to customers! It's not just that they didn't know; they could always call us on an inside line and get an immediate answer.
Re: (Score:2)
Let the police have their new tools.
Use the tech in areas with increasing crime rates.
A nice map/GUI of the crimes and the faces detected.
Are the people seen criminals? The "police" have humans to look over each "detection" of a face.
Slowly the area is made safe and fully gentrified as the criminals are detected.
Add in gait analysis and it gets more advanced. On a smartphone while doing crime? Get a voice print.
Slowly every criminal is detected using human review and
Re: (Score:2)
Why only 6 of the 10 trials?
Why trials instead of actual usage data? Why only 42 hits? How many possible people were scanned? In a group of 50, that would be bad. In a group of 50,000, you have 42 possible bad actors, of which 8 were correctly identified.
What was the false negative rate?
Re: (Score:2)
If its false positive rate is 81%, it's rather likely that its false negative rate is about the same.
That does not follow at all. In fact, there is an inverse relationship between false positive and false negative rates. Basically, without increasing the accuracy of the algorithm, you would generally tradeoff between type 1 and type 2 errors.
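A minimal sketch of that tradeoff, assuming (purely for illustration) that impostor and genuine match scores are normally distributed:

    from math import erf, sqrt

    def normal_cdf(x, mean, sd):
        """P(score <= x) under a normal distribution."""
        return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

    IMPOSTOR_MEAN, GENUINE_MEAN, SD = 0.30, 0.70, 0.10  # made-up score model

    for threshold in (0.4, 0.5, 0.6):
        fpr = 1.0 - normal_cdf(threshold, IMPOSTOR_MEAN, SD)  # innocents flagged
        fnr = normal_cdf(threshold, GENUINE_MEAN, SD)         # suspects missed
        print(f"threshold {threshold:.1f}: FPR {fpr:6.2%}  FNR {fnr:6.2%}")

Raising the threshold drives the false positive rate down and the false negative rate up; you slide along the tradeoff curve, and only a better algorithm moves the whole curve.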
Do you smell cherries, because I smell cherries. (Score:2)
evaluated the technology's accuracy at six of the 10 police trials. They found that, of 42 matches, only eight were verified as correct
Insignificant sample size during some of the trials rather than actual usage data. I smell cherry picking.