Police Use of Facial Recognition Violates Human Rights, UK Court Rules (arstechnica.com)
An appeals court ruled today that police use of facial recognition technology in the UK has "fundamental deficiencies" and violates several laws. Ars Technica reports: South Wales Police began using automated facial recognition technology on a trial basis in 2017, deploying a system called AFR Locate overtly at several dozen major events such as soccer matches. Police matched the scans against watchlists of known individuals to identify persons who were wanted by the police, had open warrants against them, or were in some other way persons of interest. In 2019, Cardiff resident Ed Bridges filed suit against the police, alleging that having his face scanned in 2017 and 2018 was a violation of his legal rights. Bridges, backed by UK civil rights organization Liberty, lost his suit in 2019, but the Court of Appeal today overturned that ruling, finding that the South Wales Police facial recognition program was unlawful.
"Too much discretion is currently left to individual police officers," the court ruled. "It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed." The police did not sufficiently investigate if the software in use exhibited race or gender bias, the court added. The South Wales Police in 2018 released data admitting that about 2,300 of nearly 2,500 matches -- roughly 92 percent -- the software made at an event in 2017 were false positives. The ruling did not completely ban the use of facial recognition tech inside the UK, but does narrow the scope of what is permissible and what law enforcement agencies have to do to be in compliance with human rights law. Other police inside the UK who deploy facial recognition technology will have to meet the standard set by today's ruling. That includes the Metropolitan Police in London, who deployed a similar type of system earlier this year.
"Too much discretion is currently left to individual police officers," the court ruled. "It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed." The police did not sufficiently investigate if the software in use exhibited race or gender bias, the court added. The South Wales Police in 2018 released data admitting that about 2,300 of nearly 2,500 matches -- roughly 92 percent -- the software made at an event in 2017 were false positives. The ruling did not completely ban the use of facial recognition tech inside the UK, but does narrow the scope of what is permissible and what law enforcement agencies have to do to be in compliance with human rights law. Other police inside the UK who deploy facial recognition technology will have to meet the standard set by today's ruling. That includes the Metropolitan Police in London, who deployed a similar type of system earlier this year.
so this? (Score:5, Insightful)
Re: (Score:3)
I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.
Re: (Score:2)
Well, for one thing, the Brits are often a test case for the US.
Re: (Score:2)
Re: (Score:2)
People remember the article but for some reason never thought to find out about the methodology.
Re: (Score:2)
Re: (Score:2)
That is what my memories of being approached by the police for access to my camera's recordings tell me too. They can request; nobody is required to comply.
And for private citizens, monitoring their own property. Often your FoV will extend beyond the immediate borders of your property, but if it'
Re: (Score:2)
I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.
I don't understand what you're trying to say here. Some people believed that a certain behavior was unlawful, took it to court, and eventually won. Just having read the summary, it seems like the rule of law prevailed and there are checks and balances that need to be followed. I'm not cynical enough to be surprised by that.
Re: (Score:2)
I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.
It's camera density per capita. The US leads the list with 15.28 cameras per 100 people, followed by China (14.36), UK (7.5), Germany (6.27). So at least the US finally beat China at something. The UK just gets a lot of heat over cameras because they were the first European country to really go to town with the camera surveillance state and they have always been at the forefront of the most authoritarian surveillance measures such as face recognition and fining people 90+ Pounds per instance of a citizen refusing to show their face to a FR camera:
https://metro.co.uk/2019/05/16... [metro.co.uk]
Which is also why it makes me laugh out loud every time some Tory git gives a speech that boils down to something like 'the freedoms that we enjoy compared to the Soviet communist police surveillance state'. Ninety quid for not showing your face to an FR camera, ... freedom, ... right! You stick a camera in everybody's face everywhere all of the time, track people's movements 24/7 and call it 'freedom'. Some freedom that.
How London compares to NYC, to Mexico City, etc., per capita would make more sense than comparing nationally.
In any country you can drive through fairly populated areas with few cameras; averaging those together with the dense urban areas is pretty useless.
Re: (Score:3)
And good for them for taking a step back from the brink! They were looking more and more like the world of "V for Vendetta". Things get better one step at a time.
Re: (Score:2)
And good for them for taking a step back from the brink! They were looking more and more like the world of "V for Vendetta". Things get better one step at a time.
That's a roger.
Re: (Score:1)
Re: (Score:3)
An area of private land in Kings Cross, London also used facial recognition for a period of time. That use was challenged, but the facial recognition was withdrawn after massive negative publicity before the case reached court. [theguardian.com]
The Met Police also ran facial recognition trials in London which attracted negative publicity, a
Re: (Score:2)
The City of London and London are two separate legal entities.
The City is even called a Tax Haven in some literature.
Re: (Score:2)
> In short, they've said they don't care about the court's judgement.
I'm no fan of this at all, but the judgement doesn't say they can't use facial recognition - more like "you need to have some better justification, and your watchlist needs to be more accountable/transparent" (and some stuff about racial/gender bias). The Welsh police can absolutely still use FR - they just need some better procedures, and to get the vendor to do due diligence on bias.
The thing that's annoying about this case is that apart
Re:so this? (Score:4, Informative)
You know the City of London is one square mile, right?
It's literally called that? The Square Mile.
And most of the cameras within it are privately-owned, nothing to do with the state?
So from this you have - like every other idiot - extrapolated that the whole country is blanketed in facial-recognition cameras all talking to the state and watching everything that everyone in the country does.
And what does "on a daily basis" mean? It's a camera. It's operational 24/7. Whether there's even someone LOOKING at it, that's highly unlikely, especially for a privately-owned camera.
This really is the dumbest extrapolation that I hear on a regular basis from Americans believing that the UK is something out of 1984.
The school I work in has 64 cameras, 82 if you count our other site. That other site is a nursery. Are we sitting watching it 24/7? No. Does it have face-rec? No, we're lucky if you can see a face. Why do we have it? Primarily to put people off breaking in. Did it cost a fortune? Not really. A drop in the ocean to any premises costs. The front gate that opens on a button costs more. The printed signposts out the front cost more.
Have we ever given the state access to the cameras? No. Have we ever used the footage from the cameras? Once or twice when we knew something had happened and just wanted the footage of how/who. Do we watch even one camera 24/7? Nope. Would we? Nope. We don't even have a "security" guy in that respect. That's why we have cameras - to deter people from trying.
Are we "in London"? You mean that 1,569 km Greater London? (Yes) Or the 28km "Central London"? Or the 2.9 km City of London?
And if you haven't noticed, the cameras that are police cameras, with face rec, and central storage and processing, are so good that they actually do WORSE than just picking a guy out of a crowd and seeing if he has weed on him. Copper's instinct literally trumps anything to do with the cameras, when you look at the stats. Per actual, convictable charge after arrest they cost hundreds of thousands of pounds. They're useless.
But you keep on. Because obviously I should be living in fear of a square mile of tourist attractions, where hardly anyone can afford to actually live, that has such expensive houses and businesses that they decide themselves to put up cameras, cameras that mostly do nothing more than record to a cheap NVR box in most instances.
Despite the fact that the 8 million people who live in London don't go anywhere near that square mile, in which only 8000 (very rich!) people live and have a lot of cameras to protect their personal property and in which the police put up cameras to watch the tube stations, pubs, main vehicular routes etc.
I've lived in or around London my entire life, and I can't say that I've ever noticed or cared about any significant camera presence in that time. Mostly because I'm someone who's fitted hundreds of them in my life and know they are mostly there "just in case" for private individuals.
I'm gonna guess that you've never been to London for more than a tourist visit. Ask most Londoners if they fear the cameras and they'll say "What cameras?"
Re: (Score:1)
The average Londoner is caught on camera 300 times a day: https://www.caughtoncamera.net... [caughtoncamera.net]
With an estimated 4-6 million total cameras around the country: https://www.bbc.com/news/uk-30... [bbc.com]
Re: (Score:2)
Walk past our nursery, that's 10 cameras that capture you in a few yards of walking, and that's without you going anywhere near our building.
Walk down a street of commercial premises all with 2-10 cameras covering their frontage, that's hundreds before you even start.
Again... your point?
Are they state-controlled face-rec cameras? Or are they cheap Chinese shit off Amazon that each shop owner has put on their property to stop you smashing up their frontage at 3am and getting away with it?
My front door has t
Re: (Score:1)
Re: (Score:2)
Generally speaking, of all the incidents I've had to deal with in my professional career:
5% of them have ANYTHING caught on camera (even break-ins, assaults, etc.).
About 2% of those have any kind of face showing, in any kind of capacity, such that you could identify the person if you already knew exactly who they were likely to be.
Less than 10% of those would provide any kind of identifying feature sufficient that you could correlate it to other cameras, etc. unless they were literally the only person on the street a
Re: (Score:2)
It's important to understand this ruling. It doesn't say the police can't use facial recognition and they have already said they will continue to do so. It says that the police have too much power to put anyone they like on the list of faces to be recognized, and that the police didn't bother checking to see if the technology discriminates against race or gender.
The police will probably commission some dubious studies to refute the existing ones showing that facial recognition is highly unreliable for black
Re: (Score:2)
It's important to understand this ruling. It doesn't say the police can't use facial recognition and they have already said they will continue to do so. It says that the police have too much power to put anyone they like on the list of faces to be recognized, and that the police didn't bother checking to see if the technology discriminates against race or gender.
Yep. There's nothing inherently wrong with using a watch list or a list of active warrants to match against facial recognition, as long as the person is not immediately arrested (but you could have a cop approach them and make contact to confirm identity). The main issue is that you need to have approved, standardized processes/criteria for putting people on those watch lists, and they have to be followed. You can't just throw someone on a watch list because their name is Patrick O'Brien, they have red h
Re: (Score:2)
So this ruling is from a country that says the cameras violate human rights, yet cameras aren't human...
Re: (Score:2)
Well, there definitely does need to be one major change to the use of facial recognition: it should never, ever be used to instigate automated police activity. What needs to happen is that laws are put in place to ensure facial recognition is only used as a guide for a competent and trained senior police officer, who checks the images themselves to make sure they are accurate, puts their name and responsibility (plus the consequences for getting it wrong) behind the match, and then initiates the police action
Re: (Score:2)
What "automated police activity"? I'll agree, that it should not trigger an automatic shooting of an identified suspect, but that no one is proposed that — indeed, it is not triggered even when a human officers recognizes a suspect's face.
So, can a positive match trigger an automatic notice to officers nearby?
Re: (Score:2)
This was not a ruling about privacy but about incompetence.
The issue appears to have been the fact that 92% of identifications were wrong. Acting on information that is wrong more than 9 out of 10 times is not acceptable.
If you came in and reported someone as the rapist you saw rape your sister last year, the police would be justified in arresting them. But if you had been wrong NINE times and then came in and insisted that this time you were right, the police should not arrest them. They barely have enough evi
Re: (Score:2)
You can look at
Re: (Score:2)
People have no right to privacy in a public place or in public photographs. This is nothing but protection of criminal acts in a public place.
A taxpayer-funded system was deployed that managed to rack up a whopping 2,300 false positives out of 2,500 matches, resulting in thousands being falsely identified and accused of a crime. Think being falsely accused of a crime always ends with "oops, my bad" rather than you having to spend money legally defending yourself? Not likely. That will be abused to generate legal revenue, and to target those who struggle to afford it.
From taxpayer waste to the wanton dismantling of privacy...nothing but bullshit excuses from s
False positives will dominate the results (Score:2)
If you have an error rate of as little as 1/4%, matching 23 people against each other (and that's important) gives you roughly a 50% chance of at least one false positive.
It turns out the number of individual comparisons is hugely larger than one expects. In the 23-person case the actual number is 23 × 22/2 = 253: you may recognize this as the example from the "birthday paradox", where 253 comparisons times a 1/365 chance per pair ≈ 0.69 (that's the simplification; the exact formula, 1 - (364/365)^253 ≈ 0.50, is more complex).
Let's assume 97% accuracy (all wh
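In code, the parent's arithmetic looks like this (a minimal sketch; the 23 people, the 1/365 birthday chance, and the 1/4% error rate are the illustrative numbers from above):

from math import comb

def p_any_false_positive(n_people, per_pair_error):
    # Every pair of faces gets compared: n * (n - 1) / 2 independent chances to err.
    pairs = comb(n_people, 2)
    return 1 - (1 - per_pair_error) ** pairs

print(comb(23, 2))                       # 253 pairwise comparisons
print(p_any_false_positive(23, 1/365))   # ~0.50 -- the classic birthday paradox
print(p_any_false_positive(23, 0.0025))  # ~0.47 -- the 1/4% error rate above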
Re: (Score:2)
Not "accused" (indicted), but "suspected".
Are human officers with reconstructed images from the "WANTED" poster back at the precinct any better at identifying the suspects? I doubt it.
Inevitable... We can argue whether the numbers are good enough or so bad that the system is impractical, but it is not illegal.
A mistaken arrest is not profitable, on the contrary...
Re: (Score:2)
Not "accused" (indicted), but "suspected".
Are human officers with reconstructed images from the "WANTED" poster back at the precinct any better at identifying the suspects? I doubt it.
A WANTED poster hanging on the Bumfuck, Nowhere Post Office wall won't land you on a Federal no-fly list. A human officer won't be the one denying your federal loan application because you're on a "suspect" list. Tell me why you think a "suspected" list won't be expanded considerably and used against you. And don't compare the old-fashioned 19th Century way of manual law enforcement to the mass automation we have in place in the 21st Century.
It is pure ignorance that welcomes Death by 1,000 Cuts.
Inevitable... We can argue whether the numbers are good enough or so bad that the system is impractical, but it is not illegal.
"What I
Re: (Score:1)
Neither would a positive match from the AI. Dismissed.
Except I didn't say "immoral" or "unethical". I said impractical. Strawman dismissed.
Clickbait headline (Score:4, Insightful)
The headline implies that the use of facial recognition itself violates human rights, while even the summary clearly states that it was the way that particular police department deployed facial recognition which, according to the court, needed more restrictions.
Next headline, "Use of H2O violates human rights, court rules!", further down in the article, "...when waterboarding suspects".
Re: (Score:2)
Your analogy is deeply flawed. Water is a necessity of life. Wholesale degradation of a law-abiding citizen's right to be left alone is not.
Re: (Score:2)
Re: (Score:2)
Ever hear of the 4th Amendment [aclu.org]?
Re: (Score:2)
Incorrect summary, facial recognition still legal (Score:4, Informative)
'The appeal succeeded on Ground 1, that the DC erred in concluding that SWP’s interference with Mr Bridges’s Article 8(1) rights was “in accordance with the law” for the purposes of Article 8(2): [...] there was no clear guidance on where AFR Locate could be used and who could be put on a watchlist. The Court held that this was too broad a discretion to afford to the police officers to meet the standard required by Article 8(2).'
'The appeal failed on Ground 2, that the DC erred in determining that SWP’s use of AFR was a proportionate interference with Article 8 rights under Article 8(2). The Court held that the DC had correctly conducted a weighing exercise [...] The benefits were potentially great, and the impact on Mr Bridges was minor, and so the use of AFR was proportionate under Article 8(2).'
'The appeal succeeded on Ground 3, that the DC was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The Court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient.'
'The appeal failed on Ground 4, that the DC was wrong to not reach a conclusion as to whether SWP had in place an “appropriate policy document” within the meaning of section 42 DPA 2018.'
'The appeal succeeded on Ground 5, that the DC was wrong to hold that SWP complied with the PSED. The Court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy will have a discriminatory potential impact. SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds. The Court did note, however, that there was no clear evidence that AFR Locate software was in fact biased on the grounds of race and/or sex.'
So the appeal was (partly) granted mostly because of procedural stuff: the police hadn't done their homework; that's what Grounds 1, 3 and 5 are about. The plaintiff lost on Ground 2, the most crucial one, which actually is about human rights and whether or not facial recognition can be used by the police in general. The court states in clear terms that the use of this technology is proportionate.
Re: (Score:3)
So in this case I think it's mostly to ensure that the thing doesn't flag large numbers of minorities as persons of interest when
Re: (Score:3)
we're catching too many wanted criminals of one sex / race, time to stop!
Not at all. What they said was that this software falsely "recognised" too many innocent people, which is obviously bad, and that nobody checked if it recognised more people of one race incorrectly than of another. If members of one race are more likely to be incorrectly identified and investigated, that's something that needs fixing.
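To make that concrete, here is a minimal sketch of the kind of audit the court said nobody ran: comparing false-positive rates across demographic groups. The group names and all counts below are made up for illustration, not figures from the SWP trial:

scans = {
    # group: (innocent faces scanned, faces wrongly flagged as matches) -- invented data
    "group_a": (40_000, 110),
    "group_b": (40_000, 310),
}

for group, (scanned, false_flags) in scans.items():
    print(f"{group}: false-positive rate {false_flags / scanned:.2%}")

# A gap like this (~0.3% vs ~0.8%) would mean one group gets wrongly
# flagged and stopped far more often -- exactly what the bias check is about.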
Re: (Score:1)
Strange racial bias (Score:2)
However, their face recognition managed to "recognise" more black faces. "Recognise" in quotes rather than recognise, because they had 90% false positives. There are two possibilities: 1. They have many more black people on their wa
Re: (Score:2)
Apple doesn't use the regular camera for Face ID. You can cover the front-facing camera with a privacy cover and the iPhone will still recognize your face just fine, because Face ID relies on the separate infrared TrueDepth sensor system.
Makes life easier (Score:1)
There are worse things than this (Score:1)