
Police Use of Facial Recognition Violates Human Rights, UK Court Rules (arstechnica.com)

An appeals court ruled today that police use of facial recognition technology in the UK has "fundamental deficiencies" and violates several laws. Ars Technica reports: South Wales Police began using automated facial recognition technology on a trial basis in 2017, deploying a system called AFR Locate overtly at several dozen major events such as soccer matches. Police matched the scans against watchlists of known individuals to identify persons who were wanted by the police, had open warrants against them, or were in some other way persons of interest. In 2019, Cardiff resident Ed Bridges filed suit against the police, backed by the UK civil rights organization Liberty, alleging that having his face scanned in 2017 and 2018 violated his legal rights. Bridges lost his suit in 2019, but the Court of Appeal today overturned that ruling, finding that the South Wales Police facial recognition program was unlawful.

"Too much discretion is currently left to individual police officers," the court ruled. "It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed." The police did not sufficiently investigate if the software in use exhibited race or gender bias, the court added. The South Wales Police in 2018 released data admitting that about 2,300 of nearly 2,500 matches -- roughly 92 percent -- the software made at an event in 2017 were false positives. The ruling did not completely ban the use of facial recognition tech inside the UK, but does narrow the scope of what is permissible and what law enforcement agencies have to do to be in compliance with human rights law. Other police inside the UK who deploy facial recognition technology will have to meet the standard set by today's ruling. That includes the Metropolitan Police in London, who deployed a similar type of system earlier this year.

  • so this? (Score:5, Insightful)

    by arbiter1 ( 1204146 ) on Tuesday August 11, 2020 @09:25PM (#60392177)
    So this ruling is from a country that has cameras ALL over the city of London watching people on a daily basis, yet only this violates their human rights...
    • by Tanaka ( 37812 )

      I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.

      • Well, for one thing, the Brits are often a test case for the US.

        • Strange. Here the perception is that America is where things are tested before being deployed anywhere important. What else are colonies for?
      • Because of a somewhat hysterical article some years ago that counted the cameras in shops on two streets, multiplied that by the number of businesses, made up a number of cameras owned by private businesses, and reported it as an estimate of the number of security cameras in the UK.

        People remember the article but for some reason never thought to find out about the methodology.
        • by q_e_t ( 5104099 )
          And in some instances those cameras were so poor that you could distinguish a human from a giraffe but not do any facial recognition. Some were probably not even working. It's likely that the cameras are cheaper and better now, and more likely to be working, but mostly they are there for shops in case something happens, and they are rarely monitored in any meaningful way. The police might request footage when there is a crime, or at least that is what my watching of dramas tells me :). But then they can also b
          • The police might request footage when there is a crime, or at least that is what my watching of dramas tells me :).

            That is what my memories of being approached by the police for access to my camera's recordings tell me too. They can request; nobody is required to comply.

            but mostly they are for shops in case something happens and rarely monitored in any meaningful way.

            And for private citizens, monitoring their own property. Often your FoV will extend beyond the immediate borders of your property, but if it'

      • I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.

        I don't understand what you're trying to say here. Some people believed that a certain behavior was unlawful, took it to court, and eventually won. Just having read the summary, it seems like the rule of law prevailed and there are checks and balances that need to be followed. I'm not cynical enough to be surprised by that.

      • I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.

        It's camera density per capita. The US leads the list with 15.28 cameras per 100 people, followed by China (14.36), the UK (7.5), and Germany (6.27). So at least the US finally beat China at something. The UK just gets a lot of heat over cameras because they were the first European country to really go to town with the camera surveillance state and they have always been at the forefront of the most authoritarian surveillance measures, such as face recognition and fining people 90+ pounds per instance of a citizen refusing to show their face to a FR camera.

        • I have been to cities all over the globe. Cameras everywhere. I don't know why London is singled out.

          It's camera density per capita. The US leads the list with 15.28 cameras per 100 people, followed by China (14.36), the UK (7.5), and Germany (6.27). So at least the US finally beat China at something. The UK just gets a lot of heat over cameras because they were the first European country to really go to town with the camera surveillance state and they have always been at the forefront of the most authoritarian surveillance measures, such as face recognition and fining people 90+ pounds per instance of a citizen refusing to show their face to a FR camera:

          https://metro.co.uk/2019/05/16... [metro.co.uk]

          Which is also why it makes me laugh out loud every time some Tory git gives a speech that boils down to something like 'the freedoms that we enjoy compared to the Soviet communist police surveillance state'. Ninety quid for not showing your face to an FR camera... freedom... right! You stick a camera in everybody's face everywhere all of the time, track people's movements 24/7 and call it 'freedom'; some freedom that.

          How London compares to NYC, to Mexico City, etc., per capita would make more sense than national figures.

          In any country you could drive through fairly populated areas with few cameras; averaging those in with the dense urban areas is pretty useless.
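          For concreteness, the per-capita metric in the grandparent is just cameras divided by population. A minimal sketch in Python, with made-up city-level numbers (purely hypothetical) illustrating why city-by-city comparison can diverge from a national average:

```python
# Cameras per 100 residents, the metric used in the grandparent post.
# The city figures below are invented placeholders, purely to show why
# per-city density can diverge sharply from a national average.
def cameras_per_100(cameras: int, population: int) -> float:
    return 100 * cameras / population

places = {
    "Dense urban area (hypothetical)": (600_000, 9_000_000),
    "Rural region (hypothetical)": (50_000, 5_000_000),
}
for name, (cams, pop) in places.items():
    print(f"{name}: {cameras_per_100(cams, pop):.2f} per 100 people")
# A national average blends these extremes, which is the parent's objection.
```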

    • by lgw ( 121541 )

      And good for them for taking a step back from the brink! They were looking more and more like the world of "V for Vendetta". Things get better one step at a time.

      • And good for them for taking a step back from the brink! They were looking more and more like the world of "V for Vendetta". Things get better one step at a time.

        That's a roger.

    • Isn't it a good thing? I mean, it's just cameras all over the city, not inside your home. Why did it become a problem for some people? And if something happens, it's for the best.
    • London has the 'ring of steel', privately financed by the City of London, a consortium of (largely finance-related) companies. But CCTV is not facial recognition.

      An area of private land in King's Cross, London also used facial recognition for a period of time; this was challenged, but the facial recognition was withdrawn after massive negative publicity, before the case reached court. [theguardian.com]

      The Met Police also ran facial recognition trials in London which attracted negative publicity, a
      • by DrSkwid ( 118965 )

        The City of London and London are two separate legal entities.

        The City is even called a tax haven in some literature.

      • > In short, they've said they don't care about the court's judgement.

        I'm no fan of this at all, but the judgement doesn't say they can't use facial recognition - more like "you need better justification, and your watchlist needs to be more accountable and transparent" (and some stuff about racial/gender bias). The Welsh police can absolutely still use FR - they just need better procedures, and they need the vendor to do due diligence on bias.

        The thing that's annoying about this case is that apart

    • Re:so this? (Score:4, Informative)

      by ledow ( 319597 ) on Wednesday August 12, 2020 @02:24AM (#60392747) Homepage

      You know the City of London is one square mile, right?

      It's literally called that? The Square Mile.

      And most of the cameras within it are privately-owned, nothing to do with the state?

      So from this you have - like every other idiot - extrapolated that the whole country is blanketed in facial-recognition cameras all talking to the state and watching everything that everyone in the country does.

      And what does "on a daily basis" mean? It's a camera. It's operational 24/7. Whether there's even someone LOOKING at it, that's highly unlikely, especially for a privately-owned camera.

      This really is the dumbest extrapolation that I hear on a regular basis from Americans believing that the UK is something out of 1984.

      The school I work in has 64 cameras, 82 if you count our other site. That other site is a nursery. Are we sitting watching it 24/7? No. Does it have face-rec? No, we're lucky if you can see a face. Why do we have it? Primarily to put people off breaking in. Did it cost a fortune? Not really. A drop in the ocean to any premises costs. The front gate that opens on a button costs more. The printed signposts out the front cost more.

      Have we ever given the state access to the cameras? No. Have we ever used the footage from the cameras? Once or twice when we knew something had happened and just wanted the footage of how/who. Do we watch even one camera 24/7? Nope. Would we? Nope. We don't even have a "security" guy in that respect. That's why we have cameras - to deter people from trying.

      Are we "in London"? You mean that 1,569 km Greater London? (Yes) Or the 28km "Central London"? Or the 2.9 km City of London?

      And if you haven't noticed, the cameras that are police cameras, with face rec, and central storage and processing, are so good that they actually do WORSE than just picking a guy out of a crowd and seeing if he has weed on him. Copper's instinct literally trumps anything to do with the cameras, when you look at the stats. Per actual, convictable charge after arrest they cost hundreds of thousands of pounds. They're useless.

      But you keep on. Because obviously I should be living in fear of a square mile of tourist attractions, where hardly anyone can afford to actually live, that has such expensive houses and businesses that they decide themselves to put up cameras, cameras that mostly do nothing more than record to a cheap NVR box in most instances.

      Despite the fact that the 8 million people who live in London don't go anywhere near that square mile, in which only 8000 (very rich!) people live and have a lot of cameras to protect their personal property and in which the police put up cameras to watch the tube stations, pubs, main vehicular routes etc.

      I've lived in or around London my entire life, and I can't say that I've ever noticed or cared about any significant camera presence in that time. Mostly because I'm someone who's fitted hundreds of them in my life and know they are mostly there "just in case" for private individuals.

      I'm gonna guess that you've never been to London for more than a tourist visit. Ask most Londoners if they fear the cameras and they'll say "What cameras?"

      • by ph0tik ( 669210 )
        I am so confused by this post, especially because it's easily refuted.

        The average Londoner is caught on camera 300 times a day: https://www.caughtoncamera.net... [caughtoncamera.net]

        With between 4 and 6 million total cameras around the country: https://www.bbc.com/news/uk-30... [bbc.com]
    • by AmiMoJo ( 196126 )

      It's important to understand this ruling. It doesn't say the police can't use facial recognition and they have already said they will continue to do so. It says that the police have too much power to put anyone they like on the list of faces to be recognized, and that the police didn't bother checking to see if the technology discriminates against race or gender.

      The police will probably commission some dubious studies to refute the existing ones showing that facial recognition is highly unreliable for black people.

      • by Nidi62 ( 1525137 )

        It's important to understand this ruling. It doesn't say the police can't use facial recognition and they have already said they will continue to do so. It says that the police have too much power to put anyone they like on the list of faces to be recognized, and that the police didn't bother checking to see if the technology discriminates against race or gender.

        Yep. There's nothing inherently wrong with using a watch list or a list of active warrants to match against facial recognition, as long as the person is not immediately arrested (you could have a cop approach them and make contact to confirm identity). The main issue is that you need approved, standardized processes and criteria for putting people on those watch lists, and those have to be followed. You can't just throw someone on a watch list because their name is Patrick O'Brien and they have red hair.
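        To make that concrete, here is a minimal sketch in Python (invented names, criteria, and threshold; not any real police system) of a watchlist where every entry needs an approved criterion and an audit trail, and a match only triggers a human identity check:

```python
# Hypothetical sketch of the process described above: watchlist entries
# carry a documented, approved justification, and a match only prompts
# an officer to confirm identity -- never an automatic arrest.
from dataclasses import dataclass

APPROVED_CRITERIA = {"open_warrant", "court_order", "missing_person"}

@dataclass
class WatchlistEntry:
    name: str
    criterion: str       # must come from APPROVED_CRITERIA
    authorised_by: str   # audit trail: who approved this entry

def add_to_watchlist(watchlist: list, entry: WatchlistEntry) -> None:
    if entry.criterion not in APPROVED_CRITERIA:
        raise ValueError(f"Not an approved criterion: {entry.criterion!r}")
    watchlist.append(entry)

def handle_match(entry: WatchlistEntry, confidence: float) -> str:
    # An above-threshold match is only a prompt for a human check.
    if confidence >= 0.9:
        return f"Officer: approach and confirm identity of {entry.name}"
    return "Below threshold: discard the alert, retain nothing"

wl: list = []
add_to_watchlist(wl, WatchlistEntry("J. Doe", "open_warrant", "Inspector X"))
print(handle_match(wl[0], confidence=0.95))
```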

    • So this ruling is from a country that says it violates the camera's human rights yet only cameras aren't human........

  • Clickbait headline (Score:4, Insightful)

    by misnohmer ( 1636461 ) on Tuesday August 11, 2020 @11:50PM (#60392465)

    The headline implies that use of facial recognition inherently violates human rights, while even the summary clearly states that it was the way that particular police force deployed facial recognition which, according to the court, needed more restrictions.

    Next headline, "Use of H2O violates human rights, court rules!", further down in the article, "...when waterboarding suspects".

    • Your analogy is deeply flawed. Water is a necessity of life. Wholesale degradation of a law-abiding citizen's right to be left alone is not.

  • by JaredOfEuropa ( 526365 ) on Wednesday August 12, 2020 @02:05AM (#60392729) Journal
    From the verdict [judiciary.uk]

    'The appeal succeeded on Ground 1, that the DC erred in concluding that SWP’s interference with Mr Bridges’s Article 8(1) rights was “in accordance with the law” for the purposes of Article 8(2): [...] there was no clear guidance on where AFR Locate could be used and who could be put on a watchlist. The Court held that this was too broad a discretion to afford to the police officers to meet the standard required by Article 8(2).'

    'The appeal failed on Ground 2, that the DC erred in determining that SWP’s use of AFR was a proportionate interference with Article 8 rights under Article 8(2). The Court held that the DC had correctly conducted a weighing exercise [...] The benefits were potentially great, and the impact on Mr Bridges was minor, and so the use of AFR was proportionate under Article 8(2).'

    'The appeal succeeded on Ground 3, that the DC was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The Court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient.'

    'The appeal failed on Ground 4, that the DC was wrong to not reach a conclusion as to whether SWP had in place an “appropriate policy document” within the meaning of section 42 DPA 2018.'

    'The appeal succeeded on Ground 5, that the DC was wrong to hold that SWP complied with the PSED. The Court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy will have a discriminatory potential impact. SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds. The Court did note, however, that there was no clear evidence that AFR Locate software was in fact biased on the grounds of race and/or sex.'

    So the appeal was (partly) granted mostly because of procedural stuff: the police hadn't done their homework; that's what Grounds 1, 3 and 5 are about (the plaintiff also lost on Ground 4). The plaintiff lost on Ground 2, the most crucial one, the one that actually is about human rights and whether or not facial recognition can be used by the police in general. The court states in clear terms that use of this technology is proportionate.
    • Thank you for the link to the verdict.
  • It is well known that any camera-based technology has difficulty identifying black people: their faces are darker, reflecting less light and offering less contrast, which makes it harder to recognise someone. (I'd be curious about Apple's FaceID, which works in a completely different way.)

    However, their face recognition managed to "recognise" more black faces. "Recognise" in quotes, and not actually recognise, because they had 90% false positives. There are two possibilities: 1. They have many more black people on their watchlists
    • Apple doesn't use the regular camera for Face ID. You can cover the front-facing camera with a privacy cover and the iPhone will still recognize your face just fine.
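    Back to the grandparent's false-positive puzzle: a 90%+ false-positive share among matches does not require a wildly inaccurate system. When genuine watchlist faces are rare in a large crowd, even a tiny per-scan error rate dominates the alerts. A minimal sketch with invented numbers:

```python
# Base-rate sketch: why most alerts can be false even if the system
# rarely errs on any single face. All numbers are invented.
crowd = 100_000        # faces scanned at a big event (assumed)
on_watchlist = 30      # wanted persons actually present (assumed)
hit_rate = 0.90        # chance a wanted face triggers an alert (assumed)
fp_per_scan = 0.003    # chance an innocent face triggers an alert (assumed)

true_alerts = on_watchlist * hit_rate
false_alerts = (crowd - on_watchlist) * fp_per_scan
fp_share = false_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are false: {fp_share:.0%}")  # ~92%
```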

  • That facial recognition scanner makes life so much easier for police. With all of the killings of minorities for no reason, should their lives be easier?
  • What about the social media platforms using facial recognition of your "friends" as an account access security measure?
