London Police's Face Recognition System Gets It Wrong 81 Percent of the Time (technologyreview.com) 55

An anonymous reader quotes a report from MIT Technology Review: London's police force has conducted 10 trials of face recognition since 2016, using Japanese company NEC's Neoface technology. It commissioned academics from the University of Essex to independently assess the scheme, and they concluded that the system is 81% inaccurate (in other words, the vast majority of people it flags for the police are not on a wanted list). They found that of 42 matches, only eight were confirmed to be correct, Sky News reports. The Met police insists its technology makes an error in only one in 1,000 instances, but it hasn't shared its methodology for arriving at that statistic.
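The two figures need not contradict each other: the 81% describes how many *flags* were wrong (precision), while the Met's 1-in-1,000 claim sounds like a rate over *all faces scanned* (false-positive rate). A quick sketch of the arithmetic, using the 42-flags/8-correct numbers from the report; the total number of faces scanned is an assumed figure for illustration only, since the Met has not published it:

```python
# Precision vs. false-positive rate for the face recognition trials.
flagged = 42          # people the system flagged across the trials (from the report)
correct = 8           # flags confirmed as genuine matches (from the report)
scanned = 100_000     # total faces scanned -- an ASSUMED figure, not from the report

false_alarms = flagged - correct               # 34 wrong flags
precision = correct / flagged                  # share of flags that were right
false_positive_rate = false_alarms / scanned   # wrong flags per face scanned

print(f"share of flags that were wrong: {1 - precision:.0%}")      # 81%
print(f"false alarms per face scanned: {false_positive_rate:.5f}")
```

Because far more innocent faces than wanted faces pass the cameras, even a tiny per-face error rate produces a flag list dominated by false alarms, so both an "81% wrong" and a "1 in 1,000" style statistic can describe the same system.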


Comments Filter:
  • by Kjella ( 173770 ) on Friday July 05, 2019 @05:17PM (#58879688) Homepage

    Dupe [slashdot.org]. Don't ever change /.

  • I, for one, am rooting against the widespread dissemination of this surveillance-rich technology. Continued reports of its inaccuracies should at least delay its inevitable ubiquitous deployment.

    • For a while, I thought this would encourage creative make-up or masks... but you stand a good chance of being incorrectly identified as, say, the Meat-Safe Murderer.

      ("I never doon it!")

  • Year after year, the TSA's own tests show it misses up to 95% of the fake bombs and other fake weapons used in testing.

    And yet, the claim is that this agency is helping to protect us.

  • apparently gets it wrong close to 100% of the time.

  • ... if the percentage of people it flags in the first place is very low. If, out of however many million faces scanned, the software flags only 100 people, of whom about 19 turn out to be genuine matches, then as long as very few people who actually are wanted go unflagged, the software may not be doing that bad a job.
