AI Government United States

FBI Agents Are Using Face Recognition Without Proper Training (wired.com) 32

An anonymous reader quotes a report from Wired: The U.S. Federal Bureau of Investigation (FBI) has done tens of thousands of face recognition searches using software from outside providers in recent years. Yet only 5 percent of the 200 agents with access to the technology have taken the bureau's three-day training course on how to use it, a report from the Government Accountability Office (GAO) this month reveals. The bureau has no policy for face recognition use in place to protect privacy, civil rights, or civil liberties. Lawmakers and others concerned about face recognition have said that adequate training on the technology and how to interpret its output is needed to reduce improper use or errors, although some experts say training can lull law enforcement and the public into thinking face recognition is low risk.

Since the false arrest of Robert Williams near Detroit in 2020, multiple instances have surfaced in the US of arrests after a face recognition model wrongly identified a person. Alonzo Sawyer, whose ordeal became known this spring, spent nine days in prison for a crime he didn't commit. The lack of face recognition training at the FBI came to light in a GAO report examining the protections in place when federal law enforcement uses the technology. The report was compiled at the request of seven Democratic members of Congress. Report author and GAO Homeland Security and Justice director Gretta Goodwin says, via email, that she found no evidence of false arrests due to use of face recognition by a federal law enforcement agency.

The GAO report focuses on face recognition tools made by commercial and nonprofit entities. That means it does not cover the FBI's in-house face recognition platform, which the GAO previously criticized for poor privacy protections. The US Department of Justice was ordered by the White House last year to develop best practices for using face recognition and report any policy changes that result. The outside face recognition tools used by the FBI and other federal law enforcement covered by the report come from providers including Clearview AI, which scraped billions of photos of faces from the internet to train its face recognition system, and Thorn, a nonprofit that combats sex trafficking by applying face recognition to identify victims and sex traffickers from online commercial sex market imagery.

The FBI ranks first among federal law enforcement agencies examined by the GAO for the scale of its use of face recognition. More than 60,000 searches were carried out by seven agencies between October 2019 and March 2022. Over half were made by FBI agents, about 15,000 using Clearview AI and 20,000 using Thorn.
"No existing law requires federal law enforcement personnel to take training before using face recognition or to follow particular standards when using face recognition in a criminal investigation," notes Wired.

"The DOJ plans to issue a department-wide civil rights and civil liberties policy for face recognition but has yet to set a date for planned implementation, according to the report. It says that DOJ officials, at one point in 2022, considered updating its policy to allow a face recognition match alone to justify applying for a search warrant."

Comments Filter:
  • I read somewhere there are only 267 different faces, and everyone's face is a minor tweak from one of those. I don't know if that includes all races, but even if not, it would be just a low multiple to cover the planet.

    So... AI might do better, since the tweaks would be measurable, but with humans confirming AI decisions?

    • I read somewhere there are only 267 different faces

      That sounds like a made-up factoid.

      A quick Google search brings up nothing relevant. An actual scientific finding that there were only N different faces would be big news and widely reported.

    • 267 is the number of flavors in Dr. Heinz's Homemade Ice Cream. The human face bank only has 188 options.

  • by Anonymous Coward

    Most LEO training consists of click-through PowerPoint presentations that barely address even the most superficial legal aspects, and it is mostly a CYA checkbox against possible lawsuits.

    As with most new powers/technologies allowed to law enforcement, this will be abused massively.

    The real question should be why the FBI needs these tools to conduct effective investigations, weighed against privacy concerns, which they really don't give a damn about anyway.

  • by awwshit ( 6214476 ) on Monday September 25, 2023 @11:08PM (#63877295)

    Stalking women everywhere no doubt.

  • by Required Snark ( 1702878 ) on Monday September 25, 2023 @11:42PM (#63877339)
    What they want is something that looks high-tech and objective and seems unimpeachable but is easy to manipulate to give whatever result they want.

    Want a real world example? Comparative bullet-lead analysis [wikipedia.org]

    Comparative bullet-lead analysis (CBLA), also known as compositional bullet-lead analysis, is a now discredited and abandoned forensic technique which used chemistry to link crime scene bullets to ones possessed by suspects on the theory that each batch of lead had a unique elemental makeup.

    The technique was first used after U.S. President John F. Kennedy's assassination in 1963. From the early 1980s through 2004 the US Federal Bureau of Investigation conducted about 2,500 analyses on cases submitted by law-enforcement groups. The results of these analyses had often been questioned by defense lawyers and the press, so the FBI finally asked the United States National Academy of Sciences' Board on Science, Technology, and Economic Policy to research the scientific merit of the process.

    ... The greatest caveat in the report was that the statistical tests as applied by the FBI could cause confusion and misinterpretation when transmitted to prosecutors or when explained to a trial jury. Because of the significance of this weakness, the report concluded that the analysis should be used with caution. This report helped the FBI decide in 2004 to voluntarily cease offering the analysis to law-enforcement entities.

    So they latched on to some scientific mumbo-jumbo which was never validated and then used it for over forty years. It was useful for winning cases, so why bother to check when it helped get convictions? That's great if the only criterion is winning and no one gives a rat's ass about truth, honesty, or justice. Which pretty much sums up the operations of the criminal legal system as it currently exists.

    It's incredibly obvious that justice is not a consideration.

    The U.S. government has fought releasing the list of the estimated 2,500 cases over three decades in which it performed the analysis, which may have led to false convictions. According to the FBI, only 20% of the 2,500 tests performed introduced the CBLA results into evidence at trial.

    Given the estimate of 20%, that means unreliable evidence in 500 trials, but who cares? Also note the weasel wording about "evidence at trial". That can easily exclude plea deals where the evidence was never presented to a jury. So the real damage has effectively been swept under the rug.

    • Texas may execute someone convicted using the discredited "Shaken Baby Syndrome" theory. It was only ever a theory and when subjected to proper rigorous analysis was shown to be impossible, but Texas still wants to execute someone convicted of causing death by shaking.

      https://www.theguardian.com/wo... [theguardian.com]

    • by Marful ( 861873 )
      Exactly like when they lied about DNA matching from hair, or bullet case matching (which TV shows and movies love to reference, but which in actuality is useless).

      The FBI's modus operandi of the last 30 years seems to be "Parallel Construction".
  • by bogaboga ( 793279 ) on Tuesday September 26, 2023 @12:00AM (#63877375)

    Since the false arrest of Robert Williams near Detroit in 2020, multiple instances have surfaced in the US of arrests after a face recognition model wrongly identified a person. Alonzo Sawyer, whose ordeal became known this spring, spent nine days in prison for a crime he didn't commit. The lack of face recognition training at the FBI came to light in a GAO report examining the protections in place when federal law enforcement uses the technology.

    [Bold/Italic mine]

    Can someone tell me what we in these US of A still do very well, and can be proud of?

  • by Wizardess ( 888790 ) on Tuesday September 26, 2023 @12:56AM (#63877453)

    We've a long history of using facial recognition. Human brains do this all the time, with somewhat questionable accuracy. People have been picked up, tried, and convicted based on facial recognition. Usually, it appears, we get it right. But a very significant percentage of all such events are sad mistakes.

    Within my lifetime the wanted poster with photographs of criminals appeared. And organic facial recognition leapt to the fore - and fell on its face again, perhaps a few percentage points less often.

    As best I can tell, digital facial recognition is significantly better than organic facial recognition, but it is also not nearly perfect. We keep stopping development. Is this because of the latent criminal within us that figures, "I might no longer be able to get away with it!"? Maybe that is why digital facial recognition creeps me out a bit. But when I think about it, the organic facial recognition REALLY creeps me out.

    No matter what kind of facial recognition is used, other substantive evidence must also exist before any conviction (or exoneration) can take place. Yeah, if you are picked up in California because organic facial recognition says you resemble somebody accused of killing somebody else, the digital facial recognition in Columbus, Ohio showing you were at a specific street corner at precisely the time the killing took place can clear you. Maybe we can achieve the huge leap of extracting this argument from the realm of emotions to the realm of logic, risks, and rewards. Or is that another thing that is "Probability Zero"?

    {^_^}

    • The problem with "AI-based" facial recognition is that because it is better than humans, humans begin to believe it's infallible. We start getting the arguments that "the computer said it," so "it must be right." No critical thinking happens and the humans simply do what the computer tells them. Fingerprint analysis has an error rate, but you can be certain that no one talks about that. DNA analysis has a similar problem. Error rates are ignored and the computer is "always right."
    • by Zak3056 ( 69287 )

      Within my lifetime the wanted poster with photographs of criminals appeared.

      While we do have some older people on /. (including at least one nonagenarian, though I haven't seen him post in a while so he may have passed away), unless you're some form of supernatural entity, I think the above claim is extremely unlikely. Photographs on wanted posters have been a thing since the mid-19th century [wikipedia.org].

    • If you're in a crowded place, unless you have a reason to remember someone's face, it will be out of your short-term memory in a matter of seconds. But even if you had total recall, you couldn't upload that to a commercial or government entity to be used with facial recognition. And you couldn't see someone in London on Tuesday and have them arrested in Paris two days later based on a false match.

      So I hope all the people who fall victim to AI have alibis like Brian Mayfield, a lawyer whose fingerprints

  • [joke]It figures that people working at the FBI would require training to recognise faces. Mind you, deskies in general make my arse grapes itch.[/joke]
  • 'The US Department of Justice was ordered by the White House last year to develop best practices for using face recognition'

    And nothing happened. Anyone surprised?

  • Arrest warrants being issued with only sparse evidence of criminal behavior is a persistent problem with police departments. People can be arrested on hearsay. One arrest can cause the loss of employment and stain someone's record permanently.

  • The suspect image and Williams aren't even close.

    • by Marful ( 861873 )
      That's the problem: cops are lazy.

      Do you know how often they can't be bothered to check the address they're raiding with a SWAT team? It's orders of magnitude more often than people think.
  • The tech is very, very dodgy. It's the 21st-century equivalent of a lie detector: something you use to establish probable cause and bully innocent people into confessions and jail time.

    Wear glasses? Have a beard? A tan? Brown skin in general? Congrats, you're now much more likely to be pulled into a police lineup!

    The systems are designed to bias toward finding a match rather than not, because if you're a company that sells this software you want to show how "effective" it is; a toy sketch of that trade-off follows below. And the police are more than happy to
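
    As a rough illustration of the parent's threshold point, here is a toy sketch (Python, with invented score distributions; nothing here models any real vendor's product). Lowering the match threshold makes the advertised hit rate climb modestly while the number of innocent people swept in grows much faster.

      import random

      random.seed(0)

      # Invented similarity scores purely for illustration: genuine pairs
      # (same person) tend to score higher than impostor pairs (different people).
      genuine = [random.gauss(0.75, 0.08) for _ in range(1_000)]
      impostor = [random.gauss(0.40, 0.10) for _ in range(1_000_000)]

      for threshold in (0.70, 0.65, 0.60):
          hit_rate = sum(s >= threshold for s in genuine) / len(genuine)
          fmr = sum(s >= threshold for s in impostor) / len(impostor)
          print(f"threshold={threshold:.2f}  hit rate={hit_rate:.0%}  "
                f"false matches per million comparisons={fmr * 1_000_000:.0f}")

    With these invented distributions the hit rate goes from roughly 73% to 97%, while false matches per million comparisons grow by more than an order of magnitude, which is the kind of tuning that looks great in a sales demo.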
  • If a database of school photos was used instead of scraped data, more faces would be covered, and exploited children could then be compared against school photos and saved. This would be a great thing for our exploited children.
