
Police Use of AI Facial Recognition Results In Murder Case Being Tossed (cleveland.com)

"A jury may never see the gun that authorities say was used to kill Blake Story last year," reports Cleveland.com.

"That's because Cleveland police used a facial recognition program — one that explicitly says its results are not admissible in court — to obtain a search warrant, according to court documents." The search turned up what police say is the murder weapon in the suspect's home. But a Cuyahoga County judge tossed that evidence after siding with defense attorneys who argued that the search warrant affidavit was misleading and relied on inadmissible evidence. If an appeals court upholds the judge's ruling to suppress the evidence, prosecutors acknowledge their case is likely lost...

The company that produced the facial recognition report, Clearview AI, has been used in hundreds of law enforcement investigations throughout Ohio and has faced lawsuits over privacy violations.

Not only does Cleveland lack a policy governing the use of artificial intelligence, Ohio lawmakers also have failed to set standards for how police use the tool to investigate crimes. "It's the wild, wild west in Ohio," said Gary Daniels, a lobbyist for the American Civil Liberties Union. The lack of state regulation of how law enforcement uses advanced technologies — no laws similarly govern the use of drones or license plate readers — means it is essentially up to agencies how they use the tools.

The affidavit for the search warrant was signed by a 28-year police force veteran, according to the article — but it didn't disclose the use of Clearview's technology.

Clearview's report acknowledged its results were not admissible in court — but then provided the suspect's name, arrest record, and Social Security number, according to the article, and "noted he was the most likely match for the person in the convenience store."

Thanks to tlhIngan (Slashdot reader #30,335) for sharing the news.


Comments Filter:
  • The so-called "inadmissibility" will vanish with the stroke of a black Sharpie marker, once the "right" "AI company" provides the software.

  • How is facial recognition any different than a partial license plate match search? Or is it because Clearview used a prior arrest record to bubble that particular suspect up to the top of the list? I think being able to use a prior conviction record to bubble a name up to the top of the list is OK. But maybe not arrests (not leading to conviction), or profiling based on legal or race/religion-based behavior. That's fine, but let's keep in mind non-facial-recognition techniques are worse ... where a detective sees

    • So, the detective seeing partial evidence and using prior experience to jump to conclusions is bad, but an AI doing it is fine. Right. Gotcha. Makes perfect sense.

    • Re:I don't get it (Score:5, Informative)

      by ihavesaxwithcollies ( 10441708 ) on Saturday February 01, 2025 @12:18PM (#65134965)

      I think being able to use a prior conviction record to bubble a name up to the top of the list is OK.

      The judge absolutely disagrees with you.

      McMonagle agreed with defense attorneys, saying the AI identification of Tolbert was akin to an anonymous informant, which is not enough to establish probable cause.

    • Re:I don't get it (Score:4, Insightful)

      by OngelooflijkHaribo ( 7706194 ) on Saturday February 01, 2025 @12:48PM (#65135011)

      It isn't, but admissibility in court, like all laws and legal principles, is based on ridiculously uneven standards and on respecting tradition. The simple reality is that new technology is held to a far higher standard of reliability than old, established customs, such as human facial recognition, which has been shown time and time again to be completely unreliable. Or consider that polygraphs are not admitted because of how unreliable they are, yet somehow a human layman can give testimony as to whether he thought someone acted like he was lying, which is surely no more reliable, and probably less.

      All laws are uneven nonsense when held to proper scrutiny, because they grew organically over time and lawmakers don't want to revisit old laws that only came into place when nothing better existed. Also, politicians are idiots, and political debate is like debating religious truth; one is quickly left with the feeling that “one cannot reason a man out of what he was never reasoned into to begin with.” Laws are not formed based on any rational analysis. And yet they influence every aspect of our lives and decide who is guilty and who is not, as in this case.

    • by znrt ( 2424692 )

      How is facial recognition any different than partial license plate match search?

      there is a qualitative difference: facial recognition exploits biometric data and is far more privacy-intrusive than license plate recognition. it is also much more complex and less understood/verified tech. for those reasons it raises far more concerns, and understandably legislators and courts are still reluctant to accept it.

      you are right that human recognition is similarly dangerous and prone to error and misuse, but we are sort of the gold standard for ourselves. also, there's the question of scale: mass

      • It's a sliding scale of security versus privacy. You can't have either one perfectly without lots of negatives. For instance, we could have a surveillance society, but that still wouldn't result in no crime. Alternatively, we could have privacy as our number one value, but bad actors would most certainly take advantage of this to profit.

        At the moment, it looks like we are trying to find the proper balance for identifying a suspect with new technology but we don't want to overly embrace this technology as it coul

    • License plate matching is extremely reliable because it's very simple to read a license plate. Facial recognition is extremely unreliable, with a tendency toward very high false-positive rates, especially for people with darker skin. (A quick back-of-the-envelope sketch of this base-rate effect appears after the comments below.)

      So it very quickly becomes like drug-sniffing dogs, where it's used not to actually find crime but to come up with an excuse to get a search warrant where there isn't enough evidence. And that quickly gets abused.

      It's basically modern phrenology at that point. Just more br
    • by Sique ( 173459 )
      Because in this case, facial recognition was not used. On the CCTV which recorded the robbery, the face of the perpetrator was not visible. Clearview provided eight possible matches of gesture and gait, two of them being identified as Mr. Tolbert. In the search warrant, it was not disclosed that there were other matches, and apparently, the police did not follow the other leads at all.
  • by Bruce66423 ( 1678196 ) on Saturday February 01, 2025 @12:29PM (#65134977)

    The painful reality is that a murderer is likely to walk free because a police officer forgot that the system's evidence couldn't be used in court. This is bad. The alternative approach is, as I understand it, the one seen in the UK: if an invalid search produces significant evidence, it can still be used in court, but if it doesn't, the officer responsible gets a reprimand and the victim of the unlawful search gets a large payout. Note that the "large" needs to be LARGE...

    • The logic the U.S.A. and many other countries maintain is that if this were possible, police officers would resort to illegal searches if they were willing to pay the price for it personally, or they would perhaps shift blame and be protected.

      Every time evidence is suppressed for a clearly guilty person it seems like a very bad idea, but consider all the times illegal searches were done against someone who wasn't guilty, whose privacy was unduly violated.

      That having been said, the U.S.A. accepts many doctrines as

      • Wow. That list of European characteristics was a real eye-opener for me, as a US citizen. To my eye, those things will work just fine, as long as society is peaceful and functional, and everyone behaves, ESPECIALLY the judges.

        The US has no such trust that individuals will act responsibly. Thus, a great many things in our system are basically insurance policies against tyranny. We might never have to activate that insurance policy, but we pay the premiums anyway.

        Most European systems don’t allo
        • Wow. That list of European characteristics was a real eye-opener for me, as a US citizen. To my eye, those things will work just fine, as long as society is peaceful and functional, and everyone behaves, ESPECIALLY the judges.

          Do they work fine? I feel the U.S.A. has immense class-based disparities in justice, guilt, and sentencing. I wouldn't want to be a poor black male accused of a crime he didn't commit in the U.S.A., to be honest.

          The US has no such trust that individuals will act responsibly. Thus, a great many things in our system are basically insurance policies against tyranny. We might never have to activate that insurance policy, but we pay the premiums anyway.

          That is your perspective, that these stop tyranny. From where I stand, they cause it. Judicial review has led to a lot of tyranny of the minority in the U.S.A., where a small group just decided things were as they were despite the majority being against it, ostensibly because “the constitution said it
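
A comment above contrasts the reliability of plate matching with facial recognition's high false-positive rates. That contrast is largely a base-rate effect: a one-to-many face search against a huge gallery multiplies even a tiny per-comparison error rate into a long list of innocent candidates. Here is a minimal back-of-the-envelope sketch in Python; the gallery size and false match rate are purely illustrative assumptions, not figures from the article or from Clearview.

    # Back-of-the-envelope: why a one-to-many face search yields weak leads
    # even when each individual comparison looks accurate. All numbers are
    # illustrative assumptions, not figures from the article or Clearview.

    def expected_false_candidates(gallery_size: int, false_match_rate: float) -> float:
        """Expected number of wrong people returned when one probe image is
        compared against every image in a gallery."""
        return gallery_size * false_match_rate

    gallery = 30_000_000   # hypothetical scraped-face gallery size
    fmr = 1e-5             # hypothetical 0.001% false-match rate per comparison

    print(expected_false_candidates(gallery, fmr))  # 300.0 -> hundreds of innocent "hits"

    # A full or partial plate search, by contrast, is a constrained lookup against
    # the vehicle registry and typically narrows to a handful of records, so a
    # single "match" carries far more evidentiary weight.

That is essentially the judge's point: a candidate surfaced this way is a lead to be corroborated, akin to an anonymous tip, not probable cause on its own.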
