Police Use of AI Facial Recognition Results In Murder Case Being Tossed (cleveland.com)
"A jury may never see the gun that authorities say was used to kill Blake Story last year," reports Cleveland.com.
"That's because Cleveland police used a facial recognition program — one that explicitly says its results are not admissible in court — to obtain a search warrant, according to court documents." The search turned up what police say is the murder weapon in the suspect's home. But a Cuyahoga County judge tossed that evidence after siding with defense attorneys who argued that the search warrant affidavit was misleading and relied on inadmissible evidence. If an appeals court upholds the judge's ruling to suppress the evidence, prosecutors acknowledge their case is likely lost...
The company that produced the facial recognition report, Clearview AI, has been used in hundreds of law enforcement investigations throughout Ohio and has faced lawsuits over privacy violations.
Not only does Cleveland lack a policy governing the use of artificial intelligence, Ohio lawmakers also have failed to set standards for how police use the tool to investigate crimes. "It's the wild, wild west in Ohio," said Gary Daniels, a lobbyist for the American Civil Liberties Union. The lack of state regulation of how law enforcement uses advanced technologies — no laws similarly govern the use of drones or license plate readers — means it is essentially up to agencies how they use the tools.
The affidavit for the search warrant was signed by a 28-year police force veteran, according to the article — but it didn't disclose the use of Clearview's technology.
Clearview's report acknowledged that its results were not admissible in court — but then provided the suspect's name, arrest record, and Social Security number, according to the article, and "noted he was the most likely match for the person in the convenience store."
Thanks to tlhIngan (Slashdot reader #30,335) for sharing the news.
"That's because Cleveland police used a facial recognition program — one that explicitly says its results are not admissible in court — to obtain a search warrant, according to court documents." The search turned up what police say is the murder weapon in the suspect's home. But a Cuyahoga County judge tossed that evidence after siding with defense attorneys who argued that the search warrant affidavit was misleading and relied on inadmissible evidence. If an appeals court upholds the judge's ruling to suppress the evidence, prosecutors acknowledge their case is likely lost...
The company that produced the facial recognition report, Clearview AI, has been used in hundreds of law enforcement investigations throughout Ohio and has faced lawsuits over privacy violations.
Not only does Cleveland lack a policy governing the use of artificial intelligence, Ohio lawmakers also have failed to set standards for how police use the tool to investigate crimes. "It's the wild, wild west in Ohio," said Gary Daniels, a lobbyist for the American Civil Liberties Union. The lack of state regulation of how law enforcement uses advanced technologies — no laws similarly govern the use of drones or license plate readers — means it is essentially up to agencies how they use the tools.
The affidavit for the search warrant was signed by a 28-year police force veteran, according to the article — but it didn't disclose the use of Clearview's technology.
Clearview's report acknowledged their results were not admissible in court — but then provided the suspect's name, arrest record, Social Security number, according to the article, and "noted he was the most likely match for the person in the convenience store."
Thanks to tlhIngan (Slashdot reader #30,335) for sharing the news.
Only a temporary problem (Score:1, Troll)
The so-called "inadmissibility" will vanish at the stroke of a black Sharpie marker, once the "right" "AI company" provides the software.
Re: (Score:1)
Helped by an unelected immigrant no less.
Re:Only a temporary problem (Score:4, Informative)
Re: (Score:2)
Generally, there is no requirement that evidence for a search warrant be admissible in court.
For instance, hearsay is inadmissible in court but used all the time to obtain a warrant.
I don't get it (Score:2)
How is facial recognition any different than a partial license plate match search? Or is it because Clearview used a prior arrest record to bubble up that particular suspect to the top of the list? I think being able to use a prior conviction record to bubble a name up to the top of the list is OK. But maybe not arrests (not leading to conviction) or profiling based on legal or race/religion-based behavior. That's fine, but let's keep in mind non-facial recognition techniques are worse ... where a detective sees
Re: (Score:2)
So, the detective seeing partial evidence and using prior experience to jump to conclusions is bad, but an AI doing it is fine. Right. Gotcha. Makes perfect sense.
Re:I don't get it (Score:5, Informative)
I think being able to use a prior conviction record to bubble a name up to the top of the list is OK.
The judge absolutely disagrees with you.
McMonagle agreed with defense attorneys, saying the AI identification of Tolbert was akin to an anonymous informant, which is not enough to establish probable cause.
Re:I don't get it (Score:4, Insightful)
It isn't, but admissibility in court, like all laws and legal principles, is based on ridiculously uneven standards and deference to tradition. The simple reality is that new technology is held to a far higher standard of reliability than old, established customs, such as human facial recognition, which has been shown time and time again to be completely unreliable. Or consider that polygraphs are not admitted because of how unreliable they are, yet a human layman can give testimony as to whether he thought someone acted like he was lying or not, which is surely no more reliable, and probably less.
All laws are uneven nonsense when held up to proper scrutiny, because they grew organically over time and lawmakers don't want to revisit old laws that only came into place when nothing better existed. Also, politicians are idiots, and political debate is like debating religious truth: one is quickly left with the feeling that “one cannot reason a man out of what he was never reasoned into to begin with.” Laws are not formed based on any rational analysis. And yet they influence every aspect of our lives, and decide who is guilty and who is not, as in this case.
Re: (Score:2)
How is facial recognition any different than partial license plate match search?
there is a qualitative difference: facial recognition exploits biometric data and is far more privacy-intrusive than license plate recognition. it is also much more complex and less understood/verified tech. for those reasons it raises far more concerns, and understandably legislators and courts are still reluctant to accept it.
you are right that human recognition is similarly dangerous and prone to error and misuse, but we are sort of the gold standard for ourselves. also, there's the question of scale: mass
Because it doesn't work (Score:2)
So it very quickly becomes like drug-sniffing dogs, where it's used not to actually find crime but to come up with an excuse to get a search warrant where there isn't enough evidence. And that quickly gets abused.
It's basically modern phrenology at that point. Just more br
The facial recognition system got it right (Score:3)
The painful reality is that a murderer is likely to walk free because a police officer forgot that the system's evidence couldn't be used in court. This is bad. The alternative approach, as I understand it, is seen in the UK, where if an invalid search produces significant evidence it can still be used in court, but if it doesn't, the officer responsible gets a reprimand and the victim of the unlawful search gets a large payout. Note that "large" needs to be LARGE...
Re: (Score:2)
The logic the U.S.A. and many other countries maintain is that if this were possible, police officers would resort to illegal searches if they were willing to pay the price for it personally, or they would perhaps shift blame and be protected.
Every time evidence is suppressed for a clearly guilty person it seems like a very bad idea, but consider all the times illegal searches were conducted against someone who wasn't guilty, whose privacy was unduly violated.
That having been said, the U.S.A. accepts many doctrines as
Re: (Score:2)
The US has no such trust that individuals will act responsibly. Thus, a great many things in our system are basically insurance policies against tyranny. We might never have to activate that insurance policy, but we pay the premiums anyway.
Most European systems don’t allo