US Police Seldom Disclose Use of AI-Powered Facial Recognition, Investigation Finds (msn.com)
An anonymous reader shared this report from the Washington Post:
Hundreds of Americans have been arrested after being connected to a crime by facial recognition software, a Washington Post investigation has found, but many never know it because police seldom disclose their use of the controversial technology...
In fact, the records show that officers often obscured their reliance on the software in public-facing reports, saying that they identified suspects "through investigative means" or that a human source such as a witness or police officer made the initial identification... The Coral Springs Police Department in South Florida instructs officers not to reveal the use of facial recognition in written reports, according to operations deputy chief Ryan Gallagher. He said investigative techniques are exempt from Florida's public disclosure laws... The department would disclose the source of the investigative lead if it were asked in a criminal proceeding, Gallagher added....
Prosecutors are required to inform defendants about any information that would help prove their innocence, reduce their sentence or hurt the credibility of a witness testifying against them. When prosecutors fail to disclose such information — known as a "Brady violation" after the 1963 Supreme Court ruling that mandates it — the court can declare a mistrial, overturn a conviction or even sanction the prosecutor. No federal laws regulate facial recognition and courts do not agree whether AI identifications are subject to Brady rules. Some states and cities have begun mandating greater transparency around the technology, but even in these locations, the technology is either not being used that often or it's not being disclosed, according to interviews and public records requests...
Over the past four years, the Miami Police Department ran 2,500 facial recognition searches in investigations that led to at least 186 arrests and more than 50 convictions. Among the arrestees, just 1 in 16 were told about the technology's use — less than 7 percent — according to a review by The Post of public reports and interviews with some arrestees and their lawyers. The police department said that in some of those cases the technology was used for purposes other than identification, such as finding a suspect's social media feeds, but did not indicate in how many of the cases that happened. Carlos J. Martinez, the county's chief public defender, said he had no idea how many of his Miami clients were identified with facial recognition until The Post presented him with a list. "One of the basic tenets of our justice system is due process, is knowing what evidence there is against you and being able to challenge the evidence that's against you," Martinez said. "When that's kept from you, that is an all-powerful government that can trample all over us."
After reviewing The Post's findings, Miami police and local prosecutors announced plans to revise their policies to require clearer disclosure in every case involving facial recognition.
The article points out that Miami's Assistant Police Chief actually told a congressional panel on law enforcement AI use that his department is "the first to be completely transparent about" the use of facial recognition. (When confronted with the Washington Post's findings, he "acknowledged that officers may not have always informed local prosecutors [and] said the department would give prosecutors all information on the use of facial recognition, in past and future cases".)
He told the Post that the department would "begin training officers to always disclose the use of facial recognition in incident reports." But he also said they would "leave it up to prosecutors to decide what to disclose to defendants."
Re: (Score:3)
It's not exactly legal, you know.
Unfortunately, it's not exactly illegal, either.
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Could you elaborate on the "unfortunate" part? Why are you lamenting law enforcement's use of facial recognition?
As well as DNA and fingerprints, while you're at it...
Re: (Score:1)
Why are you lamenting law enforcement's use of facial recognition?
Not trustworthy. And the cops themselves aren't exactly trustworthy either. If they want to use this stuff, they have to be transparent about it. It is in our interest to keep them honest.
Re:Yeah, why would they? (Score:5, Insightful)
Not trustworthy.
AI recognition isn't admissible evidence. Prosecutors aren't going into court and claiming a defendant is guilty because the AI said they are.
But if there is a photo in a surveillance video, the AI can match it to an ID, and then the cops can look for evidence linking the identified person to the crime.
If the AI identifies five matches and four of them live in a different state while the fifth lives in town and is the ex-boyfriend of the murder victim, the cops can likely figure it out from there.
And the cops themselves aren't exactly trustworthy either.
Sure, but that's a different problem.
Re: (Score:2)
Not trustworthy.
AI recognition isn't admissible evidence. Prosecutors aren't going into court and claiming a defendant is guilty because the AI said they are.
But police arrest people based on facial recognition. The fact that it isn't reliable doesn't stop them.
https://time.com/6991818/wrong... [time.com]
https://www.nbcnews.com/news/u... [nbcnews.com]
If the AI identifies five matches and four of them live in a different state while the fifth lives in town and is the ex-boyfriend of the murder victim, the cops can likely figure it out from there.
And the cops themselves aren't exactly trustworthy either.
Sure, but that's a different problem.
Wait, what? You said that the fact that facial recognition isn't reliable doesn't matter because-- direct quote from you "the cops can likely figure it out." When the person you're replying to says "the cops aren't reliable," you say "that's a different problem."
No. You just said that facial recognition is checked by the cops. It's not.
colour me thurprithte (Score:2)
Re: (Score:1)
Huh? (Score:5, Insightful)
This disingenuous weaseling seriously undermines Armando Aguilar's credibility. He seems to be saying, "It's not the police's fault if the prosecutor fails to disclose" ... but the prosecutor can't disclose what the police never disclosed to them. Armando Aguilar knows very well that prosecutors are not supposed "to decide what to disclose to defendants", but it is clear that the culture in both police departments and prosecutors' offices has been to violate Brady on a daily basis.
Re: (Score:2)
This is the key: the facial recognition is not used as evidence. They cannot claim "you're him!" because an algorithm thinks you are. That's not what this is about.
Re:Huh? (Score:5, Informative)
Re: Huh? (Score:1)
As a reminder, the prosecution relies on one of two things in a criminal case to get a conviction - a unanimous jury decision OR a confession of guilt.
Can the wrong person be convicted? Sure, it happens, but not because Facial Recognition picked a name out of a hat and decided that that is the person of color to investigate/prosecute.
Re: Huh? (Score:1)
Short answer, never talk to the cops. They're not trying to straighten things out or understand your situation. They're looking to get a conviction, regardless of whether you're the one who committed the crime or whether there was an actual crime committed at all. Nothing you tell them will ever serve your best interests; it will only ever serve theirs.
Re: Huh? (Score:1)
The court has no mercy.
Re: (Score:2)
At that point, the fact that the initial tip by which the
System of Injustice (Score:2)
That is complete fantasy. A more realistic scenario is that facial identification is used to screen for people with prior arrests or other police contact. Maybe there are three or four people who fit that description. They check each one's alibi, and all but one have an alibi strong enough to make "reasonable doubt" likely. They now focus their efforts on gathering evidence against the remaining suspect. The alternative is to start over from scratch if they can't make a case against that last one standing. So th
Re: (Score:3)
Think about it this way: your average city spends 50 to 60% of its budget on police. If they're not arresting people, are you going to feel like you're getting your money's worth? When you're trading parks and schools for more cops, you're going to want to feel like that trade was worth it.
That's
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Think about it this way: your average city spends 50 to 60% of its budget on police.
I'm not sure what planet you are from, but my nearest city spends a majority of its budget on the school system (a transfer to the county). A quarter of the rest is "Public Safety", closely followed by "Health and Welfare" and "Nondepartmental (Fringe Benefits)". The "Public Safety" budget, which apparently includes the fire department, is 11% of the total budget.
New York City's FY2024 budget looks similar in terms of priorities: $547M for education, $322M for human resources and homeless services, $132M fo
You're confusing spending with budget (Score:2)
Re: (Score:2)
You specifically made an assertion about the amount that "your average city spends". External funding is by definition not what the city is spending.
Re: (Score:2)
Re: (Score:2)
But it may be used in order to obtain further evidence. You have a photo of a person, you don't know who they are. You run it through AI and it spits out a name.
Now you use that name to look up information about the person to s
Re: (Score:2)
They cannot claim "you're him!" because an algorithm thinks you are.
This is literally what they are doing at time of arrest.
Re: (Score:2)
It is the police department's concealment - that is, lying - which, while it may not be exculpatory prima facie, may be exculpatory from the standpoint of establishing the credibility (or rather, lack thereof) of the police department itself.
Think about it this way: if you knew that someone worked for an openly racist organization, would you be more or less likely to believe that person was a racist? If you knew that someone worked for an organization which instructed its members to lie on matters of se
Re: (Score:2)
The Brady decision did not concern itself with unconstitutional investigation methods, but with the withholding of evidence that could prove innocence. If the defendant bragged about the crime on social media and they found the bragging through AI, how does knowing that help the defendant prove his innocence?
This is just a different flavor of parallel construction and there should be rulings against that, but Brady isn't it.
Yes prosecutors do get to decide (Score:2)
So for example the AI facial recognition lets them finger a suspect and go on a fishing expedition until they find something to charge them with. With the way our drug policy works it's not at all hard to do that.
From there you just threaten with long long long prison sen
Re: (Score:2)
On the basis of facial recognition alone? (Score:1)
Can someone be wrongly convicted of a crime they didn't commit based solely on facial recognition?
I don't think so.
I imagine facial recognition gives them a starting point to build a case, but having the computer identify, say, a rapist by comparing video with the state drivers license photo database? I'm ok with that.
Defense can challenge the idea, present exonerating facts, etc - this isn't a reverse "Johnnie Cochran" ("if the face is recognized, you must convict")
Re:On the basis of facial recognition alone? (Score:4, Insightful)
Can someone be wrongly convicted of a crime they didn't commit based solely on facial recognition?
Of course they can. We just executed somebody who was found guilty based on a jailmate's testimony about a jailhouse confession and an ex-girlfriend's testimony, with no physical evidence connecting him to the crime. You can't confess to a jailmate until you are arrested. The jailmate got "consideration" in sentencing for his own crime in exchange for his testimony. There was even evidence that someone else committed the crime.
Once someone is identified as a possible suspect, there is always a chance they will be convicted unless they are obviously innocent, in which case the police and prosecutor move on to someone else who isn't obviously innocent. But if you don't have an ironclad alibi ...
Re: (Score:1)
It should not be used as any part of the reason the person was arrested, beyond pointing the investigation in their direction.
Do you want someone on the jury to hear that an AI system identified the person? How do you, as the defense, argue against that?
This is kinda fucked up (Score:5, Insightful)
Two lines in the summary really stood out.
He said investigative techniques are exempt from Florida's public disclosure laws...
Like "is that true?" and seems like, yes, it is. [flsenate.gov] I get that, you don't want to disclose all your methods with the criminals but it kinda clashes with the second one
acknowledged that officers may not have always informed local prosecutors
That seems somewhat malicious in intent. Prosecutors should definitely not be in the dark about these things. What kind of department lets officers use these tools without recording that it happened, or lets them disclose it only at their own discretion? Or is this guy just passing the buck onto some diffuse set of "officers"? Prosecutors should probably have access to records of these types of searches being performed.
If it isn't used as evidence... (Score:2)
does it have to be disclosed to the defendant?
Re: (Score:3)
And just last week... (Score:2)
There was a story about a couple of college kids who built an AR viewer that doxxed whoever you looked at... and I said, "so, the NSA, the FBI, FSB... all have it." I didn't think the US cops did...
The witness should be compelled to testify (Score:2)
And when the computer just sits there, refusing to answer questions, they should throw it in jail for contempt.
Wait til Alexa gets legal personhood status (Score:1)
It's coming - AI will be a witness in court to inform on you. AI already has been deputized for traffic cop and facial ID duties.