Detroit Man Sues Police For Wrongfully Arresting Him Based On Facial Recognition
A man who was falsely accused of shoplifting has sued the Detroit Police Department for arresting him based on an incorrect facial recognition match. The American Civil Liberties Union filed suit on behalf of Robert Williams, whom it calls the first US person wrongfully arrested based on facial recognition. The Verge reports: The Detroit Police Department arrested Williams in 2019 after examining security footage from a shoplifting incident. A detective used facial recognition technology on a grainy image from the video, and the system flagged Williams as a potential match based on a driver's license photo. But as the lawsuit notes, facial recognition is frequently inaccurate, particularly with Black subjects and a low-quality picture. The department then produced a photo lineup that included Williams' picture, showed it to a security guard who hadn't actually witnessed the shoplifting incident, and obtained a warrant when that guard picked him from the lineup.
Williams -- who had been driving home from work during the incident -- spent 30 hours in a detention center. The ACLU later filed a formal complaint on his behalf, and the prosecutor's office apologized, saying he could have the case expunged from his records. The ACLU claims Detroit police used facial recognition under circumstances that they should have known would produce unreliable results, then dishonestly failed to mention the system's shortcomings -- including a "woefully substandard" image and the known racial bias of recognition systems.
I'm from the "Don't feed the sharks" school (Score:5, Insightful)
Lawsuits seem about the only way to arrive at a sane analysis for balancing liberty with minimizing crime, and then feed that back into legislation.
Maybe some want to live in a panopticon, whereas others prefer a more traditional feel. One hopes that we can let people sort themselves by taste.
Re:I'm from the "Don't feed the sharks" school (Score:4, Insightful)
In some cases it can be caught early and corrected and maybe all that's needed is an apology. In other cases, the only way to really fix the problem is for someone to have to pay.
Re: (Score:2)
Wrongful arrest is wrongful arrest no matter whether it was due to some fault in a computer system, a jackass abusing their power, or even just an innocent mistake in some paperwork.
Yep, at least he wasn't thrown in federal prison for 20 years.
(...to share a designed-for-3-inmates cell with 5 large gangbangers).
Re: (Score:3)
I was just thinking that this is so much like some software development. I.e., I don't know what the root cause of the bug is, but I found something suspicious, I changed the code, and checked it in. I didn't test it myself, because that's time consuming and it gets in the way of the fast response I make so that the boss thinks I'm not a slacker. I can't say I don't know what the bug's cause was, because the boss will think I'm incompetent, and the customer won't like that as an answer. Upper management is happ
Re: (Score:1)
Like any career criminal, they never change unless met with greater force.
Re: (Score:2)
Re: (Score:3)
"Tough on crime" as a slogan gets you re-elected much easier than "respect for the people". Even worse as a campaign slogan would be "respect for people who may not even look like you or live in your neighborhood."
Re:biological facial recognition (Score:5, Insightful)
"obtained a warrant when that guard who never saw him picked him from the lineup"
If you're going to quote, don't leave out the important part.
Re:biological facial recognition (Score:4, Insightful)
showed it to a security guard who hadn't actually witnessed the shoplifting incident
Note that it does not say that the guard "never saw him". It merely says the guard didn't witness the shoplifting incident.
Re:biological facial recognition (Score:5, Informative)
Defendant Bussa arranged a photo lineup with a security contractor who was not even present in the store on the day of the crime, and who had only watched the same grainy surveillance video that was already in Detective Bussa’s possession. When that person picked Mr. Williams from the lineup . . . the warrant request was faulty and misleading because Defendant Bussa hid the fact that the person who picked Mr. Williams out of the lineup had never actually seen the shoplifter in person.
https://www.aclumich.org/en/pr... [aclumich.org]
Re: (Score:2)
Re: (Score:2)
2) The guard wasn't a witness to the crime, so why does his testimony matter?
You can think if you try but you gotta try.
Right or wrong, likely to fail (Score:2)
"Qualified immunity" (q.v.) means they can't be held to account unless there's a court precedent with near-identical facts. I'd be surprised if face recognition technology has been around long enough for that to have happened already.
Re: (Score:3)
Qualified immunity protects the individuals. It does not protect the Police Department itself.
Re: (Score:3)
Re: Right or wrong, likely to fail (Score:3)
> the city ... has no control over the police
Oh really?
Re: (Score:2)
It's the continuing narrative that places that have been blue for decades (or half a century plus) must cope with evil, racist police with no ability to do anything about it.
Ironically, the heavily blue areas blame police unions for this while also heavily promoting unions.
Even more ironically, people keep putting the same do-nothing "leaders" back in office locally and keep doing their level best to export the failed policies as far and wide as possible.
Finally, when called out on this insanity, they simpl
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened. There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past which is problematic if in the past, the
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened.
Far from refuting what I said, you're actually reinforcing my point about putting do-nothing leaders with failed policies back in office. Throwing up your hands saying, "my subordinates will not do what I tell them to, I guess I tried" means either you need new subordinates, or you don't need to be doing the job.
There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if, in the past, the police got almost no punishment for, e.g., excessive force.
So stop accepting said arbitration clauses in the contract?
I realize negotiations aren't easy, and police unions really DO have a lot of power, but the "we can't do anything" mindset just ensures th
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened.
Far from refuting what I said, you're actually reinforcing my point about putting do-nothing leaders with failed policies back in office. Throwing up your hands saying, "my subordinates will not do what I tell them to, I guess I tried" means either you need new subordinates, or you don't need to be doing the job.
The point is that they don't view themselves as her subordinates. Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if, in the past, the police got almost no punishment for, e.g., excessive force.
So stop accepting said arbitration clauses in the contract?
I realize negotiations aren't easy, and police unions really DO have a lot of power, but the "we can't do anything" mindset just ensures the cycle will continue. At some point, the politicians running the cities are going to have to choose between the police union and the people they ostensibly serve. Break them like Reagan broke the air traffic controllers. Of course, that would be politically disadvantageous for those politicians, wouldn't it?
The police have a fun set of pressure tactics that can make a politician's life miserable when they don't get what they want. They are also really good at spinning the arbitration clauses as a good thing. But I agree with you, in s
Re: (Score:2)
The point is that they don't view themselves as her subordinates.
Their view of the subject doesn't really matter, does it? "My boss is an idiot, fuck that guy" doesn't tend to do very well when the boss is willing to do something about that attitude.
Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
If only they had access to some resources like subpoenas or state/federal authorities who could simply seize the information being withheld. The Chicago police department may think themselves to be above the law, but they only are as long as they are allowed to be. I'm sorry you can't see that the politicians who claim to b
Re: (Score:2)
Re: (Score:2)
Don't know about the locality where this happened, but Baltimore, Maryland, USA, hasn't controlled the Baltimore City PD since 1860. It's actually a state agency.
Re: (Score:2)
Re: (Score:1)
And at some point, the city's budget for the police department *does* run out and the police pensions get a severe reduction when the city declares bankruptcy and the federal pension backup system takes over the pensions.
Re: (Score:2)
Re: (Score:1)
I have very mixed emotions about that.
We need to do something about it... but we also need a police force. The police force as a career could easily reach a point where no qualified candidates wanted to work there.
I don't have a good answer.
Qualified immunity is bad. But so is losing your pension to a false accusation after 11 years work.
Re: (Score:2)
Qualified immunity is protection from criminal/civil action. It doesn't mean you can't be fired or fined, employment contracts can't be revised, laws can't be changed etc.
Re: (Score:2)
Re: (Score:2)
No, but the Police Associations and contracts written in the "Law and Order" obsessed 1990s deal with the whole issue of firing police who perform what, to everyone else, is obvious malpractice on the job.
Let me preface this by saying: I agree in principle with your general point. Our police are militarized. Soldiers make (and always have made) awful police officers, because the primary mission of a soldier is "kill people and break things," which is decidedly at odds with "serve and protect."
That said, it's disingenuous to blame "contracts written in the 90s" for today's ills, when it's the mindset that began in the 90s (actually, it's probably the 70s) that is the problem. No labor contract lasts for thre
Re: (Score:2)
Re: (Score:1)
Qualified immunity doesn't protect them from being punished by citizens in various ways. All legal. Banning them from stores, following them around, recording them while off duty, freezing them out / ostracizing them from society, etc. If citizens really cared this is the only way to go to punish cops that hide behind qualified immunity when for instance shooting unarmed people during a welfare check.
The cops can quit their job but that punishment will still haunt them for life. Not even moving could preve
Re: (Score:2)
If a police officer tells a security guard to pick someone out of a lineup, _knowing_ that the security guard hasn't witnessed any crime, and that the security guard doesn't have any information that the police officer doesn't have, then this is either plainly incompetent, or an intentional attempt to get a warrant when getting a warrant wasn't justified, which is
could have the case expunged = have to pay for it? (Score:2)
could have the case expunged = have to pay for it?
At very least the cops need to cover all costs and all legal fees
Re: (Score:3)
Re: (Score:2)
Re: Stop this B.S. already (Score:2)
It's because mediafolk with mediocre education and basic math and science training are now AI experts.
Now they can use words without understanding what they mean, as long as they get the desired reaction.
What happened to the guy is messed up, but this whole AI "race bias" thing is stupid and shows serious ignorance about how statistical learning works.
Re: (Score:2)
The point is that the person who trained the system or who selected the training data had biases. Maybe they selected a training set that had mostly white faces but didn't notice because it looked "normal" to them, causing the system to make errors more often when dealing with black faces. The system then recapitulates the biases of the people who worked on it. It's really not that hard to understand.
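The parent's point can be sketched as a toy simulation. Everything below (the score distributions, their means, the threshold) is an illustrative assumption chosen for the sketch, not data from any real matcher:

```python
# Toy sketch of how training-data imbalance can surface as a
# group-dependent error rate. Assumption: a matcher trained mostly
# on one group produces less discriminative similarity scores for
# the under-represented group, so impostor pairs (photos of two
# *different* people) from that group score higher on average.
import random

rng = random.Random(42)

def impostor_scores(n, mean):
    # similarity scores for pairs of different people
    return [rng.gauss(mean, 0.1) for _ in range(n)]

scores_a = impostor_scores(10_000, mean=0.30)  # well-represented group
scores_b = impostor_scores(10_000, mean=0.45)  # under-represented group (assumed shift)

threshold = 0.60  # cutoff tuned so group A's false-match rate looks acceptable

def false_match_rate(scores):
    return sum(s >= threshold for s in scores) / len(scores)

print(false_match_rate(scores_a))  # roughly 0.001 (threshold is ~3 sigma out)
print(false_match_rate(scores_b))  # roughly 0.07 (threshold is only ~1.5 sigma out)
```

Same threshold, very different false-match rates: that systematic, group-dependent difference is what the "bias" claim refers to, regardless of whether any individual error looks random.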
Re: (Score:2)
The point is that the person who trained the system or who selected the training data had biases
The bias of a user (which may very well be biased) has little to do with errors of a system.
causing the system to make errors more often when dealing with black faces
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
Re: (Score:2)
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
A systematic error over a particular range in a random sample of values is pretty much the textbook definition of bias.
Re: (Score:2)
Re: (Score:2)
The bias of a user (which may very well be biased) has little to do with errors of a system.
How the errors get into the system is irrelevant to whether the system is biased.
That's very much like some instruments having a different random error in different points of their measurement range, which is *still* not a bias.
Except, in this case, the randomness differs based upon race. The system isn't doing anything wrong by not being as accurate with black people. But, the system does not handle different races equally. That is all that matters in this context.
Re: (Score:2)
How the errors get into the system is irrelevant to whether the system is biased.
Perhaps true, but irrelevant here.
Except, in this case, the randomness differs based upon race.
But that's not bias. Bias is a systematic error, not a varying random error.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Bias through Poor Training (Score:2)
Poor image quality is a legitimate complaint. But there is no "racial bias" in computer software.
Yes, there is if you do not properly train the algorithm. If you train it with predominantly white faces it will be objectively poorer at matching black people's faces making it literally racially biased. That bias clearly is not the fault of the algorithm but the person who trained it. Nevertheless, the result is software that is less likely to accurately match certain races making it racially biased.
Re: (Score:2)
Yeah, further: I was reading about a machine learning session which was trying to teach the machine what a Party is. So images of classrooms, parties, meetings, etc., but the researchers forgot to include any black people in the Party images. So the machine concluded that if there are any black people in the picture it can't be a Party. That was an unintentional oversight on the part of the developers that translated into a racial bias in the algorithm.
Police Always Used Facial Recognition by Humans (Score:5, Insightful)
The police have always used facial recognition by humans, both by the cops and the general public. This facial recognition is also unreliable.
So in all cases, either by computer program or by humans, you need corroborating evidence.
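The need for corroboration can be put in Bayesian terms. A quick sketch, where every number is an assumption chosen only for illustration:

```python
# Bayes' rule: probability of guilt given a facial-recognition "hit".
# Assumed numbers: one culprit among ~1,000,000 plausible people,
# a matcher that flags the true culprit 99% of the time, and a
# false-match rate of 1 in 20,000 per innocent person.
prior = 1 / 1_000_000
p_match_given_guilty = 0.99
p_match_given_innocent = 1 / 20_000

posterior = (p_match_given_guilty * prior) / (
    p_match_given_guilty * prior + p_match_given_innocent * (1 - prior))

print(round(posterior, 3))  # 0.019: ~98% chance the flagged person is innocent
```

Even under these generous assumptions, a lone hit leaves the flagged person overwhelmingly likely to be innocent, which is exactly why corroborating evidence is needed before an arrest.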
Re: (Score:3)
Yeah, this is pretty dumb. It has nothing to do with facial recognition, it's entirely reasonable to use facial recognition (or any other unreliable witness statement) as a guide to choose a possible suspect. It's what happens after that which was the issue here, apparently.
It's the policing version of those "(generic, obvious thing)...on the internet!" patents we used to joke about. Bad police work... after facial recognition!
Re: (Score:1)
>> it's entirely reasonable to use as a guide
Sure, if you're from a species that hasn't proven it has a fucking massive hardon for "the computer said".
The consequences range from mild inconvenience to fatality, because the scenario is universal, not unlike situations with "an accident", a word with a similarly wide range of impact.
The difference is one of the incompetences above is avoidable. Not by futilely trying to improve or educate the idiot human operator and his immutable human nature, of cours
Re: Police Always Used Facial Recognition by Human (Score:3)
Part of the problem with digital vs human facial recognition is the set of faces being compared against. In this case, he was matched to his drivers license photo. Will a human really go through the database of all drivers license photos in the entire state and claim to pick a match? Of course not. There are aspects of this that are unique to the scale, methodologies, and obtuse nature of technological systems.
Re: Police Always Used Facial Recognition by Human (Score:5, Informative)
There was a case where a young woman was arrested for shoplifting and hitting a security guard, was identified by two security guards, got convicted and was extremely lucky that the case, with her photo was reported in the newspaper: Because the real culprit saw it, came forward, and she was set free. They showed photos of both side by side, and they were identical. Closer than identical twins usually are.
With that statistic, if you took a high quality photo of any bank robbery in the USA and compared it to 200 million license photos, you could make 200 arrests immediately. With at least 199 people being innocent, so at that point the similarity alone cannot be enough.
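The parent's back-of-the-envelope scales like this. The per-comparison false-match rate below is an assumed figure, picked only to reproduce the "200 arrests" example:

```python
# False hits from a one-to-many search: even a very good matcher
# with a 1-in-a-million false-match rate per comparison produces
# hundreds of hits when run against a huge photo gallery.
gallery_size = 200_000_000       # license photos searched (from the example)
false_match_rate = 1e-6          # per one-to-one comparison (assumption)

expected_false_hits = gallery_size * false_match_rate
print(expected_false_hits)       # 200.0

# Even if the real culprit is also in the gallery, the chance that
# any one hit is actually them:
precision = 1 / (1 + expected_false_hits)
print(round(precision, 4))       # 0.005
```

So the similarity score alone, applied at population scale, can never be enough on its own; the search only produces candidates, not suspects.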
Re:Police Always Used Facial Recognition by Humans (Score:5, Insightful)
Cops keep making this same mistake over and over and over again.
Some new tech comes along. Fingerprints, DNA, IP addresses, ANPR, cell phone tracking, facial recognition. They buy into the hype that it's infallible and does their job for them, and start arresting people. Sometimes those people even get convicted by clueless juries, influenced by media hype and CSI.
Aggressive Marketing (Score:2)
They buy into the hype that it's infallible and does their job for them
It's the same ploy drug companies pull on doctors to sell their new wonder drug that they hav
Re: (Score:2, Insightful)
The police have always used facial recognition by humans, both by the cops and the general public. This facial recognition is also unreliable.
So in all cases, either by computer program or by humans, you need corroborating evidence.
By corroborating evidence, you mean the business that has a $4000 8K HDTV for the front lobby visitors, while continuing to budget for My-First Security cameras made by Hasbro? 'Cause you know..priorities.
Re:Police Always Used Facial Recognition by Humans (Score:4, Insightful)
But in this case it was particularly egregious.
Basically the computer looked at the image and spit out a name. The police then issued a warrant for that person, and arrested him.
In interrogation, the guy looks at the surveillance photo, holds it up to his face and asks the interrogator if they look alike. The only similarity was "they were both black".
In fact, the police compared that image against the stored image of the guy and in a few seconds, you could tell the computer match was false.
In other words, someone could've spent 30 seconds comparing the image in the surveillance video with the guy's image, and basically tell they were different.
The computer spit out a name, and no one took a few seconds to see if the two people were even remotely similar to each other. It was a shoplifting case. I'm sure you could afford a few seconds.
Re: (Score:1)
The computer identified him.
Then they set up a block of pictures that included him.
From the block of pictures, he was identified by one of the store's security people.
At that point, they got a warrant and he was arrested.
Aside from the fact that you have misunderstood what actually happened: if you do compare the two pictures, there is a lot of similarity between them.
Re: (Score:2)
The problem is also one of scale. Human eyeballs possessed by police are in limited supply and expensive. You can't create a panopticon using police eyeballs without obviously creating a police state. People would resist that.
Cameras hooked up to facial recognition systems are much more abundant and cheaper. You can have your total surveillance of the population without all the problematic visuals of secret police tailing people everywhere and at much less expense.
Even without malintent, having ubiquito
Hope he wins... (Score:5, Insightful)
Any IT job that you interview for, they ask you about your *arrest* record. Not convictions. Mainly because HR feels that one can buy an acquittal, but if a cop thinks you are guilty enough to pull out the handcuffs, you are guilty.
An -arrest- for shoplifting can fuck up someone's future with regards to jobs completely. Even if expunged, all the third party databases will have this entry in it forevermore, as well as the mug shots, which employers will find and ensure the candidate does not get hired on, under the "where there is smoke, there is fire" pretense.
Re: (Score:2)
I served on a jury trial and we voted to acquit. During deliberations, one guy absolutely refused to do this despite the lack of evidence and a poor case by the prosecutor. He came into the trial assuming the defendant must be guilty, because he was arrested, and would not change his mind. He eventually talked to the judge; I don't know what he said, but he was removed from the jury and an alternate juror was added.
Re: (Score:2)
whoosh
Wat? (Score:3)
The department then produced a photo lineup that included Williams' picture, showed it to a security guard who hadn't actually witnessed the shoplifting incident, and obtained a warrant when that guard picked him from the lineup.
"Will you please pick the man you didn't see commit the crime from these pictures."
Re: (Score:2)
Working as designed (Score:2)
Conflated issues (Score:2)
failed to mention the system's shortcomings -- including [...] known racial bias of recognition systems
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
FR algos are not perfect, and people's application and interpretation of it is not perfect. In order to improve either issue, they should probably not be conflated.
Re:Conflated issues (Score:4, Insightful)
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
In this case they had a grainy photo, and the algorithm said "with this low quality photo, it is possible that Mr. Smith was the person on the photo". It should have also said "probably one in twenty random people could also be the person on that photo".
The real problem wasn't the photo, but the fact that the photo didn't justify a warrant, and a police officer basically created fake evidence out of thin air to justify a warrant.
If you look like a robber and a security guard thinks it was you, and you get arrested as a result and released after 30 hours, that's just bad luck. If there is a photo that doesn't justify an arrest, and a police officer lets a security guard who has not seen the robbery and can't possibly know it was you pick you out of a lineup, and if the police officer then lies to the judge about exactly what he did to get a warrant (because I'm sure the judge would never have signed the warrant if he or she had known the correct facts), then it is not bad luck; it is something the police force should pay you compensation for.
Re: (Score:3)
Re: (Score:2)
What's in a police database? (Score:4, Insightful)
Case 2: Someone calls your employer, telling them you are a criminal because they saw you being arrested (and as far as the caller knows they are telling the truth). Then YOU should be able to call the police, and they should be able to provide the true facts - that you were arrested because a police officer faked evidence against you.
I think that's how it works in Germany; YOU can call the police and ask them, and they can give YOU a complete list of everything they know about you, including illegal arrests, proven or likely false allegations, and of course convictions where you served time 20 years ago. All things that your future employer would not be told.
Oh, please (Score:2)
Re: Oh, please (Score:2)
Now there's one more.
That is the problem.
Re: (Score:2)
Obviously, but the roots of this problem have nothing to do with facial recognition technology.
Solving this problem at its core reduces to eliminating enough of the human race that everyone personally knows everyone else alive, and therefore this kind of mistaken identity cannot occur.
Lineup? Is this a joke?? (Score:2)
Everyone knows that human memory isn't reliable!
The first memory is already a vague matching to patterns that were most dominant in the person's memory at that moment, based on what had happened in the short time before. And every time it is remembered again, it is re-modulated by the ones at *that* moment. So after a while, little is left.
Intentionally limiting the input at that time, with a lineup, mostly narrows that to those people in that line. Even unintentionally.
That's very close to deliberate c