Detroit Police Changing Facial-Recognition Policy After Pregnant Woman Says She Was Wrongly Charged (nbcnews.com)
The Detroit police chief said he's setting new policies on the use of facial-recognition technology after a woman who was eight months pregnant said she was wrongly charged with robbery and carjacking in a case that was ultimately dismissed by prosecutors. From a report: The technology, which was used on images taken from gas station video, produced leads in the case but was followed by "very poor" police work, Chief James White said. "We want to ensure that nothing like this happens again," White said Wednesday. His comments came two days after the American Civil Liberties Union of Michigan announced a lawsuit on behalf of Porcha Woodruff, a 32-year-old Black woman, who was arrested in February while trying to get children ready for school. There have been two similar lawsuits against Detroit.
Woodruff was identified as a suspect in a January robbery and carjacking through facial-recognition technology. She denied any role. The Wayne County prosecutor's office said charges later were dropped because the victim did not appear in court. White said his officers will not be allowed "to use facial-recognition-derived images in a photographic lineup. Period." He said two captains must review arrest warrants when facial technology is used in a case, among other changes. The new policies will be presented to the Detroit Police Board of Commissioners.
Gee, twice in one week? (Score:2)
https://slashdot.org/story/23/... [slashdot.org]
Seems strange that this keeps happening in Detroit...
Re:Gee, twice in one week? (Score:4, Informative)
This isn't a dupe. This is additional aftermath from that same incident. They're changing their policies.
Re: (Score:2)
As I grow older, I still wonder about some things. Among them - what kind of an imbecile makes posts like the parent and what do they feel when posting? It's a great mystery of life. I wish I knew.
Good (Score:5, Insightful)
This was quite literally the nature of my comment to the original article, so I'll quote myself :)
The person was identified using the facial recognition *first*, and only then shown to a witness.
If the detectives had to use some other method to come up with their suspect (rather than running facial recognition on a set of photos), they'd probably have to examine the circumstances at least a bit, and maybe apply reason to their suspect selection. That would, hopefully, preclude them from picking a woman in the late stages of her pregnancy. OTOH, using facial recognition is a lazy approach that eliminates any need to use reason or logic, and that's exactly what happened here.
Bottom line - face recognition should be used only as part of a reasonable investigation, not as a lazy substitute for one.
Re:Good (Score:4, Interesting)
In addition to, apparently, using an 8-year-old photo of her rather than her current driver's license photo (which the police definitely had access to), even after the initial "matches" -- which I noted [slashdot.org] in the previous thread.
Whatever system they were using should have flagged the age of the photo, and the police should have said, "Hmm, let's see what she looks like now," etc.
And then there's the fact that she was eight months pregnant, which no witness, victim, or even the person caught in the stolen car ever mentioned.
Lazy, sloppy (probably biased) police work. Unfortunately, it probably could happen to any of us.
Re:Good (Score:5, Interesting)
It's worse than that. The alleged victim spent the whole day with the robber and even had sex with her. The fact that she was eight months pregnant shouldn't have escaped the notice of even the dullest observer! It is her single most defining feature.
Lazy and sloppy police work, for sure, but let's not pretend that they weren't overjoyed at the prospect of abusing and harassing a pregnant black woman. They had to know they had the wrong person the instant they first saw her, but decided to make her life as difficult as possible for as long as possible anyway, all while the real culprit is still at large.
Re: (Score:3)
Lazy and sloppy police work, for sure, but let's not pretend that they weren't overjoyed at the prospect of abusing and harassing a pregnant black woman. They had to know they had the wrong person the instant they first saw her, but decided to make her life as difficult as possible for as long as possible anyway, all while the real culprit is still at large.
True. And that should get them a few months behind bars and a lifetime ban on doing police work of any kind. These "people" are scum, nothing else.
Re: (Score:2)
" They had to know they had the wrong person"
To paraphrase Upton Sinclair, it's difficult to get a man to understand something when there are absolutely no consequences for his not understanding it.
Re: (Score:2)
This tracks with how qualified immunity was apparently intended (a degree of forgiveness for honest mistakes) versus how it is actually used.
Since it appears law enforcement cannot use discretion, these tools should be denied them. Completely.
Re: (Score:2)
We don't know if the judge would have helped or not. TFA says the prosecutor's office dropped charges when the victim was a no-show in court.
Ultimately, the problem is that police and prosecutors have stopped looking for anything like justice and are just playing pin the crime on a donkey (any donkey). Even the party game for small children requires more care than they show.
Re: (Score:2)
It boils down to a sensitivity vs specificity question.
Simply arresting someone based on nothing more than a witness lineup is extreme laziness. Witness identifications are wrong all the time; throw enough pictures that loosely match the description in front of a witness and they are likely to pick someone eventually. With the right wheedling, I am sure a detective could find someone to arrest 99% of the time. While that is good sensitivity, we do not need police officers who canno
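A rough, purely illustrative sketch of that "pick someone eventually" point, in Python; the per-lineup probability is a made-up assumption, not a measured figure:

```python
# Hypothetical sketch: if a witness has even a modest chance of picking an
# innocent lookalike out of any single photo lineup, repeated lineups make an
# eventual identification nearly inevitable. The 20% figure is assumed.
p_pick = 0.20  # assumed chance the witness picks an innocent lookalike per lineup

for n in (1, 3, 5, 10):
    p_any = 1 - (1 - p_pick) ** n  # probability of at least one pick across n lineups
    print(f"{n:2d} lineups -> chance of at least one identification: {p_any:.0%}")
```

Even at these modest assumed odds, ten lineups give roughly a 9-in-10 chance that somebody gets identified, guilty or not.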
Re: (Score:1)
Obviously, that was shoddy police work.
But it makes me think that the only reason it was exposed in this case is that facial recognition didn't filter out one obviously false positive. With traditional shoddy police work, they would have picked up some innocent, non-pregnant black woman, and nobody would have known it was shoddy work.
Bottom line, police have been doing lots of shoddy police work forever. I guess they need time to adapt to new tech so that they can continue doing lots of shoddy police work.
Re: (Score:1)
I know a public defender, and I've seen police photo lineups where 4 of the 5 pictures were of the same person. Super reliable, those lineups.
Peachy, but... (Score:2)
Until and unless every... single... person... who played any part in the false accusation and arrest is themselves terminated, prosecuted, and imprisoned, and the victim justly compensated, this is little more than an empty platitude.
What? (Score:1)
FTA - "The Wayne County prosecutor’s office said charges later were dropped because the victim did not appear in court."
I happen to believe that this woman was at the very least negligently accused, and more likely wrongly accused. Either way, people can get their charges dropped just by not showing up for court? What?!?!?!?!?!
Re: (Score:2, Informative)
... the *victim* did not appear in court.
Search vs Confirmation (Score:3)
This sort of use of face recognition, and the "very poor" police work the chief referred to, are a classic case of users of technology not understanding the technology.
There are around 650,000 people in Detroit. If facial recognition has a false-positive rate of (very optimistically) 1 in 10,000, then running it against the whole city still yields about 65 expected hits. A similar thing happens when the police scan DNA databases: even with a supposed 1-in-a-million chance of error, a nationwide trawl will throw up around 350 hits across the roughly 350 million people in the USA.
If the use case is "Jane Doe is our prime suspect, she was seen in the area and she had opportunity and motive", and then you run a test of a photo/DNA sample from the crime scene against the suspect, then a hit is pretty compelling. If you have no idea who the suspect is and you trawl the entire city/county/country for a hit then the chances of it being wrong are very high indeed. Sadly it seems the lazy police officers sometimes mistake confirmation tools for search tools. When they do, this sort of stuff happens.
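A minimal back-of-the-envelope sketch of that search-versus-confirmation arithmetic; the false-positive rates and population sizes below are the comment's optimistic assumptions, not measured properties of any real system:

```python
# Expected number of innocent "matches" when a biometric test is used as a
# dragnet search instead of confirming an already-identified suspect.
# All rates and populations are assumed figures from the comment above.

def expected_false_positives(population: int, false_positive_rate: float) -> float:
    return population * false_positive_rate

# Confirmation: test one named suspect against the crime-scene image.
print(expected_false_positives(1, 1 / 10_000))               # 0.0001 -> a hit is strong evidence

# Search: trawl every face in Detroit (~650,000 people).
print(expected_false_positives(650_000, 1 / 10_000))         # 65.0 -> dozens of innocent hits

# Search: scan a DNA database the size of the USA (~350,000,000 people).
print(expected_false_positives(350_000_000, 1 / 1_000_000))  # 350.0
```

Same test, same error rate; the only thing that changes is how many people it is run against.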
Aha (Score:2)
Aha, the new information is that the "lineup" used for the victim/witness to identify her was a "photographic lineup." In other words, the witness was given, say, 5 pictures of people and asked which one most closely resembles the perp. Not surprisingly, the witness agreed with the face-recognition algorithm, which shows only that the algorithm is good at finding a lookalike, and nothing else.
Moral: recognizing people from a picture is not the same as recognizing people from a person.