Detroit Police Changing Facial-Recognition Policy After Pregnant Woman Says She Was Wrongly Charged (nbcnews.com)

The Detroit police chief said he's setting new policies on the use of facial-recognition technology after a woman who was eight months pregnant said she was wrongly charged with robbery and carjacking in a case that was ultimately dismissed by prosecutors. From a report: The technology, which was used on images taken from gas station video, produced leads in the case but was followed by "very poor" police work, Chief James White said. "We want to ensure that nothing like this happens again," White said Wednesday. His comments came two days after the American Civil Liberties Union of Michigan announced a lawsuit on behalf of Porcha Woodruff, a 32-year-old Black woman, who was arrested in February while trying to get children ready for school. There have been two similar lawsuits against Detroit.

Woodruff was identified as a suspect in a January robbery and carjacking through facial-recognition technology. She denied any role. The Wayne County prosecutor's office said charges later were dropped because the victim did not appear in court. White said his officers will not be allowed "to use facial-recognition-derived images in a photographic lineup. Period." He said two captains must review arrest warrants when facial technology is used in a case, among other changes. The new policies will be presented to the Detroit Police Board of Commissioners.


Comments Filter:
  • https://slashdot.org/story/23/... [slashdot.org]

    Seems strange that this keeps happening in Detroit...

  • Good (Score:5, Insightful)

    by ugen ( 93902 ) on Friday August 11, 2023 @01:14PM (#63759784)

    This was quite literally the nature of my comment to the original article, so I'll quote myself :)

    The person was identified using the facial recognition *first*, and only then shown to a witness.
    If the detectives had to use some other method to come up with their suspect (rather than running facial recognition on a set of photos), they'd probably have to examine the circumstances at least a bit, and maybe apply some reason to their suspect selection. That would, hopefully, preclude them from picking a woman in the late stages of her pregnancy. OTOH, using facial recognition is a lazy approach that eliminates any need for reason or logic, and that's exactly what happened here.

    Bottom line - face recognition should be used only as part of reasonable investigation, and not a lazy substitute.

    • Re:Good (Score:4, Interesting)

      by fahrbot-bot ( 874524 ) on Friday August 11, 2023 @02:15PM (#63759988)

      In addition to, apparently, using an 8-year-old photo of her rather than her current driver's license photo (which the Police definitely had access to) even after the initial "matches" -- which I noted [slashdot.org] in the previous thread.

      Whatever system they were using should have noted the age of the photo and Police should have said, "Hmm, let's see what she looks like now." etc ...

      And then there's the fact that she was eight months pregnant, which no witness, victim, or even the person caught in the stolen car ever mentioned.

      Lazy, sloppy (probably biased) Police work. Unfortunately, probably could happen to any of us.

      • Re:Good (Score:5, Interesting)

        by narcc ( 412956 ) on Friday August 11, 2023 @02:42PM (#63760084) Journal

        It's worse than that. The alleged victim spent the whole day with the robber and even had sex with her. The fact that she was eight months pregnant shouldn't have escaped the notice of even the dullest observer! It is her single most defining feature.

        Lazy and sloppy police work, for sure, but let's not pretend that they weren't overjoyed at the prospect of abusing and harassing a pregnant black woman. They had to know they had the wrong person the instant they first saw her, but decided to make her life as difficult as possible for as long as possible anyway, all while the real culprit is still at large.

        • by gweihir ( 88907 )

          Lazy and sloppy police work, for sure, but let's not pretend that they weren't overjoyed at the prospect of abusing and harassing a pregnant black woman. They had to know they had the wrong person the instant they first saw her, but decided to make her life as difficult as possible for as long as possible anyway, all while the real culprit is still at large.

          True. And that should get them a few months behind bars and a lifetime ban on doing police work of any kind. These "people" are scum, nothing else.

        • " They had to know they had the wrong person"
          To paraphrase Upton Sinclair, It's difficult to get a man to understand something when there are absolutely no consequences for his not understanding it.

      • So the carjacking victim effectively said that the carjacker looked like this person looked eight years ago?
      • This tracks with how qualified immunity was apparently intended (a degree of forgiveness for honest mistakes) versus how it is actually used.

        Since it appears law enforcement cannot use discretion, these tools should be denied them. Completely.

    • Makes sense, right? Some people can and do look strikingly similar; take twins, for example. But that does not make the second twin guilty of all the crimes the first twin committed. This automatic trust in AI results is really starting to get out of hand. It is good that there was still a real human judge in the loop, or she could have been sent to jail by AI as part of an automated investigation process.
      • by sjames ( 1099 )

        We don't know if the judge would have helped or not. TFA says the prosecutor's office dropped charges when the victim was a no-show in court.

        Ultimately, the problem is that police and prosecutors have stopped looking for anything like justice and are just playing pin the crime on a donkey (any donkey). Even the party game for small children requires more care than they show.

    • It boils down to a sensitivity vs specificity question.

      The idea of simply arresting someone based on nothing more than a witness lineup proves extreme laziness. Witness identifications are wrong all the time; throw enough pictures that are somewhat similar to the description in front of a witness and they are likely to pick someone eventually. With the right wheedling, I am sure a detective could find someone to arrest 99% of the time. While that is good sensitivity, we do not need police officers who canno

    • by Anonymous Coward

      Obviously, that was shoddy police work.
      But it makes me think the only reason it was exposed in this case is that facial recognition failed to remove one obviously false positive. With traditional shoddy police work, they would have picked up some innocent black non-pregnant woman, and nobody would have known the work was shoddy.

      Bottom line: police have been doing lots of shoddy police work forever. I guess they need time to adapt to new tech so that they can continue doing lots of shoddy police work.

    • by Anonymous Coward

      I know a public defender; I've seen police photo lineups where four of the five pictures are of the same person. Super reliable, those lineups.

  • Until and unless every... single... person... who played any part in the false accusation and arrest is themselves terminated, prosecuted, and imprisoned, and the victim justly compensated; this is little more than an empty platitude.

    • The fact is they aren't paid enough to care, so it's hard to find replacements who are not just as bad. Personally, I want to look up what the judge said about the whole thing when the prosecutor submitted it. He seems to have just outright dismissed the indictment, yet the prosecutor staunchly keeps saying he followed "procedure." It sounds more like the person who made this a "procedure" needs to be fired.
  • FTA - "The Wayne County prosecutor’s office said charges later were dropped because the victim did not appear in court."

    I happen to believe that this woman was at the very least negligently accused, and more likely wrongly accused. Either way, people can get their charges dropped just by not showing up for court? What?!

    • Re: (Score:2, Informative)

      by Anonymous Coward

      ... the *victim* did not appear in court.

  • When you search a large database for a photo, you will find a match whether or not the guilty person is in the database. Since a match is to be expected, it proves very little. It's even worse if you say "computer, find the best match," because then you know a match will be found.
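A tiny sketch of this comment's point: a "find the best match" query always returns someone, whether or not the culprit is enrolled at all. (The one-dimensional "face embeddings" and names below are made-up illustration data, not a real recognition model.)

```python
import random

def best_match(probe, database):
    """Return the enrolled entry whose embedding is closest to the probe.
    Note: this ALWAYS returns someone, even when the true culprit
    is not in the database at all."""
    return min(database, key=lambda entry: abs(entry[1] - probe))

random.seed(0)
# Hypothetical database of 1,000 enrolled people with random embeddings.
database = [(f"person_{i}", random.random()) for i in range(1000)]

probe = 0.5  # the culprit's embedding; the culprit is NOT enrolled
name, value = best_match(probe, database)
# With 1,000 entries, something lands very close to any probe, so the
# "match" looks convincing even though it is an innocent person.
print(name, abs(value - probe))
```

The fix is the same one the comment implies: a nearest-neighbor hit is only meaningful together with a distance threshold and independent evidence, because the query itself cannot fail to produce a name.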
  • by nickovs ( 115935 ) on Friday August 11, 2023 @09:57PM (#63761198)

    This sort of use of face recognition, and the "very poor" police work the chief referred to, are a classic case of users of technology not understanding the technology.

    There are around 650,000 people in Detroit. If facial recognition has a false positive chance of (very optimistically) 1 in 10,000 then you still expect 65 hits. A similar thing happens when the police scan DNA databases; even with a supposed 1-in-a-million chance of error, any test will throw up 350 hits across the population of the USA.

    If the use case is "Jane Doe is our prime suspect, she was seen in the area and she had opportunity and motive", and then you run a test of a photo/DNA sample from the crime scene against the suspect, then a hit is pretty compelling. If you have no idea who the suspect is and you trawl the entire city/county/country for a hit then the chances of it being wrong are very high indeed. Sadly it seems the lazy police officers sometimes mistake confirmation tools for search tools. When they do, this sort of stuff happens.
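The arithmetic in the comment above can be checked with a short sketch. The 1-in-10,000 and 1-in-a-million error rates are the commenter's illustrative figures, not measured accuracies, and the posterior calculation assumes (optimistically) that the matcher always flags the true culprit when present:

```python
def expected_false_positives(population: int, fp_rate: float) -> float:
    """Expected number of innocent people flagged when an entire
    population is trawled with a matcher of the given error rate."""
    return population * fp_rate

def posterior_guilt(fp_rate: float, population: int) -> float:
    """Chance that a single random hit is actually the culprit, by
    Bayes' rule with a uniform 1/population prior, assuming the one
    true culprit is in the database and is always matched."""
    true_hits = 1.0
    false_hits = (population - 1) * fp_rate
    return true_hits / (true_hits + false_hits)

# The comment's two scenarios:
print(expected_false_positives(650_000, 1e-4))      # Detroit face search: 65 expected hits
print(expected_false_positives(350_000_000, 1e-6))  # USA-wide DNA search: 350 expected hits
print(posterior_guilt(1e-4, 650_000))               # each Detroit hit: ~1.5% chance of guilt
```

This is exactly the confirmation-vs-search distinction the commenter draws: against a single named suspect the same 1-in-10,000 error rate makes a hit strong evidence, while a city-wide trawl makes any individual hit almost certainly wrong.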

  • Aha, the new information is that the "lineup" used for the victim/witness to identify her was a "photographic lineup." In other words, the witness was given, say, five pictures of people and asked which one most closely resembled the perp. Not surprisingly, the witness agreed with the face-recognition algorithm, which shows only that the algorithm is good at finding look-alikes, and nothing else.

    Moral: recognizing people from a picture is not the same as recognizing people from a person.
