
Man Wrongfully Arrested By Facial Recognition Tells Congress His Story (vice.com) 94

An anonymous reader quotes a report from Motherboard: Michigan resident Robert Williams testified about being wrongfully arrested by Detroit Police in an effort to urge Congress to pass legislation against the use of facial recognition technology. Williams' testimony was part of a hearing held by the House of Representatives' subcommittee on crime, terrorism, and homeland security, which dealt with how law enforcement uses the highly controversial surveillance technology. Congress recently introduced the Facial Recognition and Biometric Technology Moratorium Act, which would indefinitely ban its use by law enforcement. Williams was wrongfully arrested in 2020 for felony larceny after the Detroit Police Department's facial recognition software misidentified him from a grainy image taken from surveillance footage. He was then picked out of a photo lineup by a store security guard who wasn't actually present for the incident. According to his testimony, Williams was detained for thirty hours and was not given any food or water. [...]

Research has repeatedly shown that facial recognition technology is fundamentally biased against women and people of color, leading to errors like this. Privacy advocates have argued that even when working properly, facial recognition systems disproportionately target communities of color, creating further pretext for police intervention. [...] "Large scale adoption of this technology would inject further inequity into a system at a time when we should be moving to make the criminal justice system more equitable," Representative Sheila Jackson Lee (TX-18) said during the hearing. The subcommittee also referenced a recent study from the U.S. Government Accountability Office that reported that 20 federal agencies used facial recognition software last year. Six federal agencies, including the FBI and the U.S. Postal Service, reported using it during the 2020 Black Lives Matter protests that followed the police murder of George Floyd.

Robert Williams is just one of many people impacted by this technology's errors and biases. Williams is now represented by the ACLU and is suing the Detroit Police Department for damages and policy changes to prohibit the use of facial recognition technology. So far, the technology has been banned statewide in Vermont and Virginia, as well as in 20 cities across the US. "Mr. Williams deserved better from the law enforcement agencies entrusted to use a technology that we all know is less accurate when applied to citizens who look like him," House Judiciary Committee Chairman Jerrold Nadler (D-NY) said in his testimony.

  • Some comfort (Score:4, Insightful)

    by Rosco P. Coltrane ( 209368 ) on Tuesday July 13, 2021 @06:10PM (#61579527)

    In the US, a black man who spends 30 hours without food or water in the slammer after being wrongfully identified by stupid software operated by single-digit-IQ coppers can count himself lucky he didn't die like a dog in a ditch, agonizing and gasping for air with a knee on his throat.

    • Nonetheless, the main problem is the single-digit IQ coppers rather than face recognition software.

      Any tool can be misused by idiots.

        I don't know what your country's IQ scales are like, but the ones I've had dealings with are meant (important word) to have a mean of 100 points and an SD of 15. So a "single digit IQ" would be more than 6 SD below the norm, or roughly a 1-in-a-billion probability - about 7 people, if the human population of the planet was 10 billion (which it isn't, quite, yet).
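        That back-of-the-envelope tail probability can be sanity-checked against the normal CDF; a quick sketch, using the mean-100/SD-15 parameters the parent assumes:

```python
from math import erfc, sqrt

def iq_tail(iq, mean=100.0, sd=15.0):
    """One-tailed probability of an IQ at or below `iq` under a normal model."""
    z = (iq - mean) / sd
    return 0.5 * erfc(-z / sqrt(2))  # standard normal CDF via the error function

p = iq_tail(9)      # 9 is the highest single-digit IQ, just over 6 SD below the mean
print(p)            # on the order of 1e-9 -- roughly 1 in a billion
print(p * 10e9)     # expected count in a population of 10 billion: a handful of people
```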

        I had a friend at Uni whose first job after graduation was rehabilitating people into "normal life" after they'd been jailed (effectively -

        • Who maintains this excuse for a filter?

          The filter code is not maintained.

        • "I don't know what your country's IQ scales are like, but the ones I've had dealings with are meant (important word) to have a mean of 100 points and an SD of 15"

          Yeah, hyperbole.

          OTOH, there have been numerous studies showing that (as of 2019) the average IQ of Law Enforcement 'officers' in the US is currently ~93, and has been steadily declining for a few decades.

          See Jordan v. City of New London, where the court held it was perfectly reasonable for the police to *not* hire someone for scoring too high on t

          That's pretty ... challenging, coming from a legal authority ("authority" in the sense of government, not in the sense of "likely to be sane or useful"). It's even more challenging given the long-standing concerns about the accuracy and reproducibility of IQ testing, and about what, if anything, of meaning such tests actually measure. My schooling system was in the process of throwing out a testing programme that completely structured the schooling of 11+ pupils when I went through it in the mid-1970s, and now those same people, under the influence
      • by AmiMoJo ( 196126 )

        You have to account for human nature when engineering things. People are lazy and bad at statistics. They will end up doing what the magic facial recognition CSI software tells them to. Even if someone is smart enough and cares enough to understand the limitations, that information will likely be lost by the time the message has been passed down the chain from detective to uniformed patrol officers.

    • You always were a man of the law Sheriff Coltrane, and on the rare occasion you caught them no-good Duke Boys, you always respected their rights. Salute.

      Sorry about your wife.

    • Re:Some comfort (Score:5, Informative)

      by Brymouse ( 563050 ) on Tuesday July 13, 2021 @06:43PM (#61579675)

      He's lucky he didn't die like this guy [thedailybeast.com], who was paralyzed by the police and died of dehydration with a cup of water in front of his eyes.

      • by Anonymous Coward

        Holy shit, and just imagine we only hear about some of these cases. How many more people have been affected?! Much like healthcare, the system is totally fucked on all sides. America?

      • by MrL0G1C ( 867445 )

        Shocking deliberate neglect, very much to the point of being negligent homicide. How are these people not in prison themselves?

    • Re:Some comfort (Score:5, Interesting)

      by Kisai ( 213879 ) on Tuesday July 13, 2021 @08:52PM (#61580049)

      Facial recognition has its uses, but I don't think LEOs should be using it for any kind of fishing expedition, because facial recognition is akin to dragnet fishing and it will have a lot of bycatch.

      For it to be used safely and legally it needs:
      a) A narrow scope: a crime has to have been committed, or be suspected of having been committed. It cannot be "looking for a witness", nor can it be "used for marketing, advertising or loss-prevention services".
      b) The operator of the camera must give consent to acquire the footage. They do not require a warrant or court order, since the footage may be destroyed (e.g. disks being wiped automatically) before one is issued. The operator is only obliged to give LEOs the footage for +/- 7 days, at the LEOs' cost. E.g. if they want the footage from the last week to now, the operator can consent to give them a copy, or set aside the drive/storage media in a sealed container that is only turned over when a warrant or court order gives the LEOs permission for it. The footage must have a chain of custody beginning with the operator.
      c) The software that analyzes the photos can only use "DMV" and "Passport" photos of domestic residents and citizens. For any other database, the foreign country must give consent to use the machine learning software on their database, inside their country or diplomatic office, and that data does not leave their office.

    • Yes - because that happens everywhere in the US all the time.

      We're a big country with a (decreasingly) decentralized power structure. Taking these examples and applying them to the whole country is getting fucking old.

      Hold bad cops accountable - and that includes the cops that cover for bad cops. But painting the entire police system in America with one brush is ridiculous.

    • Try six months. My cousin was arrested simply because his Ford truck looked like a truck used in a farm burglary. We live in a rural (duh) county, half the people drive pickups.

      His bail was set at $50,000 and he could not raise that much. His hearing kept getting delayed, when he finally got to see a judge the charges were dismissed. This was a 50 year old white man who had lived in the county all his life.

    • You mean because a single one of the ~35,000,000 blacks in America did die from a knee to the neck? Are you retarded or just need a dictionary to look up the word lucky?

  • by Vinegar Joe ( 998110 ) on Tuesday July 13, 2021 @06:32PM (#61579615)

    Please send your complaints of racism to Detroit's Chief of Police:
    https://detroitmi.gov/governme... [detroitmi.gov]

  • Undecided (Score:5, Interesting)

    by Geekbot ( 641878 ) on Tuesday July 13, 2021 @06:35PM (#61579641)

    Undecided on facial recognition. I think I land on the "con" side, but the blurb here makes a poor case. The fault doesn't seem to lie with facial recognition so much as with prosecutorial misconduct and poor police work.
    I wouldn't completely disbelieve his statement about eating either, as I could see how someone could possibly be poorly processed, but I'd take it with a grain of salt for sure. It also makes me question the article, since that part has nothing to do with facial recognition; it seems an unconnected emotional appeal, which makes me wonder why the author feels he has to use that gimmick.

    • Not unprecedented (Score:4, Informative)

      by Bruce66423 ( 1678196 ) on Tuesday July 13, 2021 @06:43PM (#61579679)

      DEA forgot about someone they'd locked up

      https://en.wikipedia.org/wiki/... [wikipedia.org]

    • Re:Undecided (Score:5, Informative)

      by Anonymous Coward on Tuesday July 13, 2021 @06:44PM (#61579681)

      ... I wouldn't completely disbelieve his statement about eating either as I could see how someone could possibly be poorly processed but I'd take it with a grain of salt for sure. It also makes me question the article since it has nothing to do with facial recognition ...

      A lot of white people have the attitude "getting arrested by mistake is no big deal, mistakes happen-- you get taken to the police station and after a while they see they made a mistake and apologize, no real harm except you waste an hour or two."

      This is just pointing out that it's not always that way if you're not white.

      (...think I'll post this as AC, since it's likely to draw flames).

      • by AmiMoJo ( 196126 )

        It's not even true all the time if you ARE white.

        In the UK the cops sometimes like to punish or torture suspects by e.g. subjecting them to strip searches or leaving them locked up for hours and hours while they "decide what to do". Protesting your innocence is only likely to make them more inclined to do this in the hope of getting something incriminating when you are tired and upset.

        I'm sure it's even worse in the US.

    • Re: (Score:3, Informative)

      I don't doubt the part about going without food or water. I've known several people who have gone through the local county holding center and they all report the same experience. It's a technique used to break you, when the cops want to make a statement.

      • by hattig ( 47930 )

        Which in any sane country would be classed as torture, and any resulting confession would be ruled inadmissible.

    • Re:Undecided (Score:4, Informative)

      by ShanghaiBill ( 739463 ) on Tuesday July 13, 2021 @07:08PM (#61579781)

      I don't know what Detroit jails are like, but when I was in jail, I was held in a large area with bunk beds and about 100 other guys. The area with individual cells was only used for violent people, and this guy wasn't violent.

      We got three meals a day. The only vegetarian option was peanut butter, but I had no problem swapping my baloney for extra PNB.

      Overall, everyone was friendly and cooperative. It was a great opportunity to practice my conversational Spanish.

      The only conflict was over the TV. We would vote on the channel, but these days there are so many channels that it was often difficult to reach a clear majority for one show.

      This was at the Santa Clara County Jail in San Jose, CA.

      There should be a review site for jails, so criminals can pick a good jurisdiction to ply their trade.

        The city of Detroit is now 78% black. The city is infamous for being brutal; it's historically a city dominated by unions and blue-collar workers. Santa Clara is known as "Santa Claricita" among Hispanics, reflecting its cutesy politics and snobbish self-image. It's a technology district in California, with fewer than 3% black residents, who are strongly outnumbered by Asians. Its rates of violent crime are _far_ lower.

        Charging this innocent man for a crime he did not commit with such poor evidence is a horrific mi

        • Providing individual cells for each inmate is much more expensive, in both facilities and staffing.

          So if even a prosperous place like SCC uses a barracks-style bunkroom, then a poor city like Detroit certainly should.

          It's unsurprising, especially in a jail rather than a prison; a bulk temporary holding area is to be expected. But 30 hours without food and water is a pretty clear violation of the 8th Amendment's protection against cruel and unusual punishment, and may violate the 14th Amendment's guarantee of due process. It's not shocking given some other history of abuses in urban jails in poor neighborhoods.

      • So who the fuck cares how a white asshole is treated in a San Jose jail?
        • So who the fuck cares how a white asshole is treated in a San Jose jail?

          I was treated the same as the non-white and non-asshole people.

          It was an egalitarian jail.

      • by hattig ( 47930 )

        So you're saying that to get an individual cell with some privacy, you should start a fight?

      • The only conflict was over the TV. We would vote on the channel, but these days there are so many channels that it was often difficult to reach a clear majority for one show.

        Obviously they should be using ranked choice voting.

        They could probably do it better than New York City did for their mayoral election, too. San Jose criminals tend to be smarter than NYC citizens.

        I kid, I kid. NYC citizens are just as dumb as San Jose criminals.

    • Re:Undecided (Score:4, Interesting)

      by mosb1000 ( 710161 ) <mosb1000@mac.com> on Tuesday July 13, 2021 @07:45PM (#61579877)

      Yeah, this article doesn't make much of a case for why facial recognition should not be used. If anything, it highlights the need for prosecutorial reform, and is, at best, tangentially related to the issue of facial recognition. My feeling is that the technology will improve with time, and it would be a mistake to ban it at such an early stage. It could end up reducing the rate of these kinds of problems. If these kinds of laws become very prevalent, it could easily be the case that we will be looking back on them in a few years wondering what we were thinking.

      • by hattig ( 47930 )

        Regardless, there need to be a lot more checks and balances with facial recognition. The software should not have biases, or should at least be clearly limited to certain use cases which are clearly communicated and available to the defence. The quality of the input imagery needs to be factored into the output where it gives matches - a poor image, as in this case, should clearly result in a 'fail' or 'low match probability' result, rather than what appears to be a 'best effort guess' presented as a good match.

        And possibly u
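        A minimal sketch of the kind of quality gating the parent describes; every name and threshold here is invented for illustration, not taken from any real system:

```python
def report_match(similarity, image_quality, min_quality=0.6, min_similarity=0.9):
    """Gate a face-match result on input-image quality before reporting it.

    Both thresholds are illustrative placeholders."""
    if image_quality < min_quality:
        # A grainy frame should fail outright, not yield a "best effort guess".
        return "fail: input imagery too poor to support identification"
    if similarity < min_similarity:
        return "low match probability"
    return "candidate match (investigative lead only, not evidence)"

# Grainy surveillance footage, as in the Williams case:
print(report_match(similarity=0.95, image_quality=0.2))
# -> fail: input imagery too poor to support identification
```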

  • ... he was misidentified ... after they used a grainy image from the surveillance footage.

    I thought Police and the like could always enhance those photos/videos on the fly to read even the blurriest images. I mean, that's how it works on -- literally, not figuratively -- all TV shows and movies.

    Detective: Can you enhance that license plate in the photo taken at a 270 degree angle, at night, through the fog, from space?
    Tech Guy: No problem, hold on ... [types quickly, image zooms and clarifies]
    Detective: Nice.

    Then there's this -- WTF:

    He was then picked from a photo lineup by the store security guard who wasn't actually present for the incident.

  • by Antique Geekmeister ( 740220 ) on Tuesday July 13, 2021 @07:45PM (#61579875)

    Bias applies a mental choice. "Fails on black faces" is unsurprising: doing recognition or analysis of original images which are considerably darker reduces the ability to make fine distinctions with analog-digital tools, or even with purely digital analysis. Like analyzing moving objects, it tends to hide data; it's not that the viewer or the software is somehow electing to ignore what data they get.

    • Bias applies a mental choice.

      Not so. A word can have multiple meanings (though I think “implies” was the word you meant to use there), and this is one such case. When we talk about people’s biases, we’re talking about meanings 1 or 2. When we talk about biases present in computational systems, we’re either talking about meaning 4 or a narrower take on meaning 1, i.e. as a tendency/trend, perhaps as the result of negligence by humans.

      https://www.dictionary.com/bro... [dictionary.com]
      bias
      noun
      1. a particular tendency, trend,

      Bias applies a mental choice. "Fails on black faces" is unsurprising: doing recognition or analysis of original images which are considerably darker reduces the ability to make fine distinctions with analog-digital tools, or even with purely digital analysis. Like analyzing moving objects, it tends to hide data; it's not that the viewer or the software is somehow electing to ignore what data they get.

      This is some dumb shit, frankly.

      BuT cOmPuTeRs CaNt Be BiAsEd with the implication that there's no racial bias. Ba

      • No, it's not dumb shit. Photographers and film makers know that darker tones of skin require different lighting in order to obtain the same detail you'd get from lighter skin. Darker skin reflects less light than lighter skin. Same deal if you were comparing filming a dark suit versus one of a light shade. The detail of the latter is captured more easily in lower light conditions.

        Footage captured in settings where this can't be considered (e.g. security cameras) will result in lower quality images, making i

        • No, it's not dumb shit.

          It really is. Bias does not require conscious choice, contrary to what the OP claimed. It's even a basic statistical concept.

          You've given a lot of reasons why the algorithms will give less reliable results against an identifiable subset of the dataset (a.k.a. bias). I mean, yes? I'm arguing there are lots of reasons for the algorithms to be biased, from bugs to basic statistics and everything in between.
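          For what it's worth, the textbook statistical sense of "bias" involves no intent at all. The classic example is the naive sample-variance estimator, which systematically underestimates the true variance; a quick sketch:

```python
import random

random.seed(0)  # deterministic, just for the sake of the example

def naive_var(xs):
    """Sample variance dividing by n: a *biased* estimator of the true variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Average the estimator over many small samples from N(0, 1), whose true variance is 1.0.
n, trials = 5, 20000
avg = sum(naive_var([random.gauss(0, 1) for _ in range(n)]) for _ in range(trials)) / trials
print(round(avg, 2))  # close to (n - 1) / n = 0.8, not 1.0 -- bias with no one choosing it
```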

          • It's a tenet of "critical race theory" that all physical, social, and statistical distinctions are racist, aimed at oppression and abuse, that they are all founded in fraudulent bias aimed at controlling the targets of that oppression. It makes using the word "bias" to describe merely a statistical correlation very difficult.

            Let's therefore be very careful about the word "bias". The most common uses in modern English imply a negative and unfair rejection of something, precisely what can and should concern p

            • It's a tenet of "critical race theory" that all physical, social, and statistical distinctions are racist,[...]

              Is it though? I've generally found that people who are vocal about such things on slashdot are wildly wrong.

              The most common uses in modern English imply a negative and unfair rejection of something, precisely what can and should concern people about facial recognition tools.

              I have no idea what you're trying to argue. Now you seem to be saying face recognition can be biased (though "rejection" is sp

                I'm suggesting that people can and do interpret purely statistical correlation as deliberate, malicious racial bias. Referring carelessly to statistical bias is risky when the concern of many critics of facial recognition is that it has both implicit and explicit racial bias, often given credibility by its relatively poor performance on black people.

                • Thing is, sufficiently advanced incompetence is indistinguishable from malice.

                  It doesn't ultimately make any meaningful difference if you're discriminated against by a system written by a malicious racist vs one written by an incompetent programmer under time pressure from an indifferent and incompetent manager.

                  And accidental statistical bias that lines up with skin color is still racial bias whether it was intended to exist or not.

                  • It makes a practical and a profound legal difference whether it's a result of malicious racism or accidental correlation. Correlation can be solved technically, by recognizing that the system does not work well and is more likely to make false positive IDs with black faces. Malicious racism, used to keep colored people out of the workplace or public venues by misidentifying them without consequence, is much more difficult to defeat.

                    • It makes a practical and a profound legal difference whether it's a result of malicious racism or accidental correlation.

                      Wat. So you're telling me that it would have made a difference to the person in this story if the programmer was incompetent instead of racist? How? Not in a nebulous future way but in an absolute, concrete way.

                      Correlation can be solved technically, by recognizing that the system does not work well and is more likely to make false positive IDs with black faces

                      Wow the word "bias" has reall

                    • Comment removed based on user account deletion
                    • Noticing a physical or behavioral characteristic is not "racism". There is also no reason so far to presume that it's deliberately racist. They should have been noted during development, but let us not blame on malice what is easily explained by the actual difficulty of the problem. There are many good articles explaining this, and many refer to this paper:

                      http://proceedings.mlr.press/v... [proceedings.mlr.press]

  • by PastTense ( 150947 ) on Tuesday July 13, 2021 @07:53PM (#61579893)

    Since the beginning of policing, police have always used facial recognition, though it wasn't done by computers. And it was often (probably usually) wrong. For example, decades ago you used to see photos of the FBI's ten most wanted posted in the local post office. And someone would say: "Hey, that guy looks like my neighbor Joe" - and the police would come and check it out. And certainly recently everyone has seen video on TV from a bank or store camera of a robber. People call in with who they think it could be, and the police check out those possibilities.

    The problem is that the police treat human facial recognition with skepticism, but they don't treat computer facial recognition the same way.

  • I severely dislike articles like this that seem to make the point that it's not OK because it affects some groups more than others. THAT IS NOT THE POINT. It is not OK because no one should have that ability over anyone else. It should not even be possible to track people's whereabouts after the fact. The data, whether from license plate scanners or facial recognition or anything else that allows your trail to be pieced together after the fact, should not be allowed to exist.

  • "Six federal agencies, including the FBI and the U.S. Postal Service, reported using it during the 2020 Black Lives Matter protests that followed the police murder of George Floyd."

    Why and how is the US Postal service involved in policing protests?

    • Maybe somebody wrecked/damaged a post office or even just a post office box?

      You put a mailbox up in front of your house, it's technically theirs. Somebody wrecks it, it is actually a federal offense, enforceable by USPS postal inspectors.

      • A mail truck was torched in Minnesota...

        Supposedly this is why they started monitoring protests last spring: because of the potential threat to Postal Service workers and buildings. An uptick in threats against Postmaster General Louis DeJoy probably didn't help either...
  • The simple mistake is rolling out a new technology prematurely, i.e. before its limitations & feasibly reliable applications have been worked out. Beta testing when citizens' livelihoods & liberty are literally & directly at stake is irresponsible & unethical, & those tech companies should be held legally liable. The courts also have a responsibility to be circumspect about these new technologies & require a reasonable burden of proof of their validity & reliability (evaluations by indep
    • The courts also have a responsibility to be circumspect about these new technologies & require a reasonable burden of proof of their validity & reliability (evaluations by independent experts, i.e. scientific research), as they would any other kind of evidence.

      Yes they should. Unfortunately that would throw out half of the results of fingerprint matching and they can't have that, so they definitely won't do any such thing.

  • by bloodhawk ( 813939 ) on Tuesday July 13, 2021 @11:45PM (#61580431)
    Facial recognition, just like lineups and video footage, is a tool to assist with locating the right person. Misuse of a tool shouldn't be blamed on the tool; it is a failure of police process and procedures. Any tool that helps to catch the right person (within reason) should be available, especially automated tools that can help to narrow down potential suspects in large groups. Police just have to be trained to use them correctly.
  • Nice try. It won't work. ;)

    (Before you get triggered: Of course I do not believe they are /actually/ lizards. It's a funny meme, nothing more. Like Zuckerberg being a robot. Stop treating everyone who says they are lizards as if they meant it. Because otherwise they will start meaning it even if they didn't.)

  • Take publicity shoots of 100 famous actors and actresses, plus the whole cast of Avengers, X-Men and Watchmen, plus photos of all US presidents, and run them through this facial recognition software. Then publish the photos of the close to 200 people that were found on a website, so we can all see how well this software works.

    I think what they are getting wrong is how much being "recognised" by the software means for the probability that you committed the crime. And what exactly does it mean if you are t
  • There shouldn't be a law against the use of facial recognition by law enforcement, BUT it should be used as an extra tool, not as definite proof. It isn't any different from having some person watch tons of videos in the hope of finding a specific person, only this way it's much MUCH faster. After the system has identified possible targets, it should always be checked (and rechecked by a different person) whether each one is the culprit they are looking for.
    Most cases when I heard about wrongful arrest in regard to facia

    • Seriously, eyewitness identification is incredibly WRONG and pretty easily manipulated, even unintentionally. Just look into this deep topic! Researchers have even studied how facial recognition develops in childhood: people surrounded by just a few races learn to recognize faces differently, so outside races really do look "all alike" to them, and it's difficult to change at a later age.

      An AI system is needed because humans already SUCK; naturally, the AI is going to need diverse face training unlike most humans have. If it can j

  • Man Wrongfully Arrested By Facial Recognition Tells Congress His Story

    No one was ever arrested by facial recognition. He was arrested by police who don't know how to do their job, or alternately, know their job is to keep the citizenry oppressed and were in fact doing it.

    Writing that headline could only be done after deep-throating an entire department's worth of cops.

  • Face recognition can be useful for police to narrow the range of suspects. The problem is that police are in general idiots, and therefore should not be playing with matches, in this case face recognition matches.
