AI Crime

Wrongfully Accused by an Algorithm (nytimes.com) 352

In what may be the first known case of its kind, a faulty facial recognition match led to a Michigan man's arrest for a crime he did not commit. From a report: On a Thursday afternoon in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought at first that it was a prank. An hour later, when he pulled into his driveway in a quiet subdivision in Farmington Hills, Mich., a police car pulled up behind, blocking him in. Two officers got out and handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters, who were distraught. The police wouldn't say why he was being arrested, only showing him a piece of paper with his photo and the words "felony warrant" and "larceny." His wife, Melissa, asked where he was being taken. "Google it," she recalls an officer replying. The police drove Mr. Williams to a detention center. He had his mug shot, fingerprints and DNA taken, and was held overnight. Around noon on Friday, two detectives took him to an interrogation room and placed three pieces of paper on the table, face down. "When's the last time you went to a Shinola store?" one of the detectives asked, in Mr. Williams's recollection. Shinola is an upscale boutique that sells watches, bicycles and leather goods in the trendy Midtown neighborhood of Detroit. Mr. Williams said he and his wife had checked it out when the store first opened in 2014.

The detective turned over the first piece of paper. It was a still image from a surveillance video, showing a heavyset man, dressed in black and wearing a red St. Louis Cardinals cap, standing in front of a watch display. Five timepieces, worth $3,800, were shoplifted. "Is this you?" asked the detective. The second piece of paper was a close-up. The photo was blurry, but it was clearly not Mr. Williams. He picked up the image and held it next to his face. "No, this is not me," Mr. Williams said. "You think all Black men look alike?" Mr. Williams knew that he had not committed the crime in question. What he could not have known, as he sat in the interrogation room, is that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm, according to experts on technology and the law. A nationwide debate is raging about racism in law enforcement. Across the country, millions are protesting not just the actions of individual officers, but bias in the systems used to surveil communities and identify people for prosecution.

  • Not racism (Score:5, Insightful)

    by groobly ( 6155920 ) on Wednesday June 24, 2020 @09:09AM (#60221556)
    This isn't about racism. It's about idiot police departments and lying marketers overselling vision technology. The same thing has happened with fingerprints. Was that racism?
    • Re:Not racism (Score:5, Insightful)

      by Ogive17 ( 691899 ) on Wednesday June 24, 2020 @09:12AM (#60221574)
      Why not? The arresting officers could have compared the photo to the man they were arresting to see the obvious discrepancy. They also didn't have to act like complete dicks the entire time.

      Treat people like shit, get treated like shit.
      • Re:Not racism (Score:5, Insightful)

        by JeffOwl ( 2858633 ) on Wednesday June 24, 2020 @09:26AM (#60221616)
        The police were lazy and incompetent. It might be related to racism on the part of the officers, but I don't know that for sure from what I saw here. I'm pretty darn white and I got pulled over several years ago because the local police just assumed a description someone gave of a person and a truck was me, in spite of the color of the truck being wrong. I didn't have to spend the night in jail, but they did give me a bunch of crap for something they would have known wasn't me if they had bothered with due diligence after the initial report or even stopped for a minute to question their assumptions.
        • Re:Not racism (Score:5, Insightful)

          by BringsApples ( 3418089 ) on Wednesday June 24, 2020 @09:51AM (#60221746)

          And there it is: "Due Diligence". THIS is the problem with using this technology. Those cops aren't trained to question the mechanisms that are put in place. Don't forget that police are also a fraternity. Going against the grain isn't exactly in the police mindset.

          Had they simply observed REALITY, then they'd have known this wasn't the right guy. I hope he gets a lawyer and sues.

          • Re:Not racism (Score:5, Informative)

            by Freischutz ( 4776131 ) on Wednesday June 24, 2020 @10:29AM (#60222000)

            ... Those cops aren't trained ...

            That's the actual problem. I just watched an instructor describe his experiences with US police departments. What bowled him over was that a frightening number of US cops had next to no training. Some US police forces do a good job at it, to be sure, but a large number don't. According to him these guys should spend 20% of their time training: marksmanship, trigger restraint under stressful conditions, when to violently subdue suspects and when not to, de-escalation methods, negotiation, how to wrestle resisting suspects without killing or injuring them, drugs and their effects, and, as in this case, what facial recognition can and can't do for you. The way he described it, it's no surprise some US cops train as little as they apparently do: the pressure to be out there making arrests is so great (some of these cops actually get punished for not meeting arrest quotas) that there isn't even money or time for adequate training.

            • Re:Not racism (Score:5, Insightful)

              by cascadingstylesheet ( 140919 ) on Wednesday June 24, 2020 @12:16PM (#60222558) Journal

              ... Those cops aren't trained ...

              That's the actual problem. I just watched an instructor describe his experiences with US police departments. What bowled him over was that a frightening number of US cops had next to no training. Some US police forces do a good job at it, to be sure, but a large number don't. According to him these guys should spend 20% of their time training: marksmanship, trigger restraint under stressful conditions, when to violently subdue suspects and when not to, de-escalation methods, negotiation, how to wrestle resisting suspects without killing or injuring them, drugs and their effects, and, as in this case, what facial recognition can and can't do for you. The way he described it, it's no surprise some US cops train as little as they apparently do: the pressure to be out there making arrests is so great (some of these cops actually get punished for not meeting arrest quotas) that there isn't even money or time for adequate training.

              Gee, if only we had elected officials who were in charge of the local policies and training for these big city departments. If only we knew who they were, so we could choose different ones. If only they were all from the same party for the last 40-60 years so it would be easier to figure out what to do.

              • by ljw1004 ( 764174 )

                Gee, if only we had elected officials who were in charge of the local policies and training for these big city departments. If only we knew who they were, so we could choose different ones.

                I think it's the union chiefs who are in charge of the local policies. In charge in the sense of "hey council official, it's a nice ward you've got there -- it would be a shame if anything were to happen to it..." And we don't elect these union chiefs.

            • Comment removed (Score:4, Insightful)

              by account_deleted ( 4530225 ) on Wednesday June 24, 2020 @12:20PM (#60222588)
              Comment removed based on user account deletion
              • by Baleet ( 4705757 )

                Police brutality comes from a deep seated and fundamental belief by police officers that they are better than everyone else. If they didn't have this belief, they would not have been attracted to the job.

                What study are you basing that on? With no citation, this should not be modded "Insightful". Sounds like the kind of thing someone pulls out of their own biases. Someone who has never even spent a shift riding along with police officers.

            • by Shotgun ( 30919 )

              I would emphasize the learning to wrestle part of your post.

              I cringed when I watched the Rayshard Brooks incident unfold. Not because of the shooting, but the ridiculous circumstances that led up to it. I can't get over how UNNECESSARY it was.

              Two officers had the man down, and couldn't hold him there. I wrestled in college, and even with a highly trained opponent, it wasn't that hard to hold onto a man who was already taken down. A simple arm bar would have kept Rayshard from being able to move, instead

        • by Ogive17 ( 691899 )
          Were you arrested or just questioned during the traffic stop?
        • Now this is a great debate example for the years. We clearly have two entirely separate possible instances of racism and possible great arguments on both sides. (1) Were the arresting officers racist? Maybe. They knew that Mr. Williams was black when they decided to call him at work to tell him he was being arrested. They already had time to compare the camera image to the database image themselves. But they still went through with the arrest at the home and left Mr. Williams in a cell for hours. Wou
        • Re:Not racism (Score:4, Interesting)

          by PoopMonkey ( 932637 ) on Wednesday June 24, 2020 @11:36AM (#60222368)
          I had a similar experience. I was pulled over and handcuffed at gunpoint due to them "matching" my car to a recent shooting in the area. The color of the car they were looking for wasn't the color of my car. The size/type of vehicle didn't match mine either. But both were close enough (in their opinion) for them to stop me.
        • My son's workplace was swatted: his boss had a truck of the same model but a different year and color from one used in an armed robbery, and the police were called in on a tip by some angry person who'd had their stuff repossessed. The officers, even with a videotape of the actual truck and seeing the difference in color and package (no sunroof), still impounded the truck, arrested everyone, and held them for 4 hours while they searched the business and the truck. Then the business owner sued the crap out of them.

        • by codeButcher ( 223668 ) on Wednesday June 24, 2020 @12:37PM (#60222666)

          in spite of the color of the truck being wrong

          Was the truck black, perhaps???

      • Re: (Score:2, Insightful)

        Why not? The arresting officers could have compared the photo to the man they were arresting to see the obvious discrepancy. They also didn't have to act like complete dicks the entire time. Treat people like shit, get treated like shit.

        Shitty detective work, aggressively ignorant, and completely inept at their jobs. They may be prejudiced, but given the information in the story, it's hard to tell that apart from just being sorry-asses. Don't dilute the evil of racism by trying to apply it to every other evil action. It's true that racists are shitty people, but not every shitty person is a racist. People are more than one dimension, even the shitty ones.

      • The arresting officers could have compared the photo to the man they were arresting to see the obvious discrepancy.

        Correct, but that is clear evidence of incompetence, not racism. It seems to me that someone capable of this level of stupidity is unlikely to be capable of doing any better regardless of the race of the person involved.

        • by DeVilla ( 4563 )

          They might try harder depending on race.

          I didn't see a picture from the surveillance camera. Was that in the article somewhere? I'd like to find a side-by-side to use as a poster child of what artificial stupidity means.

        • by Ogive17 ( 691899 )
          So when these incidents happen predominately to black people, is it just their bad luck?
    • Re:Not racism (Score:5, Informative)

      by currently_awake ( 1248758 ) on Wednesday June 24, 2020 @09:25AM (#60221614)
      Facial recognition programs have a lower accuracy rate with non-whites; this is a known issue. Depending on the police department, this might be considered a feature, not a flaw.
    • Re: (Score:2, Insightful)

      by Bert64 ( 520050 )

      Facial recognition software generally has more difficulty with darker faces especially when combined with poor lighting, this isn't racism - it's a limitation of the current technology.

      Broken or over-sold technology is not racist.
      Lazy and incompetent officers are not racist.

      Both of these problems can and do happen to white folks too.

      They are just playing the racism card to trigger a response from the media.

      • Re:Not racism (Score:5, Insightful)

        by AvitarX ( 172628 ) <me AT brandywinehundred DOT org> on Wednesday June 24, 2020 @09:59AM (#60221788) Journal
        If the use of broken technology leads to more false arrests by lazy cops on a particular race I'd say it's starting to look a lot like institutional racism.
        • by Bert64 ( 520050 )

          Is it broken, or simply limited?

          Identifying white faces is easier than identifying black faces; this is due to how light and optics work. Improved technology may be able to narrow the gap in time, but this is simply a fact.

          If cops are being lazy that's on the cops.

          If this technology is better at identifying white faces, it is actually going to result in MORE ARRESTS OF WHITE CRIMINALS. Now is that racist?

          Black criminals are MORE LIKELY TO GET AWAY WITH THEIR CRIMES because the technology is less able to ide

      • Re:Not racism (Score:5, Insightful)

        by mysidia ( 191772 ) on Wednesday June 24, 2020 @10:04AM (#60221832)

        Facial recognition software generally has more difficulty with darker faces especially when combined with poor lighting, this isn't racism

        Yes it is. It's called a racially-biased system. Image systems are often racially biased because of the developers and subject matter used in originally developing such systems. Government officials relying on the report from a racially-biased system without verifying that report is a racist action by the government official.

        Even if they confirmed the reports, it's still racist if the government officials' actions in doing so created more burdens or more scrutiny in the process for people of one race than another.

      • by cusco ( 717999 )

        Poor lighting, and security cameras tend to be uniformly crappy (I work in physical security). Security systems are an expense, not an asset, so the low bidder will almost always get the installation contract. Retail is only slightly better than home systems for quality of equipment and installation; it's very cutthroat and the only way to make any money doing it is to slap shit together and get on to the next job. I thank all the gods that I haven't had to work in that sector of the industry, but have w

    • Re:Not racism (Score:5, Insightful)

      by jonsmirl ( 114798 ) on Wednesday June 24, 2020 @09:29AM (#60221626) Homepage

      Yes, this is shoddy police work. Facial recognition is supposed to be only one of multiple factors leading to an arrest. Did they ask his employer if he was at work? Did they check his phone for his location at that time? Is there any evidence to place him at the crime scene? Did they show a photo of this man to eye witnesses in the store? Doing a normal amount of investigative work would have shown this was the wrong suspect. All of these factors can lead to mistakes, but if multiple factors place him at the scene of the crime then he should certainly be brought in for questioning.

      Facial recognition is going to make mistakes. Every investigative technique has the potential to make mistakes. That's why you need confirmation from multiple sources. If the police arrested him based on this single facial recognition hit, then it is the police who are not doing their job.

      We should not abandon tools because they make mistakes, instead we should design a system that catches these mistakes and that's why you need evidence from multiple sources. If we abandoned every tool that made a mistake, we'd have no tools to work with.
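
      As a rough illustration of the multi-factor confirmation described above (the factor names and threshold are hypothetical, not any department's actual checklist), a facial recognition hit would only escalate toward an arrest when independent evidence corroborates it:

```python
# Hypothetical sketch: a facial recognition hit is one lead among several,
# and escalation requires independent corroboration. Factor names are
# illustrative, not any real department's procedure.

CORROBORATING_FACTORS = [
    "eyewitness_identification",
    "phone_location_at_scene",
    "no_verified_alibi",
    "physical_evidence_at_scene",
]

def should_escalate(leads: dict[str, bool], minimum_corroboration: int = 2) -> bool:
    """Escalate only if the face-match lead is backed by independent evidence."""
    if not leads.get("facial_recognition_hit", False):
        return False
    corroboration = sum(leads.get(f, False) for f in CORROBORATING_FACTORS)
    return corroboration >= minimum_corroboration

# A single hit with no alibi check, no eyewitness ID, and no other evidence
# would not clear the bar under this kind of rule.
print(should_escalate({"facial_recognition_hit": True}))  # False
```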

    • Re:Not racism (Score:5, Informative)

      by drew_kime ( 303965 ) on Wednesday June 24, 2020 @09:34AM (#60221638) Journal

      This isn't about racism. It's about idiot police departments and lying marketers overselling vision technology.

      I'm not saying the marketing doesn't oversell, but ...

      “This document is not a positive identification,” the file says in bold capital letters at the top. “It is an investigative lead only and is not probable cause for arrest.”

      And ...

      In Mr. Williams’s recollection, after he held the surveillance video still next to his face, the two detectives leaned back in their chairs and looked at one another. One detective, seeming chagrined, said to his partner: “I guess the computer got it wrong.”

      They could have done that without arresting him. Like, maybe have the arresting officers bring a copy and hold it up next to him before putting cuffs on?

      And finally ...

      When the case was called, the prosecutor moved to dismiss, but “without prejudice,” meaning Mr. Williams could later be charged again. Maria Miller, a spokeswoman for the prosecutor, said a second witness had been at the store in 2018 when the shoplifting occurred, but had not been asked to look at a photo lineup. If the individual makes an identification in the future, she said, the office will decide whether to issue charges.

      Once they compared the images to his actual face, it was clear that it wasn't him. But if another witness from the 2018 incident misidentifies him, they're keeping the option to charge him again. That's not the facial recognition company's fault.

      • by AvitarX ( 172628 )
        Maybe have a judge sign off on an arrest warrant.

        One that has a picture of the criminal AND the person they want to arrest.

        The arresting officer really shouldn't be the one making that call on the spot, the whole purpose of a warrant is to present enough evidence for probable cause.
        • by HiThere ( 15173 )

          Unfortunately, a lot of judges rubber-stamp warrants. So it should be a better approach, but it might well not be. After all, then they've got the official approval for the arrest.

        • The arresting officer really shouldn't be the one making that call on the spot, the whole purpose of a warrant is to present enough evidence for probable cause.

          They shouldn't be the only one responsible for the decision. But anyone at any stage in the process should be able to look at what's going on and say, "Hey wait, this doesn't look right."

          Unless you believe the suspect is a flight risk, or is a clear danger to someone else, you can always go back and arrest them the next day after you check your evidence. Especially in a case like this, where it's property crime from nearly half a year ago.

    • They've already sent out an email from his wife and are soliciting donations and letters to Congress. The body of their form letter is as follows:

      Face surveillance specifically has been found to have racially biased error rates against people of color. And if law enforcement continues to be allowed to use invasive face surveillance systems, they will unquestionably be used to target communities of color, exacerbating over-policing within them.

      We have already seen this play out in real life. Military-grade surveillance equipment has been deployed by police during protests for Black lives. And law enforcement reliance on face recognition technology has resulted in wrongful arrests.

      Congress needs to act now by zeroing out funds for these invasive technologies at the federal level, also denying funds to local and state law enforcement for this purpose, and passing legislation to halt the use of face recognition and other discriminatory technologies.

      Personally, I think the use of facial recognition by law enforcement and militarization of police should both be illegal; regardless of race.

      • by HiThere ( 15173 )

        I agree that the militarization of the police should not only be stopped, but should be reversed.

        About facial recognition, I'm less sure. Certainly it should have an oversight board WITH TEETH that is secured against regulatory capture. I.e., nobody who has ever worked as a regulator can ever receive ANY emolument from either a police department or a company selling facial recognition aids. And criminal penalties. And provision that a case can be brought by a private citizen if the DA doesn't want to both

    • Re:Not racism (Score:4, Insightful)

      by hey! ( 33014 ) on Wednesday June 24, 2020 @10:53AM (#60222132) Homepage Journal

      There are different kinds of racism. There's *ideological* racism, where the problem is someone's belief. And there's structural racism, where the problem is someone's inability or unwillingness to see that the deck is stacked against certain groups.

      If you build a facial recognition system, and test it on white subjects, and sell it to white policemen who use it on the black populace, that's structural racism.

    • Re:Not racism (Score:5, Informative)

      by Stolovaya ( 1019922 ) <skingiiiNO@SPAMgmail.com> on Wednesday June 24, 2020 @11:38AM (#60222376)
      The bottom line is that this is abuse of power by police. Not saying why he's arrested. Not being helpful to his wife and family. They didn't act like law enforcement, they acted like mafia thugs, and that shit needs to stop. Racism might be a factor, but to be honest, it's irrelevant. He was wrongly arrested, and it's likely the cops who did this will continue to act like thugs.
  • by Tom ( 822 ) on Wednesday June 24, 2020 @09:09AM (#60221558) Homepage Journal

    The issue here isn't the algorithm.

    It's that apparently nobody in the PD thought of comparing an actual picture of the guy with the picture of the culprit.

    • Did the NY Times run a comparison picture? I can't tell anymore - ever since their site began playing with scripting to try to block people accessing it with ad blockers, some photos don't show up and I can't tell if they're supposed to be ads, or photos in TFA.

      A quick google search brought up this photo [npr.org]. If that's all investigators had to go on, you can immediately see the problem. They were relying entirely on the algorithm because the suspect photo was of such poor quality that no human could ever hop
      • by AmiMoJo ( 196126 )

        The algorithm should just give an error message when the source image is too poor to give a reliable match, or when the probability of a match is less than 99.99%. If it gives a result, the cops will use it.
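
        A minimal sketch of that kind of gating, with made-up field names and illustrative thresholds rather than any real vendor's API: the matcher returns nothing at all unless both the probe image quality and the match confidence clear a bar, so a weak hit never reaches an investigator looking like a positive identification.

```python
# Hypothetical sketch of confidence/quality gating for face-match results.
# Field names and threshold values are illustrative only.

from dataclasses import dataclass
from typing import Optional

MIN_PROBE_QUALITY = 0.6       # illustrative, not a vendor setting
MIN_MATCH_CONFIDENCE = 0.9999

@dataclass
class MatchCandidate:
    subject_id: str
    confidence: float          # matcher's score, 0.0 - 1.0
    probe_quality: float       # quality score of the surveillance image, 0.0 - 1.0

def investigative_lead(candidate: MatchCandidate) -> Optional[str]:
    """Return a subject only when both image quality and confidence clear the bar."""
    if candidate.probe_quality < MIN_PROBE_QUALITY:
        return None  # effectively "error: source image too poor for a reliable match"
    if candidate.confidence < MIN_MATCH_CONFIDENCE:
        return None  # below threshold: report nothing rather than a weak hit
    return candidate.subject_id

# A blurry probe image yields no lead at all, regardless of the score.
print(investigative_lead(MatchCandidate("subject-123", confidence=0.97, probe_quality=0.4)))  # None
```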

    • The issue here isn't the algorithm.

      The issue here isn't only the algorithm.

    • by AmiMoJo ( 196126 )

      The officers who went to arrest the guy would have had to do the check themselves and, being very low down the hierarchy, would probably have dragged him in anyway, not wanting to deal with having to tell the detective that they got the wrong guy.

  • Axel Foley they aren't. No police investigation should rely on just one piece of evidence.
  • Need police reform (Score:5, Insightful)

    by Trailer Trash ( 60756 ) on Wednesday June 24, 2020 @09:13AM (#60221578) Homepage

    One of the reforms that I've mentioned before is making it a felony for officers to arrest someone without stating the cause for arrest at the time of the arrest. This is why. He had no reason to believe they were acting in good faith since they couldn't tell him why they were arresting him.

    I believe it should be a felony for all officers at the scene (yes, that's their last day in law enforcement) and should allow self-defense to kick in.

    I don't know what I'd do in a situation like that.

    • by Bert64 ( 520050 )

      I believe in some countries it's already a legal requirement to tell someone the specific crime they are being arrested for at the point of arrest.
      "I am arresting you on suspicion of XXXX" etc...

      • by OMBad ( 6965950 )
        It is in the US. The person in question was arrested, was told his rights, and was told what he was being arrested for. Seems less "outrageous" every time more facts are uncovered, doesn't it? Odd that the MSM doesn't want to put the facts in the article though, isn't it?
        • by ledow ( 319597 ) on Wednesday June 24, 2020 @09:48AM (#60221730) Homepage

          Arrest != Charge != Conviction.

          They can arrest someone on suspicion of anything, really. So long as that suspicion is not unreasonable. That's a question for a long court case.

          But you can be entirely innocent, arrested "on suspicion of" and basically there is no comeback. Whoops. Sorry. De-arrest. Which is as it should be (otherwise the number of people who walk free because of some process issue would be stupendous, or you wouldn't be allowed to arrest people that you really should be arresting).

          Arrest is an inconvenience. But once it's clear they have no grounds for a charge, they have to release you. It's when there really was no reasonable basis for the arrest (as may be the case here), or when the charges are clearly not supported by the evidence, that things are drastically wrong.

          Any other middle-ground really just results in a kind of anarchy, one way or another.

          • by nealric ( 3647765 ) on Wednesday June 24, 2020 @10:01AM (#60221804)

            But being arrested is often more than an "inconvenience." It can be a punishment in and of itself. If you or your family does not have access to bail money, you may spend time in jail (very much a punishment) -- sometimes a long time. A few years ago, a teenager in New York committed suicide after spending nearly two years in jail without even getting a trial for a crime he didn't commit.

            You may end up having to pay a bail bondsman to get out (the extremely high interest is not refundable). If you are a single parent or your spouse is out of town, your children may be traumatized by being taken away by child protective services without any idea what happened to you or knowing when you will return. You may be roughed up or injured in the process of being arrested. If you are in jail for a few days, you might lose your job. Employers generally aren't excited to hear you didn't show up to work because you are in jail.

            Then, even after charges are dropped, you will have an "arrest record" that will show up on background checks from potential employers. You will often have to pay money to have that record expunged even if the charges were dropped. Then, after all that, scammy sites all over the internet will post your mugshot so that anybody who googles your name will find it. They will offer to take that mugshot down for a "small" fee of a few hundred dollars (and of course, because there are many such sites, it will be a game of whack-a-mole).

            Getting arrested can be a life changing event even if you are never charged with a crime.

            • by Kjella ( 173770 )

              I don't think you're wrong, hell we know just a false accusation can ruin lives. That doesn't mean the police should refuse to investigate/search/detain/arrest suspects or the DA should not press charges unless it's a slam dunk conviction. While I'm sorry when innocents are caught up in that, we can't make the system such that anything we couldn't prove beyond a reasonable doubt becomes a demand for restitution. You should be able to get compensation for things that are obvious mistakes and incompetence lik

            • by Trailer Trash ( 60756 ) on Wednesday June 24, 2020 @03:53PM (#60223652) Homepage

              "But being arrested is often more than an "inconvenience." It can be a punishment in and of itself."

              Cops even have a saying for this: "You can beat the rap, but you can't beat the ride". The arrest itself is a punishment in many cases and they arrest people simply because they can't do anything else to harm them. Some cops truly are despicable.

          • Arrest != Charge != Conviction. They can arrest someone on suspicion of anything, really. So long as that suspicion is not unreasonable. That's a question for a long court case.

            But in this case, they arrested him without telling him why they were arresting him. And when they were asked by his wife where they were taking him, they replied "google it."

            This is not acceptable behavior; those policemen should be fired.

    • How does that relate to this story? In the summary it says the police had a felony warrant for larceny and presented the paperwork at the time of arrest. On that note: let us not forget the person who signed the warrant. They are supposed to be the check that all the ducks are in a row and they apparently are just rubber stamping this stuff.
    • One of the reforms that I've mentioned before is making it a felony for officers to arrest someone without stating the cause for arrest at the time of the arrest. This is why.

      And tell someone where they're being taken. We don't "disappear" people now, do we?

    • A felony, and grounds for all charges to be dismissed*.

  • by dmay34 ( 6770232 ) on Wednesday June 24, 2020 @09:19AM (#60221584)
    This isn't a case of a wrongful arrest due to an algorithm. It's a case of a wrongful arrest due to lazy (and racist) investigators. The algorithm didn't arrest him. Lazy police officers did. It's hardly the first time.
  • by scamper_22 ( 1073470 ) on Wednesday June 24, 2020 @09:24AM (#60221602)

    This has far less to do with the use of facial recognition and much more to do with process and officer behavior.

    False identification is just going to happen regardless of whether it is done by a computer or people.

    The story here is actually:
    1. The officer not giving the detention details to his wife when asked. That's just unprofessional and dehumanizing.

    2. Once they had the man in cuffs and arrested, they didn't take 2 minutes to look at the photos and compare him manually or gather other evidence. They also didn't give the wife a brief overview of what was going on. Again, dehumanizing. If it looked suspicious, they could just tell the man and his wife that. The same goes for keeping him for so long.

    As to the use of facial recognition. I wouldn't want to take that tool away from police. However, it is something that I think you need a certain level of accuracy for. Like I doubt police can use a radar gun for speeding if it's +- 30 mph. They need some level of accuracy, maybe within 1 mph.

    For facial recognition like this to be used, the company should need to prove a certain level of accuracy. Just based on reading the article, it looks like the system has a much lower accuracy rate for black people. So maybe such systems cannot be used for black people until they get their accuracy rate up to speed. It's never going to be perfect and you'll never subtype every single group of people. But hey if you're in Michigan, it should work properly on black people. Maybe it's not tuned for Inuit people, but Michigan has enough black people that it should be part of their testing.

    Again, just like radar guns. Suppose a radar gun gives worse readings on a car painted black as opposed to red or a car with curves vs straight lines. That would come up and procedures would be based on it.

    I actually don't like that big firms like Microsoft or IBM are not willing to sell to the police. As proven here, the police will simply go to other companies. In this case, a Japanese supplier. The big American firms, which are more aware of racial issues and have amazing technology, could really work to make the system work for all people and publish accuracy numbers for various races. If they can't get their black identification rate up, they can disable facial recognition for black people. Be responsible with the technology, because I guarantee you someone somewhere in the world is going to develop and sell it.
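
    A minimal sketch of the kind of disaggregated accuracy reporting the comment above asks vendors to publish (the data, group labels, and function are hypothetical): compute the false-match rate separately for each demographic group on a labeled test set, so gaps like the one reported for this system are visible before a department relies on it.

```python
# Hypothetical sketch: per-group false-match rate from labeled test results.
# Each result is (group, system_said_match, actually_same_person).
from collections import defaultdict

def false_match_rate_by_group(results):
    """Fraction of different-person comparisons the system wrongly called a match, per group."""
    non_mated = defaultdict(int)   # comparisons of two different people
    false_hits = defaultdict(int)  # different people the system called a match
    for group, said_match, same_person in results:
        if not same_person:
            non_mated[group] += 1
            if said_match:
                false_hits[group] += 1
    return {g: false_hits[g] / non_mated[g] for g in non_mated if non_mated[g]}

# If one group's false-match rate is several times another's, the parent's
# argument is that the tool shouldn't be relied on for that group yet.
print(false_match_rate_by_group([
    ("group_a", True, False), ("group_a", False, False), ("group_a", False, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]))  # {'group_a': 0.5, 'group_b': 0.666...}
```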

  • Facial recognition (Score:5, Interesting)

    by ledow ( 319597 ) on Wednesday June 24, 2020 @09:43AM (#60221686) Homepage

    It's why all the bullshit about the UK police's use of facial recognition is really not worth worrying about.

    To my knowledge, despite years and years and years of trials, including at major events and on public streets, nobody has been arrested on the basis of facial recognition alone.

    There have been people it's flagged as someone else. Who identified themselves as someone else. And happened to have something in their pockets that they shouldn't have when stopped. But it's never flagged someone who was then arrested for the thing they were flagged for.

    It's a load of nonsense and after years of effort and millions of pounds spent, it still produces far more false positives than anything useful, and hasn't resulted in any related arrests (I'm happy to be corrected, maybe it's got a few people, but certainly not enough to justify its use).

    Literally, you do better just stopping randomly selected people, in terms of arrestable offences. Enough people would have cannabis in their pocket or a blade, or whatever just at random than those you'll catch because of face recognition flagging.

    Give it up already. It's why I don't have any worries about it as a surveillance or identification tool. It's basically useless, and all the stats and experiments say so. Try to charge someone purely on the basis of facial recognition and things like this happen - and get thrown straight out of court, if they even get that far.

    It's crap. Give it up.

  • Am I the only one shocked that the officers couldn't even stop to give the wife the address of the precinct, but instead just told her to fucking GOOGLE IT?

    • Am I the only one shocked that the officers couldn't even stop to give the wife the address of the precinct, but instead just told her to fucking GOOGLE IT?

      I'm not sure why that's shocking to you. It's maybe mildly unprofessional, but depending on her behaviour at the time I could certainly understand it. I've seen plenty of people freaking out, screaming, yelling abuse, and at the same time demanding information. Anyone who behaves that way towards a police officer certainly shouldn't expect any courtesy in return.

      Of course we don't know exactly how the whole thing went down; it's possible that she was completely calm and being entirely pleasant towards th

  • The same could have happened without AI or algorithms if someone really looking like you did the misdeed.

    When you live in a country with a shitty judicial system, and you're part of an oppressed minority, why the fuck don't you make sure you've got an alibi for 100% of your time?

  • Refusing to tell him why he was arrested and telling his wife to google it exposes the height of their arrogance.

    Being too stupid to realise that face recognition is bullshit exposes their arrogance about their own incompetence.

    I have no idea why the US people put up with this nonsense. It is just beyond reason that not enough people want to live in a society free from state terrorism; fearful people are compliant people.

  • by rickyslashdot ( 2870609 ) on Wednesday June 24, 2020 @09:48AM (#60221732)

    Jerk. Where's the open-access link to the story, instead of it being buried behind a 'pay-me' link to the NYTimes!

  • by Way Smarter Than You ( 6157664 ) on Wednesday June 24, 2020 @09:51AM (#60221742)
    Did he waive his rights?

    Where the fuck was his lawyer?

    In a situation like that you NEVER talk to cops without your lawyer present. Anything you say can be twisted. Even a public defender is better than nothing.

    He could have spent years in prison until this got sorted out. If ever.
  • by jandoe ( 6400032 ) on Wednesday June 24, 2020 @10:03AM (#60221824)

    The justice system in the US has a history of abusing non-scientific methods to put people in jail. We've seen this already with lie detectors, hair or blood spatter analysis, and now again with AI. No one on the jury understands these methods, so just pay an 'expert' to say they are 100% sure it's a match and case closed.
    In normal countries judicial experts are called to testify by the judges, not by prosecutors or defense lawyers. If the judge needs some technical analysis he asks for an expert opinion. Someone is randomly chosen from a pre-approved list of experts and asked to analyse the evidence. There's no 'my expert vs your expert' nonsense. That's why in the US the FBI could lie for 20 years about how accurate their hair analysis methods were, while we didn't have similar large-scale scandals in Europe.

  • If I can't have the Source code then you must acquit.

  • by Nkwe ( 604125 ) on Wednesday June 24, 2020 @10:14AM (#60221904)
    According to the NPR article [npr.org], the lead report (which came from the facial recognition system) specifically said that the match was not probable cause for arrest. While this is a minor technology fail, it is a major police fail.
  • The insurance industry and the banks have been using lots of less than correct and biased information to set premiums and credit for years.
