
Live Facial Recognition Is Coming To US Police Body Cameras (medium.com) 50

Wolfcom, a company that makes technology for police, is pitching body cameras with live facial recognition to law enforcement groups across the United States, OneZero reported Thursday. From a report: It's a move that pushes against industry norms: Axon, the largest manufacturer of body cameras in the United States, declared last year that it would not put the invasive technology in its hardware, citing "serious ethical concerns." NEC, which sells live facial recognition elsewhere in the world, has also not sold it to U.S. law enforcement. Wolfcom claims to have sold body cameras to at least 1,500 police departments, universities, and federal organizations across the country. It has been developing live facial recognition for the Halo, Wolfcom's newest body camera model, according to documents and a video obtained by OneZero through public records requests. This new initiative makes Wolfcom the first major body camera provider in the United States to pursue live facial recognition, a controversial stance given a nationwide push from privacy advocates to ban the technology.
This discussion has been archived. No new comments can be posted.

  • Outlaw it (Score:5, Insightful)

    by bluefoxlucid ( 723572 ) on Thursday March 05, 2020 @01:22PM (#59799942) Homepage Journal
    Ban facial recognition now.
    • Re: (Score:2, Insightful)

      While I agree, I think it would be naive to think it'll be banned. The powers that be want all surveillance all the time, live tracking on every citizen (except the 'chosen ones' i.e. rich), and complete accounts of all activities among the citizens. And they won't stop until they have it.

      What us commoners want is no longer any of their concern.

      • But maybe this would entice police to actually have their body cameras on?

        Pretty hard to say "the camera told me the suspect was X, but we have no footage" (except, of course: we lost it / it wasn't recording / we destroyed it).

        I'd rather have full surveillance of the police than no surveillance of the police, personally. Of course I'd prefer data retention laws that make sense, etc. as well. (that probably won't happen). And I'd prefer if the police were actually on the hook for behaving badly.

        Probably the only part t

        • Cop: "Oh sorry I killed that black kid. My body camera told me he was sex offender/terrorist."
          Judge: "Fine. Case dismissed."
      • Aside from developing a "Scramble Suit" [wikipedia.org]....what can be done to thwart facial recognition?

        Will wearing hats with high-intensity infrared LEDs help blind these types of cameras?

      • While I agree, I think it would be naive to think it'll be banned. The powers that be want all surveillance all the time, live tracking on every citizen (except the 'chosen ones' i.e. rich), and complete accounts of all activities among the citizens. And they won't stop until they have it.

        What us commoners want is no longer any of their concern.

        The live tracking of the haves is infinitely more valuable intelligence than that of the commons.

        Do you believe the American President's fealty to Comrade Putin is random or inspired?

    • by AHuxley ( 892839 )
      Why? The face has to be in the system to be of use... i.e., a criminal's.
    • Hard to ban facial recognition, because it doesn't look any different than a normal surveillance camera from the outside. No practical way to enforce the ban. And it's guaranteed that the worst among us, the gestapo and various megacorps, would enjoy secret exemption from that law.

      Better to ban all surveillance cameras facing public spaces (including POPS). Yes that's a pretty extreme move - but totalitarianism is an extreme threat to our democracy. Easy to enforce, applies equally to everyone.

      Either everyo

  • by sandbagger ( 654585 ) on Thursday March 05, 2020 @01:30PM (#59799990)

    Say the camera generates a false positive: how will the cops on the beat handle it? Tail the suspect, get a second image taken, then try to confirm his identity? An immediate rugby tackle?

    I am also interested in how they report or even calculate false negatives, where a criminal walks in front and is not identified. Very difficult to calculate unless you're claiming it magically works perfectly. Of course, being able to get a handle on that number will be critical to understanding whether it's worth the expense, never mind the impact upon civil liberties.
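    The false-positive worry above is mostly a base-rate problem: when genuine targets are rare, even a matcher with seemingly good accuracy produces mostly wrong alerts. A rough sketch, with entirely made-up numbers (no vendor has published real figures):

    ```python
    # Back-of-the-envelope base-rate arithmetic for a face-matching system.
    # Every number here is hypothetical, chosen only to illustrate the effect.

    population_scanned = 100_000   # faces the camera checks in a day
    wanted_fraction = 0.0001       # 1 in 10,000 scanned people is actually wanted
    sensitivity = 0.99             # true-positive rate (recall) of the matcher
    false_positive_rate = 0.01     # 1% of innocent faces trigger a false match

    actually_wanted = population_scanned * wanted_fraction   # 10 people
    true_positives = actually_wanted * sensitivity           # 9.9 correct hits
    innocent = population_scanned - actually_wanted          # 99,990 people
    false_positives = innocent * false_positive_rate         # 999.9 false hits

    # Precision: of all alerts raised, what fraction point at the right person?
    precision = true_positives / (true_positives + false_positives)
    print(f"Alerts per day: {true_positives + false_positives:.0f}")
    print(f"Fraction of alerts that are correct: {precision:.1%}")
    ```

    With those assumptions, roughly 99 out of every 100 alerts would be mistaken identifications, which is exactly why the undisclosed false-positive rate matters more than any headline accuracy figure.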

    • nevermind the impact upon civil liberties

      I wonder if the same people who were clamoring for all police to have body cameras only a few years ago are now waking up to the terrible realization of exactly what it entails.

      Realistically, I think the only sane moves are to universally ban body cameras or for everyone to have body cameras. If the police footage happens to go missing, at least you'll have your own. Well that or get to work on a scramble suit.

    • by Rick Schumann ( 4662797 ) on Thursday March 05, 2020 @01:43PM (#59800056) Journal
      The same way most police have always handled cases of mistaken identity: by arresting you, abusing you, tossing you into a holding cell, interrogating you and treating you like you're already a convicted felon, wrecking your entire day (or several days), perhaps wrecking your reputation, then when they have it jammed down their throats by a lawyer that they fucked up, tossing you back out onto the street without so much as an apology.
      Nothing good will come of cops having live facial recognition cameras on them all the time. Nothing.
      • I was arrested when I was 16 because I looked almost like this other kid's twin; they had a photo, and even I had to admit we looked alike. They didn't release me until after they were able to identify me, when my mother showed up ready to go to war.

        Then again white male average height, average weight/build, brown eyes, brown hair, common haircut, blue jeans, t-shirt, no tattoos, scars, or distinguishing features.... I have no idea how they find people by descriptions.

        • I was arrested when I was 16 because I looked almost like this other kids twin and they had a photo even I had to admit we looked alike. ...
          I have no idea how they find people by descriptions.

          Well you should, you've experienced it; they detain whoever is nearby and looks similar, and they keep doing it until they find the person or give up. (they rarely find the person they're looking for until the person gets stopped for some other reason, except in cases where they find the person at their home or workplace)

        • by Holi ( 250190 ) on Thursday March 05, 2020 @03:28PM (#59800468)
          I was arrested as a teenager after being mistaken for a white supremacist who had hospitalized a Jewish lawyer. I had a 2-foot-tall mohawk at the time. I was thrown in jail and had my picture taken, and when they came back they said they had shown it to the lawyer in the hospital and he ID'd me. Several days later, after I was released, they caught the guy; he was a skinhead and looked nothing like me.
          • I was arrested as a teenager after being mistaken for a white supremacist who had hospitalized a Jewish lawyer. [...] they caught the guy, he was a skinhead and looked nothing like me.

            Ironically, this is one of the areas where computer facial recognition would actually help. Eyewitness identification is notoriously inaccurate [scientificamerican.com]. And there have been experiments concocted which result in witnesses misidentifying an innocent bystander as the criminal. If the police have an APB out on a suspect that they have

      • by Ungrounded Lightning ( 62228 ) on Thursday March 05, 2020 @02:27PM (#59800238) Journal

        ... then when they have it jammed down their throats by a lawyer that they fucked up, tossing you back out onto the street without so much as an apology.

        You forgot rinse and repeat.

        If the system mistakes you for some felon, it will CONTINUE to do so. So this will happen over and over.

        A woman I know well has personal experience with something similar. When she was in college, a prostitute was done in by a serial killer. That girl was missing, so the cops were looking for her. She looked enough like the missing girl that a policeman picked her up, tried "cop trick questions", convinced himself that he'd found the missing girl, and added her name to the girl's records as an alias.

        For decades police who, for instance, did a name check as she was passing through a heavily patrolled neighborhood, would find that and generally treat her like they do other officially not-nice people.

        Eventually she found out what had happened and had the records purged. But the cops have many interconnected databases, and to their automation the disappearance of the record looked like the cleaned database just hadn't heard about it yet, so it would helpfully re-add it. The record kept reappearing.

        This went on for decades after they'd found the body, convicted the serial killer, and a true-crime book had been written about it. She took to carrying that book around in her car, to show any policeman who started grumping at her after calling in her name in a "who's that? Let's check the license" traffic stop.

        Eventually, when she got a concealed weapons permit (requiring an FBI check), the FBI responded to her warning about this by adding a "Do not confuse with ..." notation to THEIR files. Once this had propagated, the harassment from the fouled identifications went way down. But it still turns up occasionally.

        Facial recognition creates the same opportunity. Also, facial recognition programs often make far more mistakes on racial groups different from those who programmed them. (Sort of an automated version of "They all look alike to me.") So there's lots of opportunity not just for personalized harassment of individuals, especially twins or family members of real fugitives, but for racial discrimination as well.

        • As someone who works in biometrics, your point on "racial groups different from those who programmed them" is true, but also, in a way, worse than that. Dark surfaces simply reflect less light than light surfaces. Particularly in low-light scenarios, this makes it harder for many algorithms to recognize features.

          While I would disagree with putting the label racist on the author or the algorithm, the effect is real: any algorithm will have weaknesses and perform differently for c
        • Yes, agree with everything you're saying, and adding this: because it's a machine, and "machines don't lie", and "machines don't make mistakes", there'll be even less room to argue a mistake has been made.
      • The same way most police have always handled cases of mistaken identity: by arresting you, abusing you, tossing you into a holding cell, interrogating you and treating you like you're already a convicted felon, wrecking your entire day (or several days), perhaps wrecking your reputation, then when they have it jammed down their throats by a lawyer that they fucked up, tossing you back out onto the street without so much as an apology.

        Reminds me of watching the TV show Cops 20-30 years back. The cops arrested people who clearly did nothing wrong, saying on camera "let the judge figure it out". Meanwhile the poor schmuck is sitting in jail, and has spent umpteen thousand $$$ on a lawyer only to finally get in front of a judge, who says "why are you here? Go away".

        That whole "Innocent until proven guilty" flies out the window when you need a few thousand $$$ to go before a judge.

        Oh, you got arrested and don't think you need a lawy

    • Facial recognition doesn't work when everyone is wearing masks for fear of the latest pandemic. Sounds like a waste of taxpayer money to me.

  • With every cop in the country wearing a camera that's transmitting live all the time to some server farm somewhere, where did you think all that wireless bandwidth was going to come from? 5G isn't for consumers, it isn't for 'Internet of Things' sensors in farmers' fields like someone around here claimed, it's for more surveillance of citizens in general, and now we see the mechanism by which that will be implemented.
  • Once they have the pattern recognition on board, they should add weapon recognition (concealed or otherwise).

  • Right now, when police look at people, they have inherent bias because of how you dress, how poor you look, or the color of your skin.

    But if you get a bit of data on everyone you see - name, any kind of known criminal history - suddenly you see everyone around you very differently. You no longer rely on those crutches of pre-conceptions, because you have actual information on who is a potential problem and who is not.

    Furthermore, just the fact that when you see someone you see a name, brings an additional

  • Maybe it will result in fewer cops shooting innocent people by mistake. It’s not as good as the police changing their policies on needlessly shooting people.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Maybe it will result in fewer cops shooting innocent people by mistake. It’s not as good as the police changing their policies on needlessly shooting people.

      But this will result in greater confidence in mistaken identifications.
      "Looks like the guy, let's check" versus "That's the guy, shoot to kill."

    • by Aighearach ( 97333 ) on Thursday March 05, 2020 @03:12PM (#59800428)

      Maybe it will result in fewer cops shooting innocent people by mistake. It’s not as good as the police changing their policies on needlessly shooting people.

      Or more likely, it will result in cops shooting more innocent people, because they feel more sure that the person in front of them is a Bad Person.

    • by Agripa ( 139780 )

      Maybe it will result in fewer cops shooting innocent people by mistake. It’s not as good as the police changing their policies on needlessly shooting people.

      False positives will increase the number of shootings simply by increasing the number of encounters.

  • by lamer01 ( 1097759 ) on Thursday March 05, 2020 @02:45PM (#59800302)
    So, we can have all our civil liberties infringed upon, but we can't have a recording when the police abuse their power or screw up? Maybe these cameras should send the video feed live to headquarters and have a superior officer approve/disapprove escalation of force.
  • My father and his brothers all had the same taste in women. Family pictures show we have similar coloring and facial features. Some of them are career criminals. I would rather not have CrimFinder Beta getting me tackled and carted off. It gets worse for those with darker skin: they get less resolution from the camera and more false positives. Terrible idea.
  • I guess I might be alright with this...as long as we're allowed to use similarly invasive data-gathering methods against police officers, politicians and their associates. And of course, it would be necessary to archive and cross-reference pictures, names, locations, etc. for use in relevant civil rights matters.

  • Today, the company is based in a small, out-of-the-way office in Pasadena, California,
    and claims to have sold more than 1,000,000
    body cameras to 1,500 agencies in more than 35 countries.

    There are still unanswered questions about Wolfcom's live facial recognition software.
    The company has made no claims about how accurate its software actually is
    or what databases it accesses to match suspects.

    The company has that "scam" feel about it. Let's see their real sales figures.
