Boston Votes To Ban Government Use of Facial Recognition (cnet.com) 51

Boston is now the largest city on the East Coast to ban facial recognition technology for municipal use. It joins cities such as San Francisco and Oakland in California, and Cambridge, Massachusetts. CNET reports: The ordinance passed unanimously and will prevent the capital city from using facial recognition technology or obtaining software for conducting surveillance using the technology. "Boston should not be using racially discriminatory technology and technology that threatens our basic rights," City Councilor Michelle Wu said at the hearing on Wednesday.

The ordinance comes with exceptions, like allowing city employees to use facial recognition for authentication purposes such as unlocking their own devices. City officials can also use facial recognition technology for automatically redacting faces in images, but they can't use it for identifying people. The ordinance was sponsored by City Councilor Ricardo Arroyo, along with Wu.
"This is a crucial victory for our privacy rights and for people like Robert Williams, who have been arrested for crimes they did not commit because of a technology law enforcement shouldn't be using," said Carol Rose, executive director of the ACLU of Massachusetts."Lawmakers nationwide should follow suit and immediately stop law enforcement use of this technology. This surveillance technology is dangerous when right, and dangerous when wrong."

Comments Filter:
  • Federal law supersedes State law, so now all surveillance in Boston will be federal surveillance.
    • Feds will have their own federal cameras installed alongside city's cameras?
      • by OMBad ( 6965950 )
        Most facial recognition services are provided by private companies like Palantir, NEC, etc. This is just the ACLU trying to stop the tide. You can do "facial recognition" with $100 in hardware. Too silly.
        • Oh, I forgot that this bullshit is apparently legal in the US. Oh, well.
          • by OMBad ( 6965950 )
            Errrr yeah...I hate to break it to ya again, but it isn't "illegal" where you live. It is widely deployed across the EU and Asia. Wow. For a supposed "tech site" the people here are woefully ignorant of what is going on.
            • Actually, no. It's not "widely deployed" where I live. For starters, for example the state would have to be willing to pay for it, which it isn't. As for illegality, recording the public if you're not a government entity *is* illegal.
              • by OMBad ( 6965950 )
                If you are living in Asia or the EU you are 100% incorrect. Amazing how ignorant people are here.
                • I live in an EU country, and again, no, it's not "widely deployed" where I live. Unless you're suggesting the existence of some secret "EU cameras" infiltrating my country.
      • by hey! ( 33014 )

        We know for a fact that the FBI uses hidden cameras in public places to watch people they consider national security threats because footage has been used as evidence in cases, and because of FBI court briefs arguing that footage from hidden cameras in public places should not be subject to FOIA requests.

    • Grim, but still an important win I think.

      Maybe if state-level bans become more frequent, they could establish a general distrust of such technology among the population and eventually lead to action at the federal level.

      Optimistic, perhaps naive... but I have hope!

      • by OMBad ( 6965950 )
        Distrust? Hate to break it to you, but "facial recognition" is already used by almost every American. Ever upload a picture to Facebook, or Google, or Amazon? They are run through facial recognition software. How do you think auto-tagging works?
        • True, but that is in line with my point. These small wins will hopefully help to set a standard of non-acceptance.

          This comment is also in line with the one I was responding to, which literally states that despite this, the tech already is or soon will be used anyway. I just hope this is part of a movement that eventually pushes it toward going away.

    • by hey! ( 33014 )

      Well, sure, but the devil's in the details. *Which* Federal law are we talking about here? It's not like anyone who works for the Federal Government can violate any local law he wants just because it makes his job easier.

    • Tell that to the pot store down the street from me.
    • Federal law is rarely pitted directly _against_ local law. Boston has a long history of resisting central authority, from the run-up to the Declaration of Independence in 1776 to, more recently, its role as a "sanctuary city" for illegal immigrants. Please note: referring to them as "illegal immigrants" is not a moral or racist judgment, it's merely a description of their legal status.

  • This isn't going to end racism, nor will it stop racists! It makes racists harder to find, because it was racists who used the technology wrong and thereby caused the problem. It sure isn't the machines doing it. So now we take away the technology, take the easy way out, look away, and hope the problem magically disappears with it.

    • by OMBad ( 6965950 )
      Racism is just the convenient excuse that the ACLU is using to push this through quickly. That is currently the easiest way to push through legislation. But they are doing it because they believe facial recognition is a threat to individual freedom (probably true).
      • But they are doing it because they believe facial recognition is a threat to individual freedom (probably true).

        No amount of cameras and identification systems is going to stop you from committing a crime. The freedom for an individual to become a criminal is at no risk here.

        • Oh? Body cameras have reduced police violence profoundly in districts that provide them. The evidence is often politically tinged, but the results seem promising.

          • It's not the cameras stopping them though. It's their own brains doing it.

          • by malkavian ( 9512 )

            What isn't certain is whether police violence has been reduced, or merely claims of police violence.
            Either way, it's a step in the right direction of accountability and an increased accuracy for both parties. As law is predicated on evidence, it's a decent addition to any case.

    • by malkavian ( 9512 )

      What racists? This is intended to act as a flag to police so they can use their scant resources to investigate where it seems there's a match.
      If the police chose to come in heavy-handed without sufficient evidence, and without a photofit matching the guy, they went off half-cocked and ended up with a clusterfuck. That's what happens when you don't use your senses properly. If the photofit was a good match (enough to warrant actually taking the guy in and investigating further), then this is analogous to

      • What racists?

        If you think this wasn't racism, then ask yourself: how many people need to be arrested and dragged to the police station before it becomes racism?

        It doesn't become racism only after it happens to several thousand people. It's racism when it happens to one.

  • This is how the Butlerian Jihad began.
  • by rsilvergun ( 571051 ) on Wednesday June 24, 2020 @08:04PM (#60224662)
    I've got More Than a Feeling that it will.

    I'll just see myself out...
  • by Anonymous Coward

    But I thought that facial recognition technology has been demonstrated to be less reliable when used on images of minorities. So white folks would tend to be identified (and apprehended on outstanding warrants) more reliably.

    So yeah. This technology discriminates against white people. Ban it.

  • by istartedi ( 132515 ) on Wednesday June 24, 2020 @10:14PM (#60224954) Journal

    Apparently, they had More Than a Feeling [youtube.com] that this power might be abused.

  • Facial recognition is already almost obsolete. The Chinese are already testing a system that recognizes people based on body metrics and movement. Wear all the masks and hoodies you want; soon it won't matter. You can be identified by the ratio of your limbs, the angle of your back, the way you strike your heel and roll your foot, and so on. There won't be any anonymity. Crying about it and passing laws about whether gummint can see your nose or the color of your eyes is pointless.
  • If the law is good it should apply to EVERYONE, including "city employees", law enforcement, EVERYONE. Equally. Just because you got hired for a job doesn't mean you should get to do things that are bad.

    This starts from the top - Congress and the new president of the US should get the same healthcare and health insurance as the veterans get... as the rest of us get. Defund the ACA and call it "Obamacare"... fine... so long as it affects you like it affects US.

    Boston - good start, but the carve out is una

  • The city can't use it, but if the data is saved a higher level of government can still subpoena the raw video and then apply their own software to it.

    Overall though this seems like an odd time to be proposing these things. I get the general idea behind it, but given the recent rioting activity - which has caused millions of dollars in property damage - the immediate suspicion of motive behind this law is that you're just trying to prevent identification and prosecution of those who participated in the riot

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...