Boston Votes To Ban Government Use of Facial Recognition (cnet.com)
Boston is now the largest city on the East Coast to ban facial recognition technology for municipal use. It joins cities such as San Francisco and Oakland, California, and Cambridge, Massachusetts. CNET reports: The ordinance passed unanimously and will prevent the capital city from using facial recognition technology or obtaining software for conducting surveillance with it. "Boston should not be using racially discriminatory technology and technology that threatens our basic rights," City Councilor Michelle Wu said at the hearing on Wednesday.
The ordinance comes with exceptions, like allowing city employees to use facial recognition for authentication purposes, such as unlocking their own devices. City officials can also use the technology to automatically redact faces in images, but they can't use it to identify people. The ordinance was sponsored by City Councilor Ricardo Arroyo, along with Wu. "This is a crucial victory for our privacy rights and for people like Robert Williams, who have been arrested for crimes they did not commit because of a technology law enforcement shouldn't be using," said Carol Rose, executive director of the ACLU of Massachusetts. "Lawmakers nationwide should follow suit and immediately stop law enforcement use of this technology. This surveillance technology is dangerous when right, and dangerous when wrong."
Federal law.... (Score:2)
Re: (Score:1)
Aww. You tried.
Re: (Score:2)
We know for a fact that the FBI uses hidden cameras in public places to watch people they consider national security threats because footage has been used as evidence in cases, and because of FBI court briefs arguing that footage from hidden cameras in public places should not be subject to FOIA requests.
Re: Federal law.... (Score:3, Insightful)
Grim, but still an important win I think.
Maybe if state and city bans become more frequent, they could foster a general distrust of the technology among the public and eventually lead to a ban at the federal level.
Optimistic, perhaps naive... but I have hope!
Re: Federal law.... (Score:2)
It's probably telling that you had to use a 1 instead of an i, and probably telling of the rest of us and our efforts...
I do wish you a wonderful night, however, and would also add that many states should offer adult high school. I know many provinces here in Canada do.
Re: (Score:2)
True, but that is in line with my point. These small wins will hopefully help to establish a standard of non-acceptance.
This is also in line with the comment I was responding to, which stated that despite the ban, the tech is already in use or soon will be anyway. I just hope this is part of a movement that eventually makes it go away.
Re: (Score:2)
Well, sure, but the devil's in the details. *Which* Federal law are we talking about here? It's not like anyone who works for the Federal Government can violate any local law he wants just because it makes his job easier.
Re: (Score:3)
Federal law is rarely pitted directly _against_ local law. Boston has a long history of resisting central authority, going back to the Boston Tea Party in 1773, and more recently as a "sanctuary city" for illegal immigrants. Please note: referring to them as "illegal immigrants" is not a moral or racist judgment, it's merely a description of their legal status.
Re: (Score:2)
That's a relief. How about recognizing a co-worker? Can you still say "Hi, Karl" in the morning, or will you have to explain how you knew the name, and whether you, perchance, used the forbidden method of face recognition?
Ungh... I think we need to at least have an adult discussion about the differences between a computer (which doesn't need sleep) and a person recognizing another person. Yes, a person can remember another by their face, but that isn't the same as sticking cameras on every street corner and systematically recording all movements of all citizens. The way our memories work, you could honestly recall being three blocks away from where you actually were while testifying about a 30-minute period, and be called a liar for it.
Re: (Score:2)
Phew...
How is this wonderful feature a disqualifying one? Or would you ban cops from using e-mail, for example, on account of it never needing sleep? "No, you filthy pigs, if you really want to communicate with each other, do it the old-fashioned way, via mail carriers, who'll only work 9 to 5?"
Re: (Score:2)
I don't want to detract from your points, as I think they are important.
Specifically, my concern is that cases could be built against people who shouldn't have cases built against them. If you decide to buy a hot dog on a street corner and think nothing of it as part of a normal day, should you be tasked with remembering every single detail in case you're later implicated in a crime?
I think you could get me on all sorts of technical edge cases, but I'm not responding with specific accuracy, I'm just sa
Re: (Score:1)
I don't see how this question is relevant... And the answer would be the same regardless of whether it was a camera that saw you or a live policeman.
It probably shouldn't be, but this too is not specific to
Re: (Score:3)
Yes, computer memories are perfect, even if their analytics aren't (yet). Is that a bad thing?
No. Initially, though, it's important to take into account that the data may be flawed, and to be responsible enough to report it as such... especially if the results conform perfectly to your previously settled belief set.
TLDR: Listen to opinions and facts that upset what you, perhaps long ago, settled on as your truth. If, after a good listen, you're able to adapt your position based on new evidence, you're not part of the problem... but you owe it to yourself to convey that revelation to those le
That was the easy way (Score:2)
This isn't going to end racism, nor will it stop racists! It makes racists harder to find, because it was racists misusing the technology who caused the problem in the first place. It sure isn't the machines doing it. So now we take away the technology, take the easy way out, look away, and hope the problem magically disappears with it.
Re: (Score:2)
But they are doing it because they believe facial recognition is a threat to individual freedom (probably true).
No amount of cameras and identification systems is going to stop you from committing a crime. The freedom for an individual to become a criminal is at no risk here.
Re: (Score:2)
Oh? Body cameras have reduced police violence profoundly in districts that provide them. The evidence is often politically tinged, but the results seem promising.
Re: (Score:2)
It's not the cameras stopping them though. It's their own brains doing it.
Re: (Score:3)
What isn't certain is whether police violence has been reduced, or merely claims of police violence.
Either way, it's a step in the right direction of accountability and an increased accuracy for both parties. As law is predicated on evidence, it's a decent addition to any case.
Re: (Score:2)
What racists? This is intended to act as a flag to police so they can use their scant resources to investigate where it seems there's a match.
If the police chose to come in heavy-handed without sufficient evidence, and without a photofit matching the guy, they went off half-cocked and ended up with a clusterfuck. That's what happens when you don't use your senses properly. If the photofit was a good match (enough to warrant actually taking the guy in and investigating further), then this is analogous to
Re: (Score:2)
What racists?
If you think this wasn't racism, then ask yourself: how many people need to be arrested and dragged to the police station before it becomes racism?
It doesn't become racism only after it happens to several thousand people. It's racism when it happens to one.
And so it begins... (Score:1)
I hope it gives them Peace of Mind (Score:3)
I'll just see myself out...
Racially discriminatory technology (Score:1)
But I thought that facial recognition technology has been demonstrated to be less reliable when used on images of minorities. So white folks would tend to be identified (and apprehended on outstanding warrants) more reliably.
So yeah. This technology discriminates against white people. Ban it.
They had more than a feeling... (Score:3)
Apparently, they had More Than a Feeling [youtube.com] that this power might be abused.
So last year... (Score:1)
Enough with the carve-outs! (Score:1)
If the law is good it should apply to EVERYONE, including "city employees", law enforcement, EVERYONE. Equally. Just because you got hired for a job doesn't mean you should get to do things that are bad.
This starts from the top - Congress and the new president of the US should get the same healthcare and health insurance as the veterans get... as the rest of us get. Defund the ACA and call it "Obamacare"... fine... so long as it affects you like it affects US.
Boston - good start, but the carve out is una
City (Score:2)
The city can't use it, but if the data is saved a higher level of government can still subpoena the raw video and then apply their own software to it.
Overall though this seems like an odd time to be proposing these things. I get the general idea behind it, but given the recent rioting activity - which has caused millions of dollars in property damage - the immediate suspicion of motive behind this law is that you're just trying to prevent identification and prosecution of those who participated in the riot