Microsoft Turned Down Facial-Recognition Sales On Human Rights Concerns (reuters.com)
An anonymous reader quotes a report from Reuters: Microsoft recently rejected a California law enforcement agency's request to install facial recognition technology in officers' cars and body cameras due to human rights concerns, company President Brad Smith said on Tuesday. Microsoft concluded it would lead to innocent women and minorities being disproportionately held for questioning because the artificial intelligence has been trained on mostly white and male pictures. AI has more cases of mistaken identity with women and minorities, multiple research projects have found.
On the other hand, Microsoft did agree to provide the technology to an American prison, after the company concluded that the environment would be limited and that it would improve safety inside the unnamed institution. Smith explained the decisions as part of a commitment to human rights that he said was increasingly critical as rapid technological advances empower governments to conduct blanket surveillance, deploy autonomous weapons and take other steps that might prove impossible to reverse. Smith also said at a Stanford University conference that Microsoft had declined a deal to install facial recognition on cameras blanketing the capital city of an unnamed country that the nonprofit Freedom House had deemed not free. Smith said it would have suppressed freedom of assembly there.
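To make the failure mode in the summary concrete, here is a minimal, purely synthetic sketch (Python; this is not Microsoft's system, and the helper make_group_data is invented for illustration). A classifier fitted to a 20:1 skewed training sample scores noticeably worse on the underrepresented group, which is the mechanism behind the mistaken-identity findings the article cites.

```python
# Synthetic illustration of training-set skew, not any real system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group_data(n, shift):
    """n points per class, with a group-specific feature shift."""
    x0 = rng.normal(loc=0.0 + shift, scale=1.0, size=(n, 5))
    x1 = rng.normal(loc=1.0 + shift, scale=1.0, size=(n, 5))
    return np.vstack([x0, x1]), np.array([0] * n + [1] * n)

# Group A outnumbers group B 20:1 in training, mimicking a photo
# corpus that is mostly one demographic.
Xa, ya = make_group_data(2000, shift=0.0)   # overrepresented group
Xb, yb = make_group_data(100, shift=1.5)    # underrepresented group
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Evaluate on fresh, equal-sized test sets for each group.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    Xt, yt = make_group_data(1000, shift)
    print(name, "accuracy:", round(model.score(Xt, yt), 3))
```

On a typical run the dominant group scores in the high 0.8s while the underrepresented group hovers near chance: the model simply never learned where that group's decision boundary lies.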
important considerations for all skin tones (Score:2, Informative)
"Smith explained the decisions as part of a commitment to human rights that he said was increasingly critical as rapid technological advances empower governments to conduct blanket surveillance, deploy autonomous weapons and take other steps that might prove impossible to reverse."
Yet again, Windbourne is retarded AF (Score:1)
NOBODY is concerned about there being freedoms in China, BECAUSE THERE AREN'T. Protecting the US public from these things is all this does, and you think Microsoft is helping the Chinese military prepare to fight a war, lol?
You are retarded.
Re: (Score:1)
NOBODY is concerned about there being freedoms in China, BECAUSE THERE AREN'T.
Irrelevant. It's software; once it's developed, it can be copied, pasted, and have a few strings changed to make it work anywhere.
Re: (Score:3)
Seriously, this is funny. MS and Google pretend to care about human rights here, but then in China, they will happily help Chinese gov murder their own...
How, where and when?
Bullshit, did you RTFA? (Score:3)
You are 100% correct, except for MS. FTFA: "Microsoft had declined a deal to install facial recognition on cameras blanketing the capital city of an unnamed country that the nonprofit Freedom House had deemed not free."
I mean, Cuba, Iran and NK aren't even options due to sanctions, so that pretty much means China.
Good. (Score:3)
I don't have a lot of positive things to say about Microsoft, but this is something they correctly evaluated. It may have been purely because they foresaw the awful PR that would follow, but it was still a good call.
Re: (Score:2)
Re: (Score:1)
There are four cases.
1. Facial recognition correctly identifies person as non-criminal. End of problem.
2. Facial recognition correctly identifies person as criminal. End of problem.
3. Facial recognition INcorrectly identifies person as criminal. Person uses normal identification on the spot to correct problem. Again, end of problem.
4. Facial recognition INcorrectly identifies person as criminal but they have no way to prove who they are on the spot, and the mistake only gets sorted out at the station. (See the sketch below for how often cases 3 and 4 actually come up.)
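The four cases above are not equally likely, and a quick base-rate calculation shows why cases 3 and 4 dominate in any mass-scanning deployment. All numbers below are assumed for illustration, not taken from the article.

```python
# Back-of-the-envelope base-rate check; every figure here is assumed.
scanned     = 1_000_000   # faces checked by the cameras
criminals   = 100         # actually wanted people in that population
hit_rate    = 0.95        # P(flagged | wanted)
false_alarm = 0.01        # P(flagged | innocent), a 1% false-positive rate

true_hits  = criminals * hit_rate                  # ~95
false_hits = (scanned - criminals) * false_alarm   # ~10,000
precision  = true_hits / (true_hits + false_hits)
print(f"true hits: {true_hits:.0f}, false hits: {false_hits:.0f}")
print(f"chance a flagged person is actually wanted: {precision:.1%}")  # ~0.9%
```

Even with generous accuracy assumptions, more than 99% of the people flagged are innocent, so the "end of problem" in case 3 plays out thousands of times for every genuine hit in case 2.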
Re: (Score:2)
Bollocks. I've already had this problem with a plate reader where the back-end system had my license incorrectly flagged as suspended.
You showed your current registration and the problem was solved, yes? How exactly is that different from a human calling in your plate and being told it was suspended?
You're totally ignoring the implications of incorrect data, the security of that data, and the constitutional impacts such as the right to travel.
I covered the implications of a false positive, which is no different from a false positive generated by today's manual checks.
This has nothing to do with the "right to travel".
Microsoft Turned Down Facial-Recognition Sales (Score:3, Insightful)
FTFY.
Not human rights (Score:2)
Turning down a sale because they are concerned about the technology's accuracy is not turning it down over human rights. If it were a human rights concern, they'd have refused because they feared law enforcement would misuse it.
Re: (Score:1)
I'm not arguing that this technology is not rife with the possibility of easily doing something heinous with it. I'm totally on board with that. I'm arguing that, in this case, Microsoft refusing to sell it because it didn't work properly is not a rights issue. It's a functional issue. I'd be impressed if they refused to sell it to law enforcement, not because it didn't work, but because it lowers the activation energy required for abuse so much that it essentially is a catalyst for it. Microsoft should be refusing on those grounds.
So they admit they are racist misogynists (Score:1)
Microsoft concluded it would lead to innocent women and minorities being disproportionately held for questioning because the artificial intelligence has been trained on mostly white and male pictures.
So that naturally leads to the question, why the heck are they training these things with mostly white and male pictures? Couldn't find any pictures of women on the internet?
Re: (Score:2)
Microsoft concluded it would lead to innocent women and minorities being disproportionately held for questioning because the artificial intelligence has been trained on mostly white and male pictures.
So that naturally leads to the question, why the heck are they training these things with mostly white and male pictures? Couldn't find any pictures of women on the internet?
Well, it's a California law enforcement contract, and, ahem, clearly [occidentaldissent.com], those damn white males are the target market in a "round up the usual suspects" sweep.
I bet Facebook didn't say no (Score:1)
Zuck will do anything to make a buck. Especially if it involves exploiting others.
Not everyone in prison has been convicted (Score:2)
This is still a violation of human rights and privacy rights.
Train them on women and people of color. (Score:2)
Look, when you identify the source of a problem, you don't ignore it, you don't cease using the technology, you FIX IT.
This is not a new issue. Ten years ago it was so widely known that a TV show made fun of it (Better Off Ted, Season 1, Episode 4, "Racial Sensitivity" [imdb.com]).
So if we have not sufficiently trained AI to detect people of color and women, then start TRAINING them to do it (a rebalancing sketch follows below).
Or you could just stop trying to track everyone without a warrant.
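For what the comment's fix might look like in practice, here is a minimal rebalancing sketch (Python; the rebalance helper is invented for illustration, and a real system would rebalance an image corpus, not toy arrays): oversample the underrepresented groups until every group appears equally often in training.

```python
import numpy as np

def rebalance(X, groups, seed=0):
    """Resample rows of X so every group label appears equally often."""
    rng = np.random.default_rng(seed)
    labels, counts = np.unique(groups, return_counts=True)
    target = counts.max()
    picks = []
    for label in labels:
        idx = np.flatnonzero(groups == label)
        if idx.size < target:
            # Oversample smaller groups, with replacement, up to the
            # size of the largest group.
            idx = rng.choice(idx, size=target, replace=True)
        picks.append(idx)
    sel = np.concatenate(picks)
    return X[sel], groups[sel]

# Example: 900 photos of group A, 100 of group B -> 900 of each.
X = np.arange(1000).reshape(-1, 1)           # stand-in for image features
groups = np.array(["A"] * 900 + ["B"] * 100)
Xb, gb = rebalance(X, groups)
print({g: int((gb == g).sum()) for g in ("A", "B")})  # {'A': 900, 'B': 900}
```

The caveat: naive oversampling just repeats the same faces, so the honest version of this fix is collecting a genuinely representative corpus, which is exactly what the comment is asking for.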
Re: (Score:2)
1. According to the original article, "AI has more cases of mistaken identity with women and minorities, multiple research projects have found." So yes, they do have some evidence for it.
2. The officers in the field would be likely to arrest, detain, or question anyone the system flagged, yes.
3. It is highly probable that the users (read: police officers) would be incapable of evaluating the effectiveness/correctness of the system. The agency involved was asking for installation in the real world (officers' cars and body cameras), where bad matches are hardest to audit.
How to make great facial recognition technology (Score:1)
2. Talk to police, mil, property owners, and governments in Africa, Asia, South America about the advanced tech they can get for "free".
3. Install the hardware like with the Domain Awareness System in Lower Manhattan. https://en.wikipedia.org/wiki/... [wikipedia.org]
4. Work as a private, public, gov partnership in Africa, Asia, South America.
5. Gather the global math needed to detect all the variation in the human face globally.
6. At an airport, port, bus station, rail station and on every street corner, deploy the now fully trained system back home.
We really need penis-recognition software. (Score:2)
This kind of hurts... (Score:2)
When a major corporation cares more about civil rights than law enforcement, that points to serious problems.
I tend to assume that "bad cops" are just isolated incidents that get a lot of attention, but this may force me to reconsider my assumptions.
Bake the cake! (Score:1)