
Microsoft Turned Down Facial-Recognition Sales On Human Rights Concerns (reuters.com)

An anonymous reader quotes a report from Reuters: Microsoft recently rejected a California law enforcement agency's request to install facial recognition technology in officers' cars and body cameras, citing human rights concerns, company President Brad Smith said on Tuesday. Microsoft concluded the deployment would lead to innocent women and minorities being disproportionately held for questioning, because the artificial intelligence had been trained mostly on pictures of white men. Multiple research projects have found that facial-recognition AI misidentifies women and minorities more often.

On the other hand, Microsoft did agree to provide the technology to an American prison, after the company concluded that the environment would be limited and that it would improve safety inside the unnamed institution. Smith explained the decisions as part of a commitment to human rights that he said was increasingly critical as rapid technological advances empower governments to conduct blanket surveillance, deploy autonomous weapons and take other steps that might prove impossible to reverse.
Smith also said at a Stanford University conference that Microsoft had declined a deal to install facial recognition on cameras blanketing the capital city of an unnamed country that the nonprofit Freedom House had deemed not free. Smith said it would have suppressed freedom of assembly there.
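
The disparity described above is usually quantified as a per-group false match rate: how often the system wrongly declares two different people to be a match. As a rough illustration only (the scores, threshold, and group names below are invented, and this is not Microsoft's system or data), a minimal sketch in Python:

    # Hypothetical sketch: per-group false match rate (FMR) of a face matcher.
    # All scores, thresholds, and group names are invented for illustration.
    from collections import defaultdict

    def false_match_rate_by_group(trials, threshold=0.8):
        """trials: iterable of (similarity_score, same_person: bool, group: str)."""
        impostor = defaultdict(int)      # non-matching pairs seen, per group
        false_match = defaultdict(int)   # non-matching pairs scored as a match
        for score, same_person, group in trials:
            if not same_person:          # only impostor pairs can be false matches
                impostor[group] += 1
                if score >= threshold:
                    false_match[group] += 1
        return {g: false_match[g] / impostor[g] for g in impostor}

    # Toy data: the matcher is wrongly confident more often on group_b.
    trials = [
        (0.85, False, "group_a"), (0.40, False, "group_a"), (0.30, False, "group_a"),
        (0.90, False, "group_b"), (0.88, False, "group_b"), (0.35, False, "group_b"),
    ]
    print(false_match_rate_by_group(trials))  # {'group_a': 0.33..., 'group_b': 0.66...}

A matcher tuned to an acceptable overall error rate can still be much worse on groups underrepresented in its training data, which is the failure mode the research projects mentioned above keep finding.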

Comments:

  • by Gravis Zero ( 934156 ) on Wednesday April 17, 2019 @05:27PM (#58451392)

    I don't have a lot of positive things to say about Microsoft, but this is something they evaluated correctly. It may have been purely because they foresaw the awful PR that would eventually follow, but it was still a good call.

  • by rsilvergun ( 571051 ) on Wednesday April 17, 2019 @05:37PM (#58451426)
    On PR Concerns.

    FTFY.
  • Turning down a sale out of concern over the technology's accuracy is not turning it down on human rights grounds. If it were a human rights concern, they'd have turned down the sale because they worried law enforcement would misuse it.

  • Microsoft concluded it would lead to innocent women and minorities being disproportionately held for questioning because the artificial intelligence has been trained on mostly white and male pictures.

    So that naturally leads to the question, why the heck are they training these things with mostly white and male pictures? Couldn't find any pictures of women on the internet?

    • Microsoft concluded it would lead to innocent women and minorities being disproportionately held for questioning because the artificial intelligence has been trained on mostly white and male pictures.

      So that naturally leads to the question, why the heck are they training these things with mostly white and male pictures? Couldn't find any pictures of women on the internet?

      Well, it's a California law enforcement contract, and, ahem, clearly, [occidentaldissent.com] those damn white males are the target market when it's time to round up the usual suspects.

  • Zuck will do anything to make a buck. Especially if it involves exploiting others.

  • This is still a violation of human rights and privacy rights.

  • Look, when you identify the source of a problem, you don't ignore it, you don't cease using the technology, you FIX IT.

    This is not a new issue. Ten years ago it was so widely known that a TV show made fun of it (Better Off Ted, Season 1, Episode 4, "Racial Sensitivity" [imdb.com]).

    So if we have not sufficiently trained AI to detect people of color and women, then start TRAINING them to do it (see the sketch below for one way the training data gets rebalanced).

    Or you could just stop trying to track everyone without a warrant.
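
    For what "start TRAINING them" might look like in practice: one common remedy for an unbalanced training set is to oversample the underrepresented groups until each appears as often as the largest one. A minimal sketch, assuming per-image group labels are available (the file names and group labels here are hypothetical):

        # Hedged sketch: oversample an unbalanced face dataset so every group
        # appears as often as the largest one. Paths and group labels are
        # hypothetical; a real pipeline would pull this metadata from the dataset.
        import random

        def balance_by_group(samples):
            """samples: list of (image_path, group) tuples."""
            by_group = {}
            for path, group in samples:
                by_group.setdefault(group, []).append((path, group))
            target = max(len(items) for items in by_group.values())
            balanced = []
            for items in by_group.values():
                balanced.extend(items)                                  # keep originals
                balanced.extend(random.choices(items, k=target - len(items)))  # top up
            random.shuffle(balanced)
            return balanced

        # Toy example: three images of group "a" and one of group "b".
        data = [("a1.jpg", "a"), ("a2.jpg", "a"), ("a3.jpg", "a"), ("b1.jpg", "b")]
        print(len(balance_by_group(data)))  # 6, three of each group

    Oversampling only duplicates the minority-group images a dataset already has, so it can mitigate but not replace actually collecting diverse data, which is what the parent is really asking for.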

  • 1. Get funds. A lot of money.
    2. Talk to police, military, property owners, and governments in Africa, Asia, and South America about the advanced tech they can get for "free".
    3. Install the hardware, as with the Domain Awareness System in Lower Manhattan. https://en.wikipedia.org/wiki/... [wikipedia.org]
    4. Work as a private, public, government partnership in Africa, Asia, and South America.
    5. Gather the global math needed to detect all the variation in the human face globally.
    6. At an airport, port, bus station, rail and on every
  • I understand Bezos and Amazon are hard at work.
  • When a major corporation cares more about civil rights than law enforcement does, that points to serious problems.

    I tend to assume that "bad cops" are just isolated incidents that get a lot of attention, but this may force me to reconsider my assumptions.

  • Bake the cake. There's no lawful reason, I am ASSURED by progressives, to refuse to bake the cake based on what the customer intends to do with it. And in this case the product is not even custom-made: the AI is the AI is the AI. Refusing because the customer intends to use the product exactly as designed, which would merely bring a defect in that product to light, is doubleplusgood hypocrisy. If the product is for sale and a non-prohibited buyer is lining up to purchase it (this isn't munitions), you must sell it to them.
