
Google Wants To Work with the Pentagon Again, Despite Employee Concerns (nytimes.com) 51

Three years after an employee revolt forced Google to abandon work on a Pentagon program that used artificial intelligence, the company is aggressively pursuing a major contract to provide its technology to the military. From a report: The company's plan to land the potentially lucrative contract, known as the Joint Warfighting Cloud Capability, could raise a furor among its outspoken work force and test the resolve of management to resist employee demands. In 2018, thousands of Google employees signed a letter protesting the company's involvement in Project Maven, a military program that uses artificial intelligence to interpret video images and could be used to refine the targeting of drone strikes. Google management caved and agreed to not renew the contract once it expired.

The outcry led Google to create guidelines for the ethical use of artificial intelligence, which prohibit the use of its technology for weapons or surveillance, and hastened a shake-up of its cloud computing business. Now, as Google positions cloud computing as a key part of its future, the bid for the new Pentagon contract could test the boundaries of those A.I. principles, which have set it apart from other tech giants that routinely seek military and intelligence work. The military's initiative, which aims to modernize the Pentagon's cloud technology and support the use of artificial intelligence to gain an advantage on the battlefield, is a replacement for a contract with Microsoft that was canceled this summer amid a lengthy legal battle with Amazon. Google did not compete against Microsoft for that contract after the uproar over Project Maven.

The Pentagon's restart of its cloud computing project has given Google a chance to jump back into the bidding, and the company has raced to prepare a proposal to present to Defense officials, according to four people familiar with the matter who were not authorized to speak publicly. In September, Google's cloud unit made it a priority, declaring an emergency "Code Yellow," an internal designation of importance that allowed the company to pull engineers off other assignments and focus them on the military project, two of those people said. On Tuesday, the Google cloud unit's chief executive, Thomas Kurian, met with Charles Q. Brown, Jr., the chief of staff of the Air Force, and other top Pentagon officials to make the case for his company, two people said. Google, in a written statement, said it is "firmly committed to serving our public sector customers" including the Defense Department, and that it "will evaluate any future bid opportunities accordingly."



Comments Filter:
  • Ultimately, Google is going to do what is best for its bottom line. Unless the PR hit tanks the company, it's smarter to dump the employees who tell Google to have morals.
    • It would be nice to see them bring back their old 'don't be evil' slogan.
    • by Brain-Fu ( 1274756 ) on Wednesday November 03, 2021 @03:15PM (#61955191) Homepage Journal

      There is nothing immoral about developing weapons. Morality comes into play when one decides when to use those weapons.

      In the real world, there are real threats. There ARE evil people out there who want to kill you and your loved ones and take everything you have for themselves. And they have weapons. We MUST have better weapons in order to protect ourselves from them. Therefore, it is morally correct to develop weapons. That includes weapons that use AI.

      There may be additional moral obligations like exercising due diligence in ensuring that the AI will be effective. Failing on that front would be immoral. But weapons development in-and-of itself is merely a practical necessity.

      • You're talking like an adult who realises the world is shades of grey. Unfortunately, the sort of people who object are kidults, still at a schoolchild level of awareness of the world and realpolitik, still believing people are neatly divided into Good and Bad.

      • That includes weapons that use AI

        No, morality would require that your weapons be safe, the intended target being an obvious exception. A fully autonomous weapon may be inherently unsafe; we just don't know yet.

        • And why do you think this contract is for some sort of fully autonomous weapon? The summary says cloud computing.

          Of course, more powerful weapons under full human control do increase the power of humans to kill. I was really disappointed by the recent airstrike in retaliation for the suicide bombing at the Kabul airport. If you look hard enough for terrorist activity, you talk yourself into seeing it.

          • by drnb ( 2434720 )

            And why do you think this contract is for some sort of fully autonomous weapon?

            I don't. This thread is about the morality of building weapons. Read the thread to get up to speed.

      • by ljw1004 ( 764174 )

        There is nothing immoral about developing weapons. Morality comes into play when one decides when to use those weapons.

        That assertion needs a heck of a lot more justification.

        If there existed a weapon which could only be used in an immoral manner, does or doesn't that make it immoral to develop the weapon in the first place -- since you know the inevitable outcome of your development? What if it's not an inevitable outcome, but a 99.9% likely outcome? What if you're developing a weapon solely so that people can be trained on how to disarm it?

        The first two that come to mind are cluster bombs and land mines, both of which are

        • by vlad30 ( 44644 )

          AI is an interesting case. We tech-savvy folk on slashdot know that the foreseeable state of AI is really just not very good pattern recognition that fails in numerous situations. We can therefore make a solid prediction that autonomous AI-powered weaponry will have a high mistarget rate in the foreseeable future. I think it's wide open for discussion as to whether this makes it inherently immoral.

          I read this a lot on Slashdot, whether it be AI, fusion, or electric garden tools: "Oh, it will never work because it doesn't work now, and when I tried 10 years ago it failed, and I can't even be bothered to google it or do some research."

          In every aspect of technology there is progress, sometimes slow, sometimes a bit faster, and you make breakthroughs when you try. If you google properly, i.e. get past the sponsored pages, you will find a lot of breakthroughs, each one a step closer to the dream goal.

          The electric garden

      • Sure. But not everyone wants to work for a defense contractor. It is an entirely different work environment from consumer software, hardware, or internet services. I tried it once, in my first job out of college, at Lockheed Martin. And it was goddamned miserable for many reasons totally unrelated to whether or not it was moral to build weapons. So I quit, packed up, moved to San Jose, and never looked back.

        Set morality totally aside... if a Silicon Valley employer lured me in with the promise of one work e

      • Nuclear weapons and biological agents are examples where your statement becomes grey. The risk of a rogue person wielding those weapons and the damage that could result are serious considerations. National and international policy can shift on that basis - see North Korea. Back here in the US, we've had generals put in place safeguards post-2020 election to counter the rogue-agent-in-chief:

        From https://www.washingtonpost.com/politics/2021/07/16/power-up-top-us-general-warned-post-election-coup-new-book-deta

      • by larwe ( 858929 )

        Morality comes into play when one decides when to use those weapons.

        Morality comes into play long before that; it comes into play even before beginning weapons development. Would it be moral to develop, intentionally, a bioweapon so dangerous and virulent that a single drop of it escaping the lab would kill or maim every person in the world? Would it be moral to develop a weapon with effects so unpredictable that you don't know whether it's the equivalent of a rifle bullet or a nuclear warhead, and in fact it could span that entire spectrum of effects over time, randomly? P

      • by vlad30 ( 44644 )
        Also, if you develop very good weapons and demonstrate that capability, others will be afraid to attack you for fear of the end result. USA vs Russia and the Cold War as an example. However, things have changed: terrorists don't care if they die to further their agenda and will wait a lifetime if necessary (Afghanistan), and a country which has a billion-plus people would happily go to war to reduce their population and blame all the deaths on the country initially defending themselves. The o
    • by hey! ( 33014 )

      Once upon a time, workers wore blue shirts, had minimal education, and were lucky to have a job; managers wore white shirts, had a little more education, and told the workers what to do. In that world, your strategy of dumping the employees who asked questions is a no-brainer. Workers brought nothing of any value to the organization that wasn't rooted in unquestioning obedience to their manager's orders.

      Often I've been amazed by high tech managers who shoot themselves in the foot by not retaining or valuing h

      • by drnb ( 2434720 ) on Wednesday November 03, 2021 @03:55PM (#61955353)

        ... had minimal education ...

        Uh, no. There is an education process for both "blue" and "white" collar jobs. Don't confuse college with education; it's merely one source.

        Otherwise you can be a college-educated idiot like former NYC Mayor Mike Bloomberg, who thinks farming is trivial.
        "I can teach anybody – even people in this room, so no offense intended – to be a farmer. It’s a process. You dig a hole, you put a seed in, you put dirt on top, you add water, up comes corn."

        ... managers wore white shirts, had a little more education ...

        No, workers and managers had different educations. Management, leadership, is a skill to be learned, just like a trade. Also the best managers tended to be educated in both the trades skills and the management skills.

      • Often I've been amazed by high tech managers who shoot themselves in the foot by not retaining or valuing high performing employees, and it boils down to this: they still act as if they work in a blue collar/white collar world. Google doesn't operate in that world.

        So while it offends many people that employees have strong opinions about what the company should and should not do, Google can't dismiss employee concerns out of hand without considering how deep, wide, and strategically placed opposition is.

        Unless the employee thinks the election was stolen or vaccines should not be mandated by their jobs, correct?

    • Comment removed based on user account deletion
    • Yup - I would expect that all of the 2018 revolt team are long gone now.
  • Some Google (and Amazon) employees are also against Project Nimbus, the Israeli government's next big effort to oppress Palestinians, and have written an open letter (https://www.theguardian.com/commentisfree/2021/oct/12/google-amazon-workers-condemn-project-nimbus-israeli-military-contract). Google and Amazon will ignore that too. Money counts for more than morality.

  • If they don't like it they know where the door is. Employment isn't a right, it's a privilege, something these self-important virtue-signalling kids should remember.

    • by Anonymous Coward
      Clean up your language. "Privilege" language is for driver education class, persons under institutional stricture, royalists, and Marxist revisionists. Employment is an agreement, not a grant.
    • Employees have the right not to work for an abusive, megalomaniacal company that decided to discard the slogan "Don't be evil" because it was impacting its profit margins. Employers do not have the right to force people to work for them.
      • by Viol8 ( 599362 )

        "Employees have the right not to work for an abusive, megalomaniacal company"

        Indeed they do, and as I said, they know where the door is.

  • The US military is one of the larger parts of the government; of course Google wants in on that.
  • Don't like it?
    Don't accept the fat Google paycheck.

    Far be it from me to ever sympathize with Google, but the idea that a business owner is going to be 'woked' out of doing business with whomever they want by their employees (particularly utopian millennials) is fucking insane.

    • Re: (Score:1, Insightful)

      Comment removed based on user account deletion
      • by Bodie1 ( 1347679 )

        "Noted."

      • Re: (Score:2, Insightful)

        by PDiddly ( 7030698 )

        "The notion that employees shouldn't raise moral concerns and try to steer their employer in a positive direction (and should be discouraged from doing so) is not merely psychopathic and evil, it's also bad business."

        So you believe Christian employees should be able to raise their moral concerns about doing business with homosexuals, Jews, or Muslims and direct the company not to do business with or hire members of those groups? Or is it just the morals you agree with that get that protection?

        People that believe i

        • by Whibla ( 210729 )

          "The notion that employees shouldn't raise moral concerns and try to steer their employer in a positive direction (and should be discouraged from doing so) is not merely psychopathic and evil, it's also bad business."

          So you believe Christian employees should be able to raise their moral concerns about doing business with homosexuals, Jews, or Muslims and direct the company not to do business with or hire members of those groups?

          You have an extremely odd notion of what Christianity entails. Still, I suppose this might seem to be an adequate strawman when you're clutching at straws.

          Or is it just the morals you agree with that get that protection?

          People that believe in being prepared for war because it exists and will happen again, do they get to steer their employer in what they consider a positive direction?

          Morals and even the simple words "Positive Direction" are not some universal things. If you believe it is good for the people from Google in the article to do it, then you must agree that when these others do it is good also.

          I'm not the OP, but: Yes ("Eating babies in the cafeteria is strictly prohibited");
          Yes (Like the OP said, not listening to your employees is bad business);
          True (what some consider moral others consider supporting immorality, hence immoral);
          And, there are very few things I must do, but broadly speaking yes. It's called tolerance, and is the basis of civil

          • "You have an extremely odd notion of what Christianity entails"
            Amusing that you didn't answer the question.

            The question posed, quite clearly, was "if you insist that employees are RIGHT to foist their morals on their employer", are you fine with people with what are (to you) aberrant morals? The particular flavor or denomination is IRRELEVANT, of course, but it was the only point you keyed on.

            Almost like you wanted to avoid the question.

            • by Whibla ( 210729 )

              Amusing that you didn't answer the question.

              I answered the three questions that were not completely ridiculous. Some questions, however, just don't deserve an answer.

              "... employees should be able to raise their moral concerns..."

              The question posed, quite clearly, was "if you insist that employees are RIGHT to foist their morals on their employer", are you fine with people with what are (to you) aberrant morals

              I disagree. One of these things is not like the other. Words have a defined meaning; you don't get to change or expand this definition then pretend we're talking about the same thing.

              Almost like you wanted to avoid the question.

              Like I said, some questions simply aren't worthy of a reply.

              However, if you insist on a restatement of the question: "Am I happy with someone, who takes a moral stance in opposition to me, raising moral concer

              • Fair points, and thanks for the well-considered answer. (And yes, FWIW, PP is exactly the right organization in that context - congrats (?) on understanding at least that nuance of the increasingly Byzantine, shitty American political terrain.)

                To be crystal clear:
                - I don't have any problem with an employee stating their moral belief to their employer, and advocating a direction for the business
                - my problem lies mainly with a milquetoast employer listening.

                I have no problem with someone at Google saying "don't w

      • "If we had a culture in which the importance of employees was recognized "
        They are recognized.
        Or did you forget that whole 'paycheck' part?

        Oh, wait, no, I get it: you want to RUN the company without any of the risk or responsibility.

        I like how you put words in my mouth: who said that they shouldn't raise moral concerns? Then it's even better when you argue against the words you put in my mouth. I believe that rhymes with 'shtrawman'?

    • They say "Google employees" in articles, but really there are very few actual Google employees. Most of them are what Google calls TVC (Temporary, Vendor, Contractor) and are employed by companies like Accenture or a subcontractor to Accenture. They get paid chicken feed compared to the relatively few actual Google employees. It's basically Microsoft's Permatemps 2.0, only this time they found a loophole in the law to exploit.

  • by Anonymous Coward

    The outcry led Google to create guidelines for the ethical use of artificial intelligence, which prohibit the use of its technology for weapons or surveillance, and hastened a shake-up of its cloud computing business.

    Google employees got upset about video surveillance? What is Google if not a mass surveillance company? It makes its money by knowing everything about everybody, all of the time. The services it provides just enable that data gathering activity.

    • "The outcry led Google to create guidelines for the ethical use of artificial intelligence" - and who would be so naive to think that Google would actually give a hoot about these guidelines?
  • Except for the military of every other superpower in the world. Chinese companies have no choice regarding supporting their military with the best tech they can offer. US companies are ethically (not legally) obligated to do the same for our military to maintain the balance of power. Yes, despite US war crimes in the Middle East. China/Russia in control of the Middle East will not be better. The US "merely" wants countries to act its way in the international arena. We don't seek to take territory for our total control or put ethn

  • It's a huge pile of data and money they want to slurp at.
