AI Microsoft The Almighty Buck The Courts

Taylor Swift Reportedly Threatened To Sue Microsoft Over Racist Twitter Bot (digitaltrends.com) 84

When an artificially intelligent chatbot that used Twitter to learn how to talk unsurprisingly turned into a bigot bot, Taylor Swift reportedly threatened legal action because the bot's name was Tay. Microsoft would probably rather forget the experiment where Twitter trolls took advantage of the chatbot's programming and taught it to be racist in 2016, but a new book is sharing unreleased details that show Microsoft had more to worry about than just the bot's racist remarks. Digital Trends reports: Tay was a social media chatbot geared toward teens first launched in China before adopting the three-letter moniker when moving to the U.S. The bot, however, was programmed to learn how to talk based on Twitter conversations. In less than a day, the automatic responses the chatbot tweeted had Tay siding with Hitler, promoting genocide, and just generally hating everybody. Microsoft immediately removed the account and apologized.

When the bot was reprogrammed, Tay was relaunched as Zo. But in the book Tools and Weapons by Microsoft president Brad Smith and Carol Ann Browne, Microsoft's communications director, the executives have finally revealed why -- another Tay, Taylor Swift. According to The Guardian, the singer's lawyer threatened legal action over the chatbot's name before the bot broke bad. The singer claimed the name violated both federal and state laws. Rather than get in a legal battle with the singer, Smith writes, the company instead started considering new names.

  • by burtosis ( 1124179 ) on Wednesday September 11, 2019 @08:04PM (#59183644)
    What did Microsoft think was gonna happen when they forced it to gaze upon a portal to the deepest pits of human hell for the human equivalent of centuries?
  • We've seen this shit before. It's all good fun.
  • Uh, what? (Score:4, Insightful)

    by msauve ( 701917 ) on Wednesday September 11, 2019 @08:14PM (#59183680)

    Taylor Swift reportedly threatened legal action because the bot's name was Tay.

    What a bitch. (If it were a guy, I'd say "asshole", does that make me politically incorrect/sexist these days?)

    • Re:Uh, what? (Score:4, Insightful)

      by Stormwatch ( 703920 ) <(rodrigogirao) (at) (hotmail.com)> on Wednesday September 11, 2019 @08:18PM (#59183688) Homepage

      Don't mince words: what a retarded cunt!

      • by mi ( 197448 )

        Retarded snowflake. It just is not the same without a hint of racism, you know.

        Gives me no pleasure to type this, but the rights not exercised — including the right to be a racist — are lost...

        • by Stormwatch ( 703920 ) <(rodrigogirao) (at) (hotmail.com)> on Wednesday September 11, 2019 @08:35PM (#59183740) Homepage

          Lolwut? Snowflake has nothing to do with race.

          • Everything has everything to do with race. Aren't you paying attention? Your enemies intend to deny you even your own humanity. They claim you are wrong, evil, and without standing. And should just be put away.

          • by mi ( 197448 )

            "Snowflake" is how a Black may refer to a White person [urbandictionary.com].

            A "Yellow Snowflake" would be a reference to a Latino or, sometimes, to an Asian. Especially offensive, if I must explain everything, because of the allusion to soiled snow...

        • The right to claim your three character affected nickname is infringed upon by an entirely unrelated software project?

          Um, no. Countersue for frivolous and abusive process. Ask for a dollar in damages. Pay your own attorneys.

      • by AmiMoJo ( 196126 )

        Except that she didn't do it. It even says so right in the summary. Her lawyer did.

        Do you think she told her lawyer to do that? It's doubtful. Most likely the lawyer is on retainer to do brand protection, same as the McDonald's lawyers who sue business owners named McDonald, or the RIAA goons acting on behalf of the artists.

        Outrage seems to so often be misdirected these days.

    • by Xenx ( 2211586 )
      But, asshole is non gender specific. In fact, it's not even limited to our species. It makes a good blanket insult. If you want a good derogatory term for a guy it would be bastard, or maybe son of a bitch, but the second implies negatively upon their mother as well and that may not be warranted.

      As to whether it's sexist, probably to some. But, personally, people are too god damn sensitive about that crap. If you knowingly use terms that the person you're talking to hates, it makes you a dick... and at l
    • Re: (Score:3, Funny)

      I do not understand the notion that using foul language to upset someone must be done with politeness. Seems like something a fucking idiot would believe.

    • by Ogive17 ( 691899 )
      Oh please, if you had a billion dollar industry relying on your good name, you'd be sure to protect it as well.

      Just look at our thin-skinned POTUS, threatens anyone who disagrees.
    • Wha? This is a name? Her name? Nickname?

      Really. Fabricated controversy. Who or what can we blame for this?

    • by King_TJ ( 85913 )

      Just wait until someone codes an app for Mac or iOS that she doesn't approve of, using Swift!

      https://developer.apple.com/sw... [apple.com]

    • If you were a "guy" you would recognize that a bot named Tay specifically targeted toward teens was clearly named after her, instead of pretending to be a jealous teenage girl yourself, when we all know you tuck your dick, dress like a girl, and pretend to be female.
  • Will she be suing Taye Diggs, as well?

    https://www.imdb.com/name/nm0004875/ [imdb.com]

  • by G-Man ( 79561 ) on Wednesday September 11, 2019 @08:48PM (#59183792)

    ...you probably think this bot is about you.

  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Wednesday September 11, 2019 @09:19PM (#59183834)
    Comment removed based on user account deletion
  • by 0111 1110 ( 518466 ) on Wednesday September 11, 2019 @09:20PM (#59183838)

    The similarities were just too great. Brainless and racist both.

  • Question ... (Score:4, Insightful)

    by Retired ICS ( 6159680 ) on Wednesday September 11, 2019 @11:12PM (#59184094)

    What is a "Taylor Swift" and why do I (or anyone for that matter) care?

    • What is a "Taylor Swift" and why do I (or anyone for that matter) care?

      In the Yellow pages it was a very quick with the needle and thread haberdasher back where I lived as a kid.

    • What is a "Taylor Swift" and why do I (or anyone for that matter) care?

      Britney Spears' replacement. She seems to be approaching the losing it stage though so there will be a new one before too long. I wouldn't worry about it.

    • Comment removed based on user account deletion
  • by Solandri ( 704621 ) on Wednesday September 11, 2019 @11:40PM (#59184138)
    Phonemes [wikipedia.org] are individual units of sound. English only has 44 of them [dyslexia-r...g-well.com] - 24 consonants, 20 vowels, giving 480 one-syllable combinations (960 if you include combinations where the vowel leads).

    If we're going to allow celebrities to claim ownership of these single-syllable combinations, we're going to run out of them awfully quickly. And nobody will be able to use any short names for anything without running afoul of the celebrity(ies) "owning" them. Down this path lies madness.
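    The parent's arithmetic is easy to check with a quick sketch (taking the cited 24-consonant/20-vowel split as given; the actual phoneme counts vary by dialect and source):

    ```python
    # Check the comment's counting: with 24 consonants and 20 vowels,
    # how many one-syllable consonant+vowel (CV) pairings exist, and
    # how many if vowel-led (VC) pairings are counted too?
    consonants = 24
    vowels = 20

    cv_combinations = consonants * vowels      # consonant-led, e.g. "Tay"-like CV shapes
    with_vowel_led = 2 * cv_combinations       # also count VC shapes

    print(cv_combinations)   # 480
    print(with_vowel_led)    # 960
    ```

    So even under generous counting, two-phoneme names number in the hundreds, which is the point: a handful of trademark claims per celebrity would exhaust the namespace fast.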
    • It's obvious the lawsuit was frivolous, but it'd cost Microsoft more to deal with it than to just change the name.
  • But Swift's nickname is Tay Tay.
  • Series! (Score:4, Funny)

    by MancunianMaskMan ( 701642 ) on Thursday September 12, 2019 @04:43AM (#59184574)
    in other news: Ms Swift is suing the AMS for promoting the Taylor series, which purports to be able to "decompose" just about anything into a "polynomial" to arbitrary accuracy.
  • "When the bot was reprogrammed, Tay was relaunched as Zo."

    Now Zoe Saldana will be suing or Zooey Deschanel or ...

  • by MitchDev ( 2526834 ) on Thursday September 12, 2019 @06:19AM (#59184728)

    Based on the story, I'm assuming Taylor Swift is slang for "fucking moron"

  • self-absorbed adjective
    self-absorbed | \ ˌself-əb-ˈsȯrbd , -ˈzȯrbd \
    Definition of self-absorbed
    : see Taylor Swift

  • She was probably thinking about search suggestions, and she didn't want to come up as a result on the same page or in the same suggestion box as a genocidal bot.
  • This is the same person who intends to re-record (and re-sell) her early catalog because she made a bad deal at the beginning and doesn't own the rights to her own early work. Yeah, people should pay because YOU made a bad deal. Piss off, Taylor Swift.
    • And for the longest time the Beatles (actually Lennon/McCartney) didn't own the rights to their music.

      https://www.theguardian.com/mu... [theguardian.com]

      • by flippy ( 62353 )

        And for the longest time the Beatles (actually Lennon/McCartney) didn't own the rights to their music.

        https://www.theguardian.com/mu... [theguardian.com]

        Yep, and they survived JUST FINE without re-recording the music and asking their fans to pay for it again. That's the part that's a dick move.

        • Like many other things, if you don't like it don't buy it. If she wants to re-record her songs for her own control for her benefit more power to her. If you don't like it don't buy it.

          OTOH it may be interesting to listen to her redo the early stuff with a different ear and a different approach, who knows? Recording artists often look over their catalog and sometimes wish they had a chance to do things differently, and she may be taking that chance to do so.

          She can only ASK, not compel. There's lots more
