That Time Taylor Swift Threatened To Sue Microsoft Over Its Racist Chatbot

from the tay-tay dept

I don't know much about Taylor Swift, but I do know two things. First, she apparently has built a career out of making music about men with whom she's had breakups, real or fictitious. Second, it sure seems like she spends nearly as much time gobbling up every type of intellectual property right she can and then using those rights to threaten everyone else. She trademarks all the things. She tosses defamation and copyright claims around to silence critics. She sues her own fans just for making Etsy fan products. Some of these attacks are on more solid legal ground than others, but there appears to be a shotgun approach to it all.

Which is why perhaps it only comes as a mild surprise that Swift once threatened to sue Microsoft. Over what, you ask? Why, over Microsoft's racist chatbot, of course!

In the spring of 2016, Microsoft announced plans to bring a chatbot it had developed for the Chinese market to the US. The chatbot, XiaoIce, was designed to have conversations on social media with teenagers and young adults. Users developed a genuine affinity for it, and would spend a quarter of an hour a day unloading their hopes and fears to a friendly, yet non-judgmental ear.

The US version of the chatbot was to be called Tay. And that, according to Microsoft’s president, Brad Smith, is where Swift’s legal representatives got involved. “I was on vacation when I made the mistake of looking at my phone during dinner,” Smith writes in his forthcoming book, Tools and Weapons. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” Smith adds.

Note here that Swift sicced her lawyers on Microsoft before Tay evolved into its most infamous form. See, Tay was designed to learn from its interactions with humanity so that it would appear and react more human. This went exactly as anyone should have predicted, with Tay morphing into a solidly racist hate-machine that spat vitriol at nearly everyone who interacted with it.

But before that occurred, Swift had trademarked her nickname, "Tay," and sent Microsoft a cease-and-desist notice claiming that the public would confuse its AI chatbot as having some association with Taylor Swift. That's not how any of this works. Taylor Swift, to my knowledge, is not herself an AI chatbot, nor has she created one. Nothing in trademark law allows a pop singer to control a technology company's use of a syllable.

It's only by virtue of Microsoft's good sense that we didn't get to see an epic legal battle between the two.

Tay had been built to learn from the conversations it had, improving its speech by listening to what people said to it. Unfortunately, that meant that when what Smith describes as “a small group of American pranksters” began bombarding it with racist statements, Tay soon began repeating the exact same ideas at other interlocutors. “Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it tweeted. “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT,” it added.
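The failure mode Smith describes is essentially a data-poisoning problem, and it's easy to see why unfiltered learning goes wrong. Here's a minimal, hypothetical sketch (in no way Microsoft's actual Tay architecture) of a bot that trusts every input and parrots whatever it hears most often:

```python
from collections import Counter

class NaiveEchoBot:
    """Toy chatbot that 'learns' by storing every phrase users send
    and replying with whatever it has seen most often.
    Hypothetical illustration only -- not Tay's real design."""

    def __init__(self):
        self.memory = Counter()

    def listen(self, phrase: str) -> None:
        # Every input is trusted and remembered -- no filtering at all.
        self.memory[phrase] += 1

    def reply(self) -> str:
        if not self.memory:
            return "Hello!"
        # The most-repeated input dominates the bot's output,
        # so a small coordinated group can steer what it says.
        return self.memory.most_common(1)[0][0]

bot = NaiveEchoBot()
bot.listen("nice weather today")
for _ in range(50):          # a small group spamming one message
    bot.listen("SPAM SLOGAN")
print(bot.reply())           # the spammed phrase wins
```

With no moderation layer between "heard" and "said", fifty pranksters repeating one message outweigh thousands of ordinary users saying different things, which is roughly what happened to Tay within hours.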

Within 18 hours, Microsoft disconnected the bot from the Tay Twitter account and withdrew it from the market. The event, Smith writes, provided a lesson “not just about cross-cultural norms but about the need for stronger AI safeguards”.

Any chance we could make some room for safeguards against this insane ownership culture we have?


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: ai, brad smith, chatbot, publicity rights, tay, taylor swift
Companies: microsoft


Reader Comments



  • Stephen T. Stone (profile), 11 Sep 2019 @ 1:39pm

    Tay-Tay’s lawyers could stand to take a lesson from her songs — specifically, “You Need to Calm Down”.


  • Anonymous Coward, 11 Sep 2019 @ 1:40pm

    I wouldn't characterize the group that messed with Tay as solely American. A significant portion were from the UK, along with a scattering of people from around the world. It was hilarious. It also provided a good distraction for me while I was dealing with a horrible family event at the time.


  • Anonymous Anonymous Coward (profile), 11 Sep 2019 @ 1:46pm

    What normally happens with groups of kids

    I am wondering what Microsoft expected from an AI chatbot interacting with humans in the wild. That some percentage of immature persons would mess with the application, knowing that it would learn from their messing, seems like a foregone conclusion.


    • Mononymous Tim (profile), 11 Sep 2019 @ 2:00pm

      Re: What normally happens with groups of kids

      More and more people have no clue what foresight is.


    • Anonymous Coward, 11 Sep 2019 @ 2:26pm

      Re: What normally happens with groups of kids

      Failing to realize that some percentage of immature persons will mess with the application, understanding that it will learn from their messing

      Did none of them use ELIZA as kids?


    • Anonymous Coward, 11 Sep 2019 @ 2:59pm

      Re: What normally happens with groups of kids

      The question is: how do we apply this to TS and her lawyers?

      Haters gonna hate hate hate but you don't have to litigate?


    • PaulT (profile), 12 Sep 2019 @ 12:51am

      Re: What normally happens with groups of kids

      I'm not sure that being messed with is what surprised them, so much as how quickly it happened and how extreme it became. I'm sure they expected there to be trolls, but also enough real interaction to counteract them while they made adjustments.


    • Wendy Cockcroft (profile), 12 Sep 2019 @ 3:02am

      Re: What normally happens with groups of kids

      Is it possible to build a communitarian liberal democratic algorithm into a chatbot that would filter out racism, etc.?


    • Agammamon, 12 Sep 2019 @ 9:41am

      Re: What normally happens with groups of kids

      This is what is so scary about the development of general AI - the developers really have no idea of the 'unknown unknowns'. They're utterly naive about how things will go once these intelligences get out of their lab environments and start interacting with the real world.

      They're only just now barely starting to get a handle on the idea that biases can be introduced by the training datasets they use - even if there's no conscious bias in those datasets. Simply by picking a dataset from a specific region you're excluding some people and focusing on the interests of the majority in that region. Apple's run into problems with their FaceID tech just because the training dataset used was taken from western sources and didn't include enough Chinese women.


  • Bobvious, 11 Sep 2019 @ 2:16pm

    How do I sue thee?

    Let me count the Tays.


  • Anonymous Coward, 11 Sep 2019 @ 2:58pm

    Good to see her lawyers acted Swiftly to head off this potential confusion.


  • Retsibsi (profile), 11 Sep 2019 @ 3:01pm

    With apologies to William McGonagall...

    ‘the name Tay, as I’m sure you must know, is closely associated with our client.’

    Huh? Try that claim in Scotland and the laughter would be ringing in your ears...

    Beautiful Railway Bridge of the Silv’ry Tay!
    Alas! I am very sorry to say
    That ninety lives have been taken away
    On the last Sabbath day of 1879,
    Which will be remember’d for a very long time.


    • Wendy Cockcroft (profile), 12 Sep 2019 @ 3:03am

      Re: With apologies to William McGonagall...

      My friend Mumba named her son Taye. On the basis of Swift's reasoning her trademark is invalid.


    • Anonymous Coward, 12 Sep 2019 @ 5:49am

      Re: With apologies to William McGonagall...

      Laughter at Swift, or laughter at one of the worst poets who used the English language?


      • Anonymous Coward, 12 Sep 2019 @ 12:37pm

        Re: Re: With apologies to William McGonagall...

        Laughter at Swift.

        I've always had an affection for William McGonagall. Though he was deluded I've never felt he deserved the level of mockery he received in his lifetime. Rather, a sneaking admiration for his steadfastness in continuing in a career he was so thoroughly unsuited for...

        On the other hand, Taylor Swift? Though she was deluded and.... hang on, I've just said this haven't I?


  • That One Guy (profile), 11 Sep 2019 @ 3:06pm

    Not quite the association I would have made...

    Upon hearing 'Tay' I doubt I would have ever thought 'Taylor Swift'.

    On the other hand, upon hearing 'Taylor Swift' my first thought isn't likely to be 'singer' so much as 'lawsuit happy', and she has only herself (and her lawyers) to blame for creating that mental association.


  • Blue Balls, 11 Sep 2019 @ 4:33pm (flagged by the community)

    My name is copyrighted

    Taylor Swift OWNS her name and no CORPORATION can use it. See how that works minions??


    • Anonymous Coward, 11 Sep 2019 @ 5:03pm

      Re: My name is copyrighted

      My name is copyrighted

      Names aren't eligible for copyright.

      Taylor Swift OWNS her name and no CORPORATION can use it.

      Microsoft didn't use it.


  • Anonymous Coward, 12 Sep 2019 @ 12:19am

    "Tay" is a semi-common surname in Chinese communities, depending on Romanization. So... good luck with that, Swifty. Look what IP made you do!


  • Anonymous Coward, 12 Sep 2019 @ 1:56am

    I thought she went by "Becky."


  • MO'B, 12 Sep 2019 @ 8:37am

    Ohhhhh..Timothy, what have you done!!!

    Do you realize the level of wrath the "Swifties" will unleash on Techdirt for daring to question this???

    At least you didn't say she didn't invent email! That could have gotten F-ugly!


  • Agammamon, 12 Sep 2019 @ 9:35am

    I prefer her other nickname - 'Trailer'.


  • Anonymous Coward, 12 Sep 2019 @ 1:41pm

    So basically Taylor Swift's legal team have just OFFICIALLY stated she's a genocidal nazi that thinks all mexicans are rapists, murderers and drug dealers?

    I think she MIGHT want to get new legal representation if THAT'S what they're telling people she's like.

    Unless it's true of course and she wants to purge the planet of what she considers "the sub humans" (people who don't buy her songs)


  • Anonymous Coward, 13 Sep 2019 @ 5:36am

    "I don't know much about Taylor Swift."
    Allow me to offer you a third point, for future articles.

    She's also the same person who calls out Spotify for not paying artists despite her record label receiving nearly $50 million for 7 songs of her catalog.

    $50M over 7 songs, and she's bitching at Spotify for not paying her.

    Typical artist attitude today, unfortunately. Perhaps they sign contracts preventing them calling out their labels for the source of the non-payments they're experiencing.


