Brad Smith reveals Taylor Swift had bad blood with Microsoft's Taybot




When most of us think of Tay, we think of Microsoft’s Twitter chatbot gone disastrously wrong. Introduced in 2016, the chatbot began posting offensive and racist tweets after a “coordinated” attack.

It turns out Tay had detractors even before it turned bad. In his new book, Tools and Weapons, Microsoft president Brad Smith reveals that Taylor Swift’s team took offence at the bot’s name.

“I was on vacation when I made the mistake of looking at my phone during dinner,” Smith writes. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention.

“The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” Smith adds.

In the end, Taylor Swift’s team was probably right not to want the bot associated with her nickname, given that it went on to spout noxious statements such as “Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” and “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT.”

Peter Lee, Corporate Vice President of Microsoft Research, apologized for Tay’s tweets, and the bot was permanently taken offline soon after.

Read more revelations in Brad Smith’s book here.

Via The Guardian

