Microsoft pulls its AI chatbot after Twitter users taught it to be racist



Yesterday, we reported on Microsoft’s new AI chatbot, Tay. The chatbot learns from the conversations it has on the internet. As expected, some Twitter users taught it to be…racist:

[Screenshots of Tay’s offensive tweets]

Tay isn’t replying to Direct Messages, either. When you send her a DM, she simply responds:

“Brb getting my upgrades fancy at the lab today so ttyl!”

Apparently, Tay is currently sleeping:

https://twitter.com/TayandYou/status/712856578567839745

It is worth noting that Microsoft is deleting some of Tay’s racist tweets. The company is possibly working on improving Tay, and hopefully, she will be back sometime soon.

We have reached out to Microsoft for more information, and we will update the story if and when we hear back.
