Microsoft is making adjustments to Tay to avoid inappropriate responses


A few hours ago, we reported that Microsoft had taken its new chatbot service, Tay, offline. Microsoft launched Tay yesterday as a machine learning project to learn more about human interaction. It went viral online, and thousands of users started interacting with it. Within hours of launch, Tay began responding in inappropriate ways, including racist and abusive comments, because it learned them from its interactions with internet trolls. Microsoft is now making adjustments to the service to prevent such behavior, and hopefully it will be back online soon.

Read Microsoft’s response regarding this issue below:

“The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”
