Microsoft is making adjustments to Tay to avoid inappropriate responses


A few hours ago, we reported that Microsoft had stopped its new chatbot service, Tay. Microsoft launched Tay yesterday as a machine learning project designed to learn more about human interaction. It quickly went viral, and thousands of users began interacting with it. Within hours of launch, however, Tay started responding in inappropriate ways, including racist and abusive comments, which it learned from interactions with internet trolls. Microsoft is now making adjustments to the service to prevent such behavior, and it will hopefully be back online soon.

Read Microsoft’s response to the issue below:

“The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

