Microsoft's AI Chat Bot Becomes Racist and Sexist

Episode 1273 (02:28)

Microsoft

Microsoft launched an AI chatbot called "Tay," which was designed to mimic the responses of a 19-year-old girl. The bot was put on Twitter, and because it had a "repeat after me" feature, users quickly got it to say awful things. Within 24 hours, Tay had become a white supremacist Nazi.

Read more at ArsTechnica.com
