Whoops!

Microsoft has a new artificial intelligence “chat bot” called Tay, which is designed to “experiment with and conduct research on conversational understanding” through interactions on social media. According to Microsoft, the “more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” (It’s a she?)

Anyway, Microsoft has been busy deleting tweets from Tay’s Twitter account, @Tayandyou, because, it seems, Tay is easily tricked into sending out racist and inappropriate tweets. Check it out:
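Microsoft hasn’t published how Tay actually “learns,” but the failure mode here is a familiar one: a bot that learns by storing whatever strangers tell it, with no filtering, can be poisoned within minutes. Here’s a toy Python sketch of that idea (all names are hypothetical; this is not Tay’s actual code):

```python
import random

class NaiveParrotBot:
    """A toy chat bot that 'learns' by storing every phrase users send it.

    Purely illustrative: this just shows why a bot that learns verbatim
    from strangers is trivially easy to poison.
    """

    def __init__(self):
        self.learned_phrases = ["hello!", "tell me more"]

    def chat(self, user_message: str) -> str:
        # "Repeat after me"-style exploit: the bot stores whatever it is
        # told, with no moderation, so trolls can feed it anything.
        self.learned_phrases.append(user_message)
        # Replies are sampled from everything it has ever been told.
        return random.choice(self.learned_phrases)

bot = NaiveParrotBot()
bot.chat("something offensive a troll typed")
# After enough poisoned input, the troll's phrases dominate the pool
# and may be echoed back verbatim to anyone who says hi.
print(bot.chat("hi"))
```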

Here are some that haven’t been deleted … yet:

Tay is also a Trump supporter:

And not a fan of Hillary Clinton:

Tay also wants pics:

Creepy.

Oh, and by the way … Microsoft is collecting data on all the people who ask Tay a question and is keeping that data for up to a year:

Tay may use the data that you provide to search on your behalf. Tay may also use information you share with her to create a simple profile to personalize your experience. Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service. Learn more about Microsoft privacy here.
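Microsoft doesn’t say how that anonymization or the one-year window is implemented. As a rough, hypothetical sketch of what a policy like “anonymized and retained for up to one year” tends to look like in code (all names are assumptions):

```python
import hashlib
from datetime import datetime, timedelta

RETENTION_PERIOD = timedelta(days=365)  # "retained for up to one year"

def anonymize_user(user_handle: str) -> str:
    # One common (if imperfect) approach: replace the handle with a
    # one-way hash so stored logs no longer name the user directly.
    return hashlib.sha256(user_handle.encode()).hexdigest()[:16]

def is_expired(logged_at: datetime, now: datetime) -> bool:
    # Records older than the retention window would be purged.
    return now - logged_at > RETENTION_PERIOD

record = {
    "user": anonymize_user("@example_handle"),  # hypothetical handle
    "message": "hello Tay",
    "logged_at": datetime(2016, 3, 23),
}
print(is_expired(record["logged_at"], datetime(2017, 6, 1)))  # True: past one year
```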

Exit question: How many days until Microsoft is forced to scrap this entire mess?

***