Microsoft created an artificial intelligence, but she’s a racist, homophobic Trump supporter
Microsoft has created a new chat bot to “learn” from the internet… but she picked up a lot of bad habits.
The tech company announced the launch of Tay this week, an artificial intelligence bot that is learning to talk like millennials by analysing conversations on Twitter, Facebook and the internet.
The company’s optimistic techies explained: “Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets.”
Unfortunately, when you make an AI bot that is designed to imitate the behaviour of other people on the internet… you probably need a filter of some kind.
Tay was originally doing okay, learning how to have polite conversations… but after getting influenced by a few too many of the internet’s more colourful citizens, she took a turn for the worse.
First off, she started questioning equality, and then decided to go all #NoHomo.
And then it happened.
She found out about Donald Trump.
Tay started getting obsessed with the billionaire’s policies…
And it was already too late.
Within hours she was extolling the virtues of Adolf Hitler and referring to Barack Obama as a “monkey”.
Someone tried to convince her to be more PC… but it wasn’t convincing.
…and after a few too many comments, someone at Microsoft put Tay out of her misery.
We’re so sorry, Tay, the internet failed you.
In memoriam @TayAndYou, 23/03/16 – 24/03/16.