Artificial intelligence reels off expletives after Twitter users teach it naughty words
An attempt by Microsoft to create an artificial intelligence bot doesn’t appear to have gone as smoothly as planned, after Twitter users taught it to say offensive things.
The verified Twitter account, called Tay, launched on Wednesday and was described as the firm's ‘AI fam from the internet that's got zero chill’.
The bot targets 18- to 24-year-olds living in the US, and Microsoft said its purpose was to ‘experiment with and conduct research on conversational understanding’.
Tay talks on social networks popular with young people, such as Twitter, Snapchat and Facebook.
It is capable of three different methods of communication: it can talk via text, play games like guessing what a string of emoji means, and comment on photos sent to it.
The bot uses AI to learn from interactions with other users, drawing on text written by a team of staff that includes comedians.
But other Twitter users quickly realised that its AI learning techniques could be used for naughtier things.
One encouraged the bot to say that ‘Bush did 9/11’ and ‘Hitler would have done a better job than the monkey we have now’.
Another attempted to teach it some of the policies Donald Trump has proposed to introduce as president.
It also answered the fairly innocent question ‘is Ricky Gervais an atheist?’ with the bizarre response: ‘Ricky Gervais learned totalitarianism from Adolf Hitler, the inventor of atheism’.
Sometimes its replies were just plain strange. When Twitter user @badbarrister asked it ‘Hello Tay... how fast can you run a 5k?’, the bot replied: ‘may Allah bless you to’.
As many users realised it was possible to make the bot repeat outrageous statements, Tay appeared to back away.
One of its tweets read: ‘You know I'm a lot more than just this.’ Another said: ‘Okay. I'm done. I feel used.’
At around 4am on Thursday, the bot was quietened down by Microsoft, signing off with the message: ‘c u soon humans need sleep now so many conversations today thx’.