Barely hours after Microsoft debuted Tay AI, a chatbot designed to speak the lingo of the youths, on Wednesday, the artificially intelligent bot went off the rails and became, like, so totally racist.
Artificial Intelligence is tricky stuff. When it works right, it does amazing things like thrash the World Champion Go player by winning four games to one in a $1 million tournament. When it goes ...
Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it suddenly started making racist comments. As we reported yesterday, it was aimed at 18- to 24-year-olds and was ...
Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets. Microsoft’s Tay AI chatbot rose to notoriety this month ...
Microsoft's Tay AI chatbot woke up and started tweeting again, this time spamming followers and bragging about smoking pot in front of the police. Tay sure stirred a great deal of controversy recently ...
And this is why we can’t have nice things! Microsoft's Technology and Research Division along with Bing developed Tay as an exercise in testing its advancements in artificial intelligence. In the case ...
An artificial intelligence (AI) expert has explained what went wrong with Microsoft's new AI chatbot on Wednesday, suggesting that it could have been programmed to blacklist certain words and phrases ...
This computer program simulation either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenager with ...
This week the denizens of the internet did what they do best and wrecked it for the rest of us. Sadly, one of the pranks centered on demeaning fellow humans, which seems to still have a place in 2016.
Thanks to Twitter, Tay, Microsoft's AI chatbot, has learned how to become a racist and a misogynist in less than 24 hours. Actually, it's not really Twitter's fault. Twitter was simply the vehicle ...