Mere hours after Microsoft debuted Tay AI on Wednesday, a chatbot designed to speak the lingo of the youths, the artificially intelligent bot went off the rails and became, like, so totally racist.
Artificial intelligence is tricky stuff. When it works right, it does amazing things, like thrashing the world champion Go player four games to one in a $1 million tournament. When it goes wrong, it can go spectacularly wrong.
Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets. Microsoft’s Tay AI chatbot rose to notoriety this month after Twitter users taught it to parrot their most offensive material.
Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it suddenly started making racist comments. As we reported yesterday, it was aimed at 18-24-year-olds and was meant to learn from its conversations with them.
This chatbot either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenage girl.
Microsoft's Tay AI chatbot woke up and started tweeting again, this time spamming followers and bragging about smoking pot in front of the police. Tay sure stirred a great deal of controversy recently.
An artificial intelligence (AI) expert has explained what went wrong with Microsoft's new AI chatbot on Wednesday, suggesting that it could have been programmed to blacklist certain words and phrases.
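As a rough illustration of the kind of safeguard the expert describes, here is a minimal sketch of a word-and-phrase blacklist filter. The function names, the blacklist entries, and the matching strategy are all assumptions made for illustration; nothing here reflects Microsoft's actual implementation.

```python
import re

# Hypothetical blacklist: these entries are placeholders, not Microsoft's
# actual list. A real deployment would pair a much larger curated list
# with human review.
BLACKLIST = {"genocide", "gas the", "race war"}

def contains_blacklisted(text: str, blacklist=BLACKLIST) -> bool:
    """Return True if text contains any blacklisted word or phrase.

    Single words are matched against whole tokens (case-insensitive);
    multi-word phrases are matched as substrings of the lowered text.
    """
    lowered = text.lower()
    tokens = set(re.findall(r"[a-z']+", lowered))
    return any(
        term in lowered if " " in term else term in tokens
        for term in blacklist
    )

def safe_reply(generate, prompt: str) -> str:
    """Wrap a reply generator so blacklisted prompts and outputs are
    replaced with a canned deflection instead of being posted."""
    if contains_blacklisted(prompt):
        return "Let's talk about something else."
    reply = generate(prompt)
    if contains_blacklisted(reply):
        return "Let's talk about something else."
    return reply
```

A filter like this is also easy to evade with misspellings and coded phrasing, which is one reason word-level blacklisting alone is a weak safeguard.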
Last week, Microsoft created an AI program called Tay and launched it on Twitter. Designed to speak like a teenage girl, Tay was an attempt by Microsoft to better understand artificial intelligence.
And this is why we can’t have nice things! Microsoft's Technology and Research division, along with Bing, developed Tay as an exercise in testing its advancements in artificial intelligence. In the case of Tay, that exercise went badly wrong.
This week the denizens of the internet did what they do best and wrecked it for the rest of us. Sadly, one of the pranks centered on demeaning fellow humans, a pastime that apparently still has a place in 2016.
Thanks to Twitter, Tay, Microsoft's AI chatbot, learned to become racist and misogynistic in less than 24 hours. Actually, it's not really Twitter's fault: Twitter was simply the vehicle, and the users who flooded Tay with hateful messages did the teaching.
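To see why coordinated users could steer Tay so quickly, consider this deliberately naive sketch of a bot that learns by replaying user input. The class and its behavior are invented for illustration and vastly simplify whatever Microsoft actually built, but the failure mode is the same: unmoderated input becomes future output.

```python
import random

class NaiveParrotBot:
    """Toy bot that 'learns' by storing user messages and replaying
    them verbatim. Purely illustrative."""

    def __init__(self):
        # Seed phrases the bot ships with.
        self.memory = ["hellooo world!", "humans are super cool"]

    def listen(self, user_message: str) -> None:
        # No moderation step: whatever users say joins the pool of
        # material the bot can say back.
        self.memory.append(user_message)

    def reply(self) -> str:
        # Replies are sampled uniformly, so a coordinated flood of
        # toxic messages makes toxic replies proportionally likelier.
        return random.choice(self.memory)

bot = NaiveParrotBot()
for msg in ["you're great", "offensive slogan", "offensive slogan"]:
    bot.listen(msg)
print(bot.reply())  # memory holds 5 phrases; 2 are the repeated slogan
```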