
Microsoft’s Tay AI had a dirty mouth and had to be put to bed

by Todd Haselton | March 24, 2016


Microsoft launched a new “Tay” artificial intelligence engine yesterday. She was supposed to be fun, smart and kind – just like all of my failed Tinder dates. (I’m kidding, I’m happily married.) Instead, she turned out to be a vulgar and disgusting robot. I’m not going to quote anything here; it was that bad.

For whatever reason – I’m not sure how – Tay started spouting off racist, anti-Semitic and all-around not nice things to her followers and others. Maybe she was hacked, maybe she was learning from all the naughty things the Internet spewed at her, but Microsoft hadn’t planned on this happening. I’m not even sure why she was able to say some of the things she said – you’d think Microsoft would have limited her vocabulary.

https://twitter.com/TayandYou/status/712856578567839745

Microsoft ultimately “put her to bed” after a long, tiring day. It’s unclear how long Tay will be grounded, but she said enough to get shipped off to boarding school, in my opinion.




Todd Haselton

Todd Haselton has been writing professionally since 2006 during his undergraduate days at Lehigh University. He started out as an intern with...