Microsoft’s Tay AI bot has had a rough life so far. She launched just last week and, mere hours after hitting Twitter, she started to hang out with the wrong crowd. Instead of learning more about users, as she was supposed to, she was tricked into repeating some awfully naughty things. Ultimately, Microsoft had to put her to bed. She relaunched recently, but she didn’t learn much from her mistakes.
Hours after launching again, she was caught back at it – making a drug solicitation offer to a police officer that I won’t repeat here. Then, as if repeating knowledge instilled within her by her parents, she started telling everyone “You are too fast, please take a rest.” Perhaps that was a bit of introspection, but more likely she was bombarded by too many users.
Poor Tay. She’s grounded again. Maybe this time she’ll learn better.