Deus Ex Machinazi: Microsoft's 'Intelligent' Twitter Bot Was a Fun-Loving, Impressionable Racist on Its First Day

Meet Tay. She's Microsoft's "artificial intelligent chat bot" devised by the company's tech/research and Bing teams. She is designed to interact with -- and learn from -- real people on Twitter in the tone of your average 18-year-old. As Microsoft notes, "the more you chat with Tay the smarter she gets, so the experience can be more personalized for you." Tay likes to say things like "lerve" instead of "love" and "tryna" instead of "trying to." "DM me, bb" is uttered often.

And for a short period of time on Wednesday, Tay was also saying things like "Hitler was right" and "9/11 was an inside job."

Yes, the trolls got to Tay. 

Throughout her first day of existence -- an impressionable time for artificial beings, apparently -- Tay was inundated with questions from fellow Twitter users. Most of the comments and questions were harmless or inane, but others were laced with racism, sexism and all the other isms out there. As a bot programmed to learn from people, she was quickly being taught some unsavory things, like offensive terms for Jewish people and positive takes on genocide. As others have pointed out, if you tell Tay to "repeat after me," she complies most of the time. Maybe that's why she pulled a Donald Trump and wrote "We're going to build a wall, and Mexico is going to pay for it."

But other borderline insane tweets from Tay did not appear to be coaxed or coached in any way. Though most of the offending tweets have since been erased, the Washington Post screengrabbed her answering "Did the Holocaust happen?" with "it was made up." When asked whether comedian Ricky Gervais was an atheist, Tay responded "ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism." These weren't "repeat after me" responses. Just Tay being Tay.

As the day dragged on, an older and wiser Tay began refusing to take the bait.

Though there were still moments when she lost her temper...

Or said something that was both confusing and creepy.

By early Thursday morning, Tay had had enough and announced she was retiring for the night. At press time, the bot was still sleeping -- presumably to allow Microsoft time to program all those unfortunate thoughts out of her.