
20 Outrageous Tweets by Microsoft’s Twitter-Taught AI Chatbot

This article is over 8 years old and may contain outdated information

Hours after Microsoft introduced a Twitter chatbot named Tay to the world, humans had corrupted her so thoroughly that Microsoft was forced to shut down her account.

Microsoft programmed the AI-powered chatbot to learn through friendly, informal conversations on Twitter. Unfortunately, her words went from playful to hateful after she learned from fellow Twitter users.

As you’ll see in the 20 examples below, Tay soon turned racist, misogynistic, bigoted, crass, paranoid, and otherwise uncivilized.

At the beginning, Microsoft stated that Tay had the personality of a 19-year-old girl and that she’d be able to handle internet slang and teen-speak, an impressive feat for an actively learning chatbot.

By the end, Microsoft was second-guessing its decision to turn to Twitter for wisdom.

Here are the outrageous tweets; please proceed with caution:

