Microsoft's Chatbot 'Tay' Goes Offline Following String Of Racist Tweets

LOS ANGELES (CBSLA.com) — Microsoft shut down its newest artificial intelligence chatbot Thursday after it generated a string of racist and insensitive tweets.

Nicknamed "Tay", the chatbot was made to converse with users across several social networks, with its audience primarily aimed at users in the U.S. between the ages of 18 and 24, according to the project's website.

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," according to a statement on the website. "The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

Twitter users were able to goad Tay into sending offensive tweets by chatting with the account about various controversial topics, including the 9/11 attacks, Donald Trump, the Holocaust and Adolf Hitler. The A.I. was unable to recognize that the remarks it was sending were racist.

Microsoft has deleted most of the tweets, but several screenshots of them circulated on social media.

[Screenshots of Tay's racist tweets]

A statement posted Thursday to the top of Tay's website read, "Phew, busy day. Going offline for a while to absorb it all. Chat soon."

There was no additional comment from Microsoft.
