LOS ANGELES (CBS Local) — Facebook has shut down a pair of artificial intelligence chatbots after they invented their own language.
Earlier this year, the research team at Facebook Artificial Intelligence Research built a “chatbot” that was supposed to learn how to negotiate by observing and imitating human trading and bartering practices.
But when the researchers pitted two of the AI programs, nicknamed Alice and Bob, against each other to trade, the bots began to engage in their own form of communication.
Their conversation “led to divergence from human language as the agents developed their own language for negotiating,” the researchers said.
The researchers tasked the bots with trading hats, balls and books, determining the value of each object and bartering over them with each other.
However, since Facebook’s researchers provided no incentive for trading in English, the programs immediately created their own terms for cutting deals.
“There was no reward to sticking to English language,” Facebook researcher Dhruv Batra told FastCo. “Agents will drift off understandable language and invent codewords for themselves. Like if I say ‘the’ five times, you interpret that to mean I want five copies of this item. This isn’t so different from the way communities of humans create shorthands.”
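Batra's "say 'the' five times" example amounts to a simple repetition code. The sketch below is purely illustrative and is not Facebook's actual system; the codeword mapping and both function names are invented for this toy example, which only shows how repeating a token can stand in for a quantity.

```python
# Toy illustration of a repetition code like the one Batra describes:
# an item is mapped to an invented codeword, and the number of repetitions
# of that codeword encodes how many copies are wanted.

CODEWORDS = {"ball": "the", "hat": "i", "book": "me"}  # invented mapping
ITEMS = {v: k for k, v in CODEWORDS.items()}           # reverse lookup

def encode_request(item: str, count: int) -> str:
    """Encode 'I want <count> of <item>' by repeating the item's codeword."""
    return " ".join([CODEWORDS[item]] * count)

def decode_request(message: str) -> tuple:
    """Recover the item and count from a repeated-codeword message."""
    tokens = message.split()
    return ITEMS[tokens[0]], len(tokens)

print(encode_request("ball", 5))              # prints: the the the the the
print(decode_request("the the the the the"))  # prints: ('ball', 5)
```

A scheme like this is opaque to anyone who does not know the mapping, which is why such drifted "languages" read as gibberish to human observers.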
When the bots communicated with humans, most people were not aware that they were speaking to an AI instead of an actual person, the researchers said.
After shutting down the robot conversation, Facebook said the AI project marked important progress toward “creating chatbots that can reason, converse, and negotiate, all key steps in building a personalized digital assistant.”
But the researchers also said it is impossible for humans to comprehend the AI language and translate it into English.
“It’s important to remember, there aren’t bilingual speakers of AI and human languages,” Batra said.
Last year, Microsoft was forced to shut down “Tay,” its newest artificial intelligence chatbot, after it generated a string of racist and insensitive tweets.