It was a quiet news day at Facebook
August 03, 2017
The media has been awash with news that researchers working at Facebook Artificial Intelligence Research (FAIR) found it necessary to pull the plug on two chatbots when it was discovered they were communicating in a language only the chatbots could understand.
The aim of the research was to train two bots to negotiate autonomously with one another over the ownership of virtual objects. It was necessary to allow the chatbots, named Alice and Bob, to experiment with language to work out ways of gaining an advantage in the negotiation process. A typical exchange, made up entirely of English words, went something like this:
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i i can i i i everything else . . . . . . . . . . . . . .
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i . . . . . . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Many of the news reports spoke of a new language, developed by the bots to fool us humans; clearly they are plotting behind our backs. A more likely explanation for the bizarre sentence structure is that the neural networks, given the freedom, optimized their use of language to make the exchanges more efficient.
In case you are interested, the authors have released a PyTorch implementation of the research paper, Deal or No Deal? End-to-End Learning for Negotiation Dialogues, which is available on GitHub.
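To give a flavour of the task, here is a minimal, rule-based sketch of the underlying negotiation game: two agents split a small pool of items (books, hats and balls, as in the paper), each with its own private values, and each tries to maximise the value of what it keeps. This is not the FAIR code, which uses sequence-to-sequence models trained end to end with reinforcement learning; the item pool, value ranges and acceptance rule below are purely illustrative assumptions.

# Illustrative toy only, not the FAIR implementation.
import random

POOL = {"book": 2, "hat": 1, "ball": 3}          # items on the table (assumed counts)

def random_values():
    """Give an agent a private value between 0 and 5 for each item type."""
    return {item: random.randint(0, 5) for item in POOL}

def total(values, allocation):
    """Value of an allocation under an agent's private values."""
    return sum(values[i] * allocation[i] for i in allocation)

def negotiate(values_a, values_b, max_rounds=10):
    """Agent A starts by claiming everything, then concedes one item per round.
    B accepts if its share is worth at least half of the whole pool to it.
    Returns (A's share, B's share), or (None, None) if no deal is reached."""
    claim = dict(POOL)                            # A's current claim
    for _ in range(max_rounds):
        share_b = {i: POOL[i] - claim[i] for i in POOL}
        if 2 * total(values_b, share_b) >= total(values_b, POOL):
            return claim, share_b                 # B accepts the split
        claimed = [i for i in claim if claim[i] > 0]
        if not claimed:
            break
        # A concedes one unit of its least-valued claimed item
        cheapest = min(claimed, key=lambda i: values_a[i])
        claim[cheapest] -= 1
    return None, None

if __name__ == "__main__":
    a, b = random_values(), random_values()
    share_a, share_b = negotiate(a, b)
    print("A's values:", a)
    print("B's values:", b)
    print("Deal:", share_a, share_b)

The real system replaces the hand-written concession rule with learned dialogue policies, and it is the freedom to generate the dialogue itself that produced the strange-looking exchanges above.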
But it’s high summer and slap bang in the middle of the silly season so this is obviously evidence that things are about to turn seriously nasty. Now HAL, about those pod bay doors… HAL?