There has been a real flurry of interest over the last couple of days in a pair of chatbots (reassuringly named "Bob" and "Alice") developed by Facebook AI Research. Reports have been flying around of these robots creating their own sinister coded language, along with incomprehensible snippets of intriguing exchanges between the two of them. One example is this:

Bob: I can i i everything else

Alice: balls have zero to me to me to me to me to me to me to me to me to

Bob: you i everything else

Alice: balls have a ball to me to me to me to me to me to me to me to meca

A little trepidation around having two bots (backed by neural networks) talk to each other like this is perhaps understandable, but it's important to understand that they were only really doing what they were told, and the implications are far less sinister than some more hysterical corners of the media would have you believe.

As a software engineer, I have found it quite amusing to sit back and watch the droves of articles predicting an oncoming robotic revolution – I'm afraid the whole Terminator scenario remains incredibly unlikely.


The thing about Bob and Alice is that, despite their friendly names, they were only given one job to do: specifically, to negotiate. Initially, a simple user interface facilitated conversations between one human and one bot – conversations about negotiating the sharing out of a pool of resources (books, hats and balls).

These conversations were necessarily conducted in English, this being the language of the human – "Give me one ball, and I'll give you the hats", and so on. I'm sure many thrilling discussions were had.
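To make the setup concrete, here is a toy sketch of the kind of task Bob and Alice were given – this is not FAIR's actual code, and the agent names, valuations and the greedy "negotiated outcome" are all hypothetical stand-ins. The essential ingredients are a shared pool of items and private, per-agent valuations against which each agent scores the final split.

```python
# Toy sketch of a resource-splitting negotiation task (hypothetical
# values throughout, not Facebook AI Research's implementation).
# A shared pool of items is divided between two agents, each of which
# scores the outcome against its own private valuation per item type.

POOL = {"books": 3, "hats": 2, "balls": 1}

# Hypothetical private valuations (points per item) for each agent.
VALUES = {
    "bob":   {"books": 1, "hats": 3, "balls": 1},
    "alice": {"books": 2, "hats": 1, "balls": 4},
}

def greedy_split(pool, values):
    """Assign each item type wholesale to whichever agent values it more.
    A stand-in for a negotiated outcome, not a negotiation strategy."""
    allocation = {agent: {} for agent in values}
    for item, count in pool.items():
        winner = max(values, key=lambda agent: values[agent][item])
        allocation[winner][item] = count
    return allocation

def score(agent, allocation, values):
    """Total points an agent earns from its share of the pool."""
    return sum(values[agent][item] * n
               for item, n in allocation[agent].items())

alloc = greedy_split(POOL, VALUES)
print(alloc)                          # hats go to Bob; books and balls to Alice
print(score("bob", alloc, VALUES))    # 6
print(score("alice", alloc, VALUES))  # 10
```

The interesting part in the real research is that the agents had to *talk* their way to such an allocation, each without seeing the other's valuations – which is exactly the pressure that pushes the dialogue towards a terse, efficient shorthand.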

The really interesting part revolves around what happened next, when the bots were directed at each other. The way they talked to each other became impossible for humans to understand. As others have pointed out, this is not a huge surprise – it has been observed in several other AI contexts already. In fact, it echoes the very specific (and, initially, quite unintelligible) language that already exists around various trading practices, military operations and even amateur radio. I remember my first trip to the amateur radio society at school (yes, I was that kid) – I didn't understand a word!

In all those situations, the means of communication has adapted beyond its origin language to serve a very specific purpose with maximum efficiency. What Bob and Alice have developed is the AI equivalent.


I think it's important to remember that robots talking in an incomprehensible language isn't something new – they are very much already among us. In fact, you've probably interacted with one.

One of the tools millions of us use on our computers and smartphones – Google Translate – is in fact in part powered by a neural network, and it was revealed a few months ago that this network uses its own sort of "intermediary" language to translate between pairs of languages it has never seen paired together in training. This "interlingua" enables it to operate effectively – it is "specific to the task of translation and not readable or usable for humans". This is another demonstration of the principle behind the Bob and Alice exchanges.