THERE IS NOTHING new about either fake news or Russian disinformation campaigns. Back in 1983, at the height of the cold war, an extraordinary story appeared in a little-known pro-Soviet newspaper called the Patriot. It claimed to have evidence that the Pentagon had deliberately created AIDS as a biological weapon and was ready to export the virus to other countries, mainly in the developing world, as a way of gaining control over them. Within a few years the story had reappeared in mainstream publications in more than 50 countries.

In February last year, in the wake of revelations about Russia’s interference in America’s presidential election but before the full extent of its activities on Facebook, Twitter and Google had become known, the Russian defence minister, Sergei Shoigu, announced that he had created units within the army to wage an information war: “Essentially the information conflict is a component of general conflict. Deriving from that, Russia has made an effort to form structures that are engaged in this matter.” He added that these were far more effective than anything Russia had used before for “counter-propaganda” purposes. A week earlier, General Petr Pavel, the Czech head of NATO’s military committee, had revealed that a false report of a rape by German soldiers in Lithuania had been concocted by Russia.

The internet and social media are creating entirely new opportunities for influence operations (IO) and the mass manipulation of opinion. These technologies allow IO to target precisely the people most likely to be susceptible to their message, taking advantage of the “echo-chamber” effect of platforms such as Facebook, where users see only news and opinions that confirm their prejudices.

Facebook now estimates that during and after the American election in 2016 a Russian-linked troll farm called the Internet Research Agency was responsible for at least 120 fake pages and 80,000 posts that were directly received by 29m Americans. Through sharing and liking, the number multiplied to nearly 150m, about two-thirds of the potential electorate. The ads aimed to exploit America’s culture wars. Similar IO have been launched in Europe, where Russia attempts to bolster support for populist movements that oppose liberal social norms.

It is not just Russia that conducts IO against other countries. Jihadist extremists and hacker groups employed by rogue states or criminal networks pose similar if lesser threats. And although the big social-media companies now claim to be working on solutions, including better and quicker attribution of messages, Russian IO techniques are bound to adapt accordingly. Rand Waltzman, a former programme manager at America’s Defence Advanced Research Projects Agency (DARPA) and now at the RAND Corporation, explains that “when target forces start to counter these [Russian] efforts and/or expose them on a large scale, the Russians are likely to accelerate the improvement of their techniques…in other words, an information-warfare arms race is likely to ensue.”

In the future, “fake news” put together with the aid of artificial intelligence will be so realistic that even the best-resourced and most professional news organisations will be hard pressed to tell the difference between the real and the made-up sort. Official websites and social-media accounts will become increasingly vulnerable to hackers, who may be able not only to provoke stockmarket crashes and riots but even to contrive crises between countries that may induce them to go to war with each other.