We’ve all gone a bit bot-mad in the past few weeks. Automated accounts have invaded our civic life – especially pesky Russian ones – and politicians on both sides of the Atlantic have woken up to the fact that a new propaganda war is taking place online.

Bots – short for robots – are essentially accounts that can be programmed to post, share, retweet, or do whatever else their creator chooses, automatically. Creating a bot is extremely easy, and they can be bought in bulk on dark net markets for next to nothing. There are millions of harmless bots out there doing all sorts of helpful and funny things, including breaking news stories. But Russia twigged early that bots can also be usefully deployed to influence public opinion, and it has been using them for years. During the Senate investigation into Russian meddling in the US election, Twitter revealed the handles of some 36,746 Russian-linked bots. They had tweeted a total of 1.4 million times in the two months before the election, and those tweets had been viewed almost 300 million times.

This new world of pseudonyms, virals and digital public opinion is becoming murky. It’s not always easy to tell humans and bots apart, because some bots behave like humans – and some humans behave like bots. One academic report earlier this year tried to measure Labour bots during the election, on the assumption that any account tweeting more than 50 times a day on a single hashtag is a bot. Colleagues at Demos and I took a closer look, and it turned out that many of these ‘bots’ were in fact fanatical Labour supporters tweeting so frenetically that they looked like machines. Equally, improvements in machine learning mean bots are looking more and more human. Soon it will be very difficult to tell them apart.
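For readers curious what that kind of frequency heuristic looks like in practice, here is a minimal sketch. It assumes tweets arrive as simple (account, hashtag, date) records – the function name, threshold constant and sample data are all illustrative, not from the report itself – and it shows exactly why fanatical human supporters get swept up: the rule only counts volume, not who is typing.

```python
from collections import Counter

# Threshold described above: more than 50 tweets in one day
# on a single hashtag flags the account as a suspected bot.
DAILY_LIMIT = 50

def flag_suspected_bots(tweets, limit=DAILY_LIMIT):
    """tweets: iterable of (account, hashtag, date) tuples (illustrative schema).
    Returns accounts exceeding `limit` tweets on one hashtag in one day."""
    counts = Counter(tweets)  # count each (account, hashtag, date) combination
    return {account for (account, hashtag, date), n in counts.items() if n > limit}

# A frenetic account tweeting 60 times on one hashtag is flagged –
# whether it is a machine or a very keen human supporter.
sample = [("@frenzied_fan", "#GE2017", "2017-06-01")] * 60 \
       + [("@casual_user", "#GE2017", "2017-06-01")] * 30
print(flag_suspected_bots(sample))  # {'@frenzied_fan'}
```

The weakness is plain from the code: nothing distinguishes automation from enthusiasm, which is exactly the false-positive problem the Demos re-analysis found.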

Far more worrying than bots, though, are the paid content producers. A decent amount of the Russian interference appears to emanate from the Russian ‘troll factory’ in St Petersburg, where hundreds of people work 12-hour shifts spreading information that supports the Kremlin’s line, for salaries of between roughly £575 and £830 a month. My guess is that a lot of the accounts they run are cyborgs – half bot, half human. A human operator runs thousands of accounts, adding the odd bit of human content to each bot in order to evade standard spam filters.

Russia isn’t the only country engaging in this. In the Philippines, members of the ‘keyboard army’ which supports Rodrigo Duterte earn $10 per day for pumping out pro-government spam; in China, the government employs hundreds of thousands of people to post millions of times, mostly to divert attention from subjects the government doesn’t like.

It’s impossible to estimate accurately the number of Russian state-sponsored accounts operating on Twitter and Facebook in the UK: academics here have suggested anything from 13,000 to 150,000 accounts. The much harder question is whether they had any bearing on the US election or the EU referendum.

This brings me to one particularly infamous Twitter account, with the username @DavidJo52951945. Over the past couple of years this ‘David Jones’ – a true patriot, Union Jack avatar – amassed tens of thousands of Twitter followers and tweeted thousands of times, nearly always divisive, pro-Trump and pro-Brexit content. A number of other Twitter users thought he looked suspicious, possibly a bot, and citizen sleuths started digging around. It turned out that Mr Jones was almost certainly a real person, with a very long record of tweeting content that pushed the Kremlin line – always between 8am and 8pm Moscow time. He tweeted every few minutes, even more during the events in Ukraine and Crimea, and one analyst found he was part of a web of thousands of accounts with similar usernames. The Times outed him as a Kremlin troll – but nobody is entirely sure.

While writing this I spotted another possible clue. People are generally lazy, and pick usernames that mean something. The last six digits of his username are 951945. Perhaps that is an auto-generated number of the kind many trolls and bots use. But it might also be 9/5/1945 – the most important date in recent Russian history, Victory Day. (Victory Day in western Europe is 8 May, but it was already past midnight in Moscow when Germany’s surrender took effect.) When people speculated that Dave was in fact Russian, he removed the number from his handle and now tweets as @DavidJoBrexit.

@DavidJo52951945 was familiar – I’d seen it before. This week I went back through some old work carried out by my Demos colleague Alex Krasodomski-Jones, who has been analysing political Twitter over the past couple of years. Lo and behold, Mr Jones’ name popped up in every one of our studies. His account appeared in tweets about the 2015 general election. He joined in with tweets about televised debates. He tweeted fervently about Brexit and the campaign to leave the European Union. When we ran a project examining criticism of Islam and migration in 2016, @DavidJo52951945 joined the conversation, as he did after the terror attacks in Westminster and London Bridge this year. He even joined an angry mob of Twitter users who took offence at comments made by Lily Allen in February this year. He was everywhere.

And while it’s easy to dismiss this as noise, Mr Jones was in fact extremely influential. Earlier this year we carried out a study into echo chambers online, sampling a random selection of 2,500 UK Twitter users who expressly supported one of the major parties. Within this sample, he was the ninth most retweeted account. On this measure – not perfect, but a pretty good random sample – his content was more widely shared than that of the Guardian and the Independent. And among our sample of Ukip supporters, he came in a close second to Nigel Farage himself.

Whether this translated into votes is another question entirely. There’s a new quasi-conspiratorial vogue among liberals for blaming things they don’t like on bots and trolls, possibly because it’s more convenient to blame Brexit or Trump on sinister Russian machinations than to face the uncomfortable possibility that they simply lost. But this misses the point. In the bot and troll-factory war, the aim is not always to change opinion directly. It is often to distract people, derail serious debate, sow confusion or anger, troll opponents, or create the impression of a genuine grassroots movement or social consensus.

On that measure, Dave succeeded in putting a lot of hyper-partisan, angry, divisive content in front of untold thousands of people. Our politics is angry and divisive enough as it is. And although bots probably didn’t swing the referendum, it ought to be worrying enough that what appears to be a paid Russian agent managed to tweet himself right into the middle of our political debate – and nobody had a clue.

Jamie Bartlett is director of the Centre for the Analysis of Social Media at Demos