This article is more than 11 months old

Britain needs to take concerted action to reduce the risk of malicious actors in the UK and abroad contaminating the results of a looming general election, according to a new study that warns of the risks of public “abuse and deception”.

A group of experts say government, political parties and social media companies all need to take immediate action, at a time when there is rising concern within Whitehall about the integrity of the democratic process.

Lisa-Maria Neudert, who acted as secretary to the Oxford Technology and Elections Commission, the body behind the study, said there was growing recognition that “manipulation and propaganda which was only thought to happen in authoritarian regimes can happen in democracies like the UK”.

The research calls for:

- The Electoral Commission to verify the social media accounts of all candidates and registered campaigners to prevent fraudulent accounts taking advantage.

- Political parties to monitor all the digital communications they produce and to maintain an archive in publicly accessible databases to ensure transparency.

- All social media companies, including Facebook and Twitter, to warn promptly when they suspect foreign interference or other meddling is taking place.

Concern has been rising in government circles about electoral integrity in the UK. The Cabinet Office is nominally responsible but intelligence agencies have been taking a growing interest in the rise of state disinformation online, principally from Russia and China.

There have been repeated warnings about Russian interference in the 2016 US presidential election, as well as questions about the impact of microtargeted internet advertising during the EU referendum. The commission’s researchers, at Oxford University, believe the risks remain current.

During the European election campaign of the early summer, they highlighted the sharing of “extremist, sensationalist or conspiratorial junk news” typically involving anti-immigration and Islamophobic sentiment across the continent.

One example cited was the Notre Dame Cathedral fire in Paris in April, which, while not immediately a political story, was manipulated during the campaign period for what amounted to political purposes by Russian and other social media accounts.

They unearthed a trail of propaganda furthering the false idea that the fire in the Gothic cathedral was started by Islamist terrorists, or that plans for reconstructing the Paris landmark would include a minaret.

“We want social media companies to help us understand why such stories are so widely shared by their algorithms,” Neudert said. “They should be required to release information so we can understand why such disinformation so easily spreads.”

Other specific risks highlighted included the use of “non-transparent”, or dark, advertising, where political actors can spend money targeting voters without any visibility unless a record is kept.

Facebook, for example, maintains an archive of political adverts, but not all technology companies do, and the reporting is not standardised, meaning the data “is usually rendered useless for statistical analysis because of inconsistent or incomplete metrics that make it impossible to compare and understand trends”.
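To illustrate why inconsistent metrics frustrate analysis, consider a minimal sketch of the kind of normalisation researchers must attempt. All field names, platform names and figures below are invented for illustration; real ad-archive exports differ by platform and over time.

```python
# Hypothetical illustration: two platforms report political-ad spend in
# incompatible ways (exact pence vs a banded range), so records must be
# mapped onto one shared schema before totals can be compared at all.

def normalise(record: dict, platform: str) -> dict:
    """Map a platform-specific ad record onto a common schema."""
    if platform == "platform_a":
        # Hypothetical platform reporting exact spend in pence.
        return {
            "advertiser": record["page_name"],
            "spend_gbp": record["spend_pence"] / 100,
            "impressions": record["impressions"],
        }
    if platform == "platform_b":
        # Hypothetical platform reporting only a spend band; the best an
        # analyst can do is take the midpoint, losing precision.
        lo, hi = record["spend_band_gbp"]
        return {
            "advertiser": record["funder"],
            "spend_gbp": (lo + hi) / 2,
            "impressions": record["views"],
        }
    raise ValueError(f"unknown platform: {platform}")

ads = [
    normalise({"page_name": "Campaign X", "spend_pence": 250_000,
               "impressions": 40_000}, "platform_a"),
    normalise({"funder": "Campaign X", "spend_band_gbp": (1000, 2000),
               "views": 30_000}, "platform_b"),
]

total_spend = sum(ad["spend_gbp"] for ad in ads)
print(f"Comparable total spend: £{total_spend:.2f}")  # → £4000.00
```

The banded record shows the underlying problem: once a platform publishes only a range, any cross-platform total is an estimate, which is what makes standardised reporting a precondition for meaningful statistical analysis.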

Ministers have outlined plans to require people to bring photo ID to polling stations in order to vote in a future election, but while the proposal was mentioned in the Queen’s speech, on Monday, it was not accompanied by a promise for wider reform of election law.