I’ve previously written about how elections consist of both the procedural and technical processes to collect and tally votes, and the arena of ideas where policies, platforms and promises are debated and discussed.

Of the two, I don’t worry too much about the security of vote-tallying procedures and infrastructure. The Australian Electoral Commission (AEC) is the body responsible for running federal elections, and it understands the importance of maintaining the integrity of the process.

Australia’s paper-based voting system is also an advantage—the US Senate Select Committee on Intelligence has recommended using a voter-verified paper trail in US elections—and the AEC can afford to be conservative in incorporating new technology to speed vote counting.

That’s no reason for complacency, of course, and the AEC must be resourced to deliver secure elections. But at least a responsible authority oversees this half of the election security equation, and the risks to election processes can be managed.

I worry much more about the arena of ideas. That public information space consists of a vast array of information producers and consumers, including citizens, traditional and new media organisations, advertisers, content aggregators (such as Facebook and Twitter), politicians and journalists.

Australia has a very strong interest in ensuring a robust public debate that results in the electorate being informed. Yet in this arena of ideas there’s no single stakeholder—or even an assemblage of players—with the incentives, responsibility and authority to ensure that the kind of public discourse that contributes to a healthy democratic election takes place.

Many politicians appear to want to win more than they want to maintain a healthy democracy.

Much more information has become public about how the information space around elections is manipulated. For example, there’s been the public indictment of Russian nationals and the Internet Research Agency (IRA) from Special Counsel Robert Mueller’s investigation into Russian interference in the 2016 US presidential election.

The Russians used fake personas, created hundreds of social media accounts, created and manipulated thematic groups on social media, used bots to amplify messages on social media, bought divisive political ads, manufactured political rallies and paid stunt actors, and created and spread fake news and disinformation.

Most recently, there’s the continuing fallout around Cambridge Analytica, a shady political consulting firm. Cambridge Analytica portrayed itself as a secretive organisation capable of data science wizardry that could swing elections by manipulating social media to prey on people’s hopes and fears. In addition, Cambridge Analytica’s CEO, Alexander Nix, claimed to be capable of organising covert operations to entrap opposition candidates.

Most concerning is that the vast majority of what Cambridge Analytica and the Russians did to influence the 2016 US election was legal and could be employed in future elections (both in the US and in Australia).

Using social media data to manipulate people’s emotions is entirely legal. Politicians trade in hope and fear as a matter of course, and it’s no surprise that they’re searching for technological solutions to boost their campaigns.

In fact, the Russians did only two things that were illegal. The first was to steal emails from Hillary Clinton’s campaign and spread them via WikiLeaks. The second was to be Russian. If they’d been US citizens, almost everything in their influence campaign would’ve been legal.

I’m not convinced that Cambridge Analytica or the Russian influence campaign made a huge difference to how Americans voted in 2016, but the vagaries of the Electoral College and the small margin in key states make it possible that they did swing the outcome in Trump’s favour.

Regardless of the magnitude of the effect, however, social media is playing an ever‑larger role in our public information space and in our elections. Some of the applications of social media and related technologies seem entirely benign—direct engagement between politicians and voters, for example. But there’s definitely a spectrum of uses, and some of those we’ve seen used by the Russians look to me to be universally unacceptable.

So influence operations and techniques that can be highly divisive—and hence corrosive—in a democracy, and that can be employed for political advantage, are not only legal, but are also difficult to identify at first glance. It would be easy for an unscrupulous political party to use these techniques through third parties to maintain plausible deniability.

These are all concerns that the Canadian government is taking seriously. Canadian intelligence agencies have produced reports with titles like Cyber threats to Canada’s democratic process and Who said what: the security challenges of modern disinformation. Canada’s Chief Electoral Officer has accepted responsibility for countering disinformation about how, when and where to vote. And Canada’s Treasury Board President, Scott Brison, has stated: ‘We also expect that social media platforms do everything they can and maintain a responsibility for defending the integrity of our electoral system.’

Relying on foreign social media companies seems a pretty weak thread on which to hang the security of our electoral system and of our democracy. It’s time for all Australian political parties to take this threat seriously.