VANCOUVER—Efforts to undermine the Canadian political process are virtually unstoppable in the murky, borderless digital world, say experts who study the spread of disinformation.

And Canada, which faces a federal election this fall, is struggling to contain hackers and disinformation campaigns, said Marcus Kolga, a senior fellow at the Macdonald-Laurier Institute’s Centre for Advancing Canada’s Interests Abroad.

“Unfortunately, we’re getting to this a little bit late,” said Kolga, an expert on the foreign policy applications of media. “We really should have been already talking about this two or three years ago, so it’s a huge uphill battle to get Canadians to start paying attention.”

The warning comes on the heels of a report this week from the federal government’s Communications Security Establishment that showed Canadian political parties, candidates and staff have already been the target of at least one state-sponsored hacking campaign ahead of the federal election.

The report warned the same “cyber interference” tactics hackers have used to target countries like the United States are likely being mobilized in Canada.

They include using hacked data to target a candidate’s campaign to try to sway voters by amplifying divisive issues or promoting one party over another.


These so-called “wedge campaigns” are driven by different players, said Samantha Bradshaw, a doctoral candidate at the Oxford Internet Institute and senior fellow at the Canadian International Council.

“We know there are all kinds of actors who are motivated for both political reasons but also economic reasons,” Bradshaw said. Countries like Russia, China and Iran all use conspiracy-theory websites, proxy organizations and internet trolls who act on behalf of foreign governments to help destabilize Western democracies.

They also include populist movements, such as the extreme wings of right and left parties, that “want to get their ideas on the agenda, or want to get more people talking and discussing their ideologies,” she added.

Some people are motivated purely by profit, she noted, pointing to the oft-cited example of Macedonian teenagers, some of whom earned $50,000 a month in advertising revenue by driving traffic to false news websites during the 2016 U.S. presidential election.

The line between foreign attacks and legitimate public opinion is not easy to identify. State-backed agents use proxy organizations such as local embassies as well as former diplomats, media professionals and academics — people Kolga said are often termed “useful idiots” — to promote agendas.

“You can’t stop a Canadian who might be affiliated with one of these organizations — and who might share the views of the Kremlin or Beijing on these issues — from going and spending a hundred dollars of their own money on promoting specific stories,” he said. And if the government tried to censor them, it’s a “slippery slope” that could threaten freedom of speech, which is at the very core of democracy, he noted.

Bots — such as automated Twitter accounts that like, follow, tweet, retweet or direct message other accounts — are also used to amplify divisive issues. An analysis of some of the nearly 10 million tweets released by Twitter in recent months — from thousands of banned accounts, the vast majority originating in Russia — shows the Canadian issues targeted by bots include the anti-vaccine and anti-pipeline movements, and anti-LGBTQ and anti-immigration narratives.

Bots also drive engagement by gaming the algorithms of platforms like Google, Facebook and Twitter through the relentless repetition of keywords, phrases and web addresses, Bradshaw said.

“The reason why bots will often do that is to generate more organic engagement. It’s part of the Search Engine Optimization strategy,” she said.

The more key words or URLs are shared, the more likely it is that users will encounter them online. As more people engage with the information, it organically appears higher up in places such as Google News, YouTube’s recommendation feed or on Twitter or Facebook.

“All the platforms are based on virality and popularity and relevance. That’s part of the reason why some bots just amplify the same messages and key words. Sometimes the tweets don’t even make sense. They’re just a bunch of key words because it’s all part of that strategy,” Bradshaw explained.


But trying to uncover who is behind the bots is nearly impossible. While Bill C-76 — which seeks to promote transparency in social-media advertising — is a step in the right direction, Kolga said foreign powers know how to cover their tracks.

“They’re not going to start using the Kremlin credit card when they’re buying the advertisement to try to shift opinion,” he said.

Both Kolga and Bradshaw said promoting media literacy is one way Canadians can build defences against foreign influence. But they said the government needs to take more action.

Kolga thinks major political parties should share what they know about disinformation campaigns, or sign a code of conduct pledging not to spread misinformation even if it gives them a political advantage.

Bradshaw suggested the government should regulate the design of social-media algorithms and oversee companies’ rules around misinformation.

According to Canada’s Minister of Democratic Institutions, “all options are on the table” when it comes to addressing electoral meddling.

Karina Gould said the government is currently looking at several models used by “like-minded” countries.

Australia has a stringent law that requires social-media companies to remove violent content or face fines, and under which executives can face jail time. A U.K. Parliamentary committee, meanwhile, suggested social-media companies should be fined over “online harms” and overseen by an independent regulator tasked with protecting users and administering penalties.

Gould said Canada is also considering regulations for online platforms, including Google and Facebook. One option is a code of conduct that would require them to report regularly to the government ahead of elections on how they are fighting misinformation and inauthentic behaviour, and to shut down known bot accounts or sites used by foreign actors to manipulate opinion.

“I’m not satisfied with the directions of those conversations,” she said, adding that the discussions are far from over.

On Tuesday, Canada’s National Security and Intelligence Committee of Parliamentarians announced an investigation into foreign interference and espionage in Canada. The committee chair said its results will be released before the fall federal election.

But disinformation will never be extinguished until Canadian society deals with divisive issues that marginalize some people and bestow privilege on others, said Bradshaw.

“Polarization is not new, same with distrust in our media,” she said. “If we want to address disinformation and polarization, we need to target those problems, which might not necessarily be a social-media response.”

With files from Alex Boutilier and Marco Chown Oved
