Social media is, to a very large extent, the new public square. Politicians who master it – from Donald Trump to Justin Trudeau – are the stars of the current era.

Yet the companies that dominate social media – Facebook, Twitter and Google – have been slow to acknowledge their power or accept responsibility for how their platforms are used. And governments have been just as slow to react to the dangers posed when social media is used to manipulate public opinion and influence elections.

That’s starting to change. Both Facebook and Twitter have just announced measures in Canada to make public more information about political advertising on their sites. Facebook Canada also launched what it calls its “Canadian Election Integrity Initiative” – essentially educating political players on how not to be hacked, and citizens on how to consume digital media without being led astray by purveyors of so-called “fake news.”

The social media giants clearly understand that if they don’t get their houses in order, then someone else will do it for them. They’re too powerful to be left alone.

Still, the steps they have taken so far fall far short of what is needed, given their enormous influence and the potential risks to our democracy.

That’s true in the United States, where congressional investigators have discovered that groups linked to Russia placed thousands of ads on Facebook designed to influence the 2016 presidential election. And it’s true in Canada, where a Senate committee this summer warned of holes in our election laws that could allow for foreign interference, especially through social media. At the same time, Canada’s Communications Security Establishment warned of increasing cyber threats to the electoral system.

In the U.S., some legislators aren’t prepared to let the big tech companies simply regulate themselves any longer. Two Democratic senators, joined by Republican John McCain, have introduced legislation that would bring the same level of accountability and regulation to online political advertising as now exists for political ads on television and radio.

Their bill, called the “Honest Ads Act,” would require social media and internet companies to make public detailed information about political advertisers who spend more than $500 on their platforms. That would allow voters and politicians to see who’s behind the ads, what their target audience is, and how often they are viewed. The bill would also update U.S. laws to give online political ads the same level of protection against foreign interference as old-style ads.

These are sensible measures that would bring U.S. election regulations more in line with the reality of today’s political campaigning, which is waged online as much as at traditional rallies or on TV.

Canadian legislators should watch and learn. Facebook and Twitter deserve credit for taking steps in this country to bring more openness to political advertising on their platforms. But it shouldn’t be entirely up to ordinary voters, or even political operatives, to figure out whether they’re being manipulated online by groups masking their true identity.

There’s a role for government in regulating online advertising, at least to the same standards of accountability as exist for ads in print or traditional broadcast media.

At a minimum, Ottawa should give Elections Canada expanded powers to oversee digital political campaigns. The agency should be able to force internet companies to disclose more information about political ads, including who paid for them, where they were placed and who saw them.

All this would just shed more light on what’s going on in the murky world of political “dark advertising.” It would not – and should not – police the actual content of ads, aside from banning outright hate speech.

That’s a much trickier task. Facebook and Google have both introduced third-party fact-checking systems to weed out so-called “fake news,” with largely disappointing results. In the end, it’s still up to citizens to figure out what, and who, to believe. But they should know far more than they do now about where the information comes from.