Every time you use Facebook, the social media platform learns something about you. From your location, the type of computer or mobile device you use, your relationship status, the TV shows you recommend, the articles and personalities you “like” and dozens of other data points, Facebook can derive with creepy reliability not only your preferences and habits, but also your aspirations and anxieties.

This information is the core of the social media giant’s business. It allows advertisers to target with extraordinary precision the users most likely to be persuaded to purchase particular goods and services. Indeed, with its vast stores of personal data and its expertise at exploiting them, the company has built a tool for micro-targeted persuasion the likes of which the world has never seen.

The case of Cambridge Analytica, an American data analytics company that harvested information from some 50 million Facebook profiles and allegedly used it to influence the outcomes of the 2016 U.S. presidential election and the Brexit referendum in the United Kingdom, provides the latest in a series of cautions about the dangers this tool poses, if unregulated, both to our privacy and to our political process.

The revelations are yet another warning that Facebook is reshaping democracies in ways democracies have not thought enough about; that the company either does not understand or does not take seriously the responsibility that comes with this extraordinary power; and that legislators, including those in Canada, have been too slow to adapt our privacy and elections laws to a changing world.

Consider what the Cambridge Analytica story says about just how easily the laxity of Facebook’s privacy protections can be exploited and how effectively information thus gleaned can be used in the cause of subverting democracy.

As reported by the New York Times and The Observer of London, the company, in partnership with Aleksandr Kogan, a Cambridge University academic, launched a personality-test app on Facebook which they used to mine the personal information of the hundreds of thousands who answered its questions, as well as of millions of the respondents’ Facebook friends.


The company then used that information, along with Facebook’s advertising infrastructure to, as The Observer wrote, “raise anxiety, reinforce prejudices, suppress turnout, amplify partisanship and increase the reach of misinformation and conspiracy theories.” Or as the company’s co-founder, Chris Wylie, a Canadian who blew the whistle on the operation, put it, “We exploited Facebook to harvest millions of people’s profiles… and target their inner demons.”

Cambridge Analytica’s actions are indefensible. But in recent days the conversation has rightly shifted to whether Facebook is taking sufficient responsibility given its enormous power.

The company insists the data mining was conducted “in a legitimate way,” according to its rules, and certainly did not constitute a data breach. It doesn’t seem to understand that that’s hardly exonerating. If anything, it’s worse. How many other operations are similarly exploiting Facebook’s rules?

The company does acknowledge that its rules were broken when the information was shared with a third party. But when the company learned of this two years ago, it simply asked that the information be destroyed and trusted Cambridge Analytica when it said it had, though evidence suggests it hadn’t.

Perhaps most troubling, Facebook never informed the 50 million users affected that their data had been illicitly shared.


In the days since the news broke, Facebook’s stock has fallen precipitously, wiping tens of billions of dollars from the company’s value. It is now clearly in stockholders’ interests, as well as everyone else’s, that the company take swift and significant action to combat the spread of fake news and make good on its promise that “protecting people’s information is at the heart of everything we do.”

Yet early signs are not good. On Tuesday Facebook announced that Alex Stamos, the company’s chief information security officer, who has advocated for more disclosure around Russian interference in the U.S. election and changes to the platform to guard against future meddling, would be leaving the company.

Clearly Facebook will respond to mounting political and financial pressures. But whatever voluntary actions the company does take should not be viewed by governments, including our own, as a substitute for democratic regulation.


Models for regulation are beginning to emerge. The European Union recently passed a law mandating that private companies disclose data breaches within 72 hours and attaching stiff fines to violations of its updated privacy law. In the U.S., a bill is making its way through Congress that would force the same level of transparency around political advertising online, including on Facebook, as in print or broadcast.

Canada’s privacy commissioner and chief electoral officer have both called for similar measures at home, pointing out that our privacy and elections laws were written for a pre-digital world. We should not wait for a homegrown election-meddling scandal to act.

Of course regulating the internet is challenging and fraught with risk. Its openness must be protected. But we have seen where the libertarian ethos of Silicon Valley, which dogmatically opposes any attempt to regulate, takes us: troll farms and the rousing of “inner demons” to distort democracy. Online, as everywhere else, the idea of regulation cannot be reduced to an infringement on freedom, but is also an essential means to protect rights and democracy.

Correction - March 21, 2018: This article was edited from a previous version that misstated Chris Wylie’s given name.
