Countless commentators are declaring our honeymoon with Silicon Valley officially over. From Russian meddling to deepfakes, fake news and beyond, there’s growing consensus that the internet is broken, and lawmakers must fix it.

In particular, there’s increasing concern that online platforms, such as Facebook, aren’t doing enough to curb harassment and disinformation online. In response, many players are rushing to fill the policy vacuum — particularly as the government reviews the Broadcasting and Telecommunications Acts, which will shape how Canadian internet users and companies create and communicate online.

This month, the Public Policy Forum published a new report, Democracy Divided, in which a number of sweeping proposals were floated to address what the authors call “online news threats to democracy.”

The report’s authors had direct access to cabinet during last week’s retreat. Given the Trudeau government’s persistent concerns about online misinformation and hate speech, it’s important that ministers understand how one of the report’s key proposals would fundamentally change the internet.

While there are a few good ideas in the report — like independent audits of algorithmic code, and beefed-up advertising transparency — there’s an unworkable underlying assumption that must be addressed.

Democracy Divided treats online platforms as the 21st century equivalent of traditional media outlets — such as TV broadcasters and news publishers — and wants to regulate them in the same way, by requiring that internet companies be “legally liable for content appearing in their domains.” But this approach is wrong-headed and could lead us down a dangerous path.

It’s tempting to compare platforms to media companies. After all, social media sites have large audiences; they don’t fit neatly into the category of “common carriers,” such as phone companies, which must treat all calls the same; and sites like Twitter and Facebook are always curating the content a specific user sees through algorithms, so the idea of a “neutral” platform is a myth.

But this does not make these companies equivalent to traditional media. Newspapers hire professional editors whose role it is to vet every piece of content they publish: every headline, op-ed and article. Even comments are often approved and moderated. This is substantially different from a site like Facebook, where content is allowed by default and only taken down if found to be in violation of its terms of service.

Currently, online platforms are not legally liable for actions their users take (with a few exceptions). This is what gives us the freedom to create and share things without having to ask for permission first. If the legal obligations of traditional media were imposed on platforms, these companies would work hard to avoid being sued into oblivion over user-generated content.

To avoid this risk, platforms would need thousands of people or, worse yet, an algorithm reviewing the billions of pieces of content on Facebook before they were posted to determine their legality, newsworthiness, and whether they should be published.

To minimize liability, platforms would, at minimum, proactively remove anything potentially contentious, and, at worst, shut down users’ ability to post content entirely — dismantling thriving online discussions. In a world where platforms were “legally liable for content appearing in their domains,” no company could afford to defend our rights to post political commentary, health and sexual education, satire, and more.

A “more of the same” approach is not what’s needed: new technologies require new ways of thinking. Experts and governments elsewhere are already looking to other areas of the law, such as shipping, public utilities, and banking, to create new ways of governing platforms that balance the public’s right to share information, with legitimate concerns about the spread of fake news and other harmful content.


Disinformation and online harassment are real problems that urgently need solutions. But less freedom to share, create, and discuss is not the answer.

Instead, the Trudeau government must put democracy squarely into focus and look for solutions that actually create higher information quality — not ones that force a 21st century technology into a 20th century box.

Meghan Sali (@megasali) is a law student at the University of Ottawa, and a former Free Expression advocate at OpenMedia. Josh Tabish (@jdtabish) is a civic engagement and digital policy expert. He is currently a Technology Exchange Fellow with Fight for the Future.
