Brooke Binkowski

Opinion contributor

If Facebook wants to show us they’re opposed to hate speech, banning individual trolls is about as effective as a Band-Aid on a sucking chest wound — painful, messy, and worse than useless.

In 2016, when I was managing editor at fact-checking site Snopes.com, we agreed to partner with Facebook to stop a major problem that was rapidly approaching crisis mode. This was back when disinformation was still called “fake news,” and no one yet had any idea of the scope or the power of what they were seeing on social media.

I had some trepidation, but I thought it might be a good-faith effort to stop stories that were using fearmongering and lies to corrode not just policy but the most intimate social relationships. I thought we had a chance of changing the system from within.

Instead, I discovered Facebook’s role in a genocide in the western Myanmar state of Rakhine — which, at the time, the company appeared to downplay. I saw this not just as a massive human rights violation on its own, but also a terrible warning for what was in store for the rest of the world.

Facebook's social media model drives chaos

The model for what happened in Myanmar was simple: Facebook struck deals with local mobile phone companies to exempt its platform from data charges. Many people in the country then got their news stories directly from Facebook, making it fertile ground for algorithmic experimentation on individual and crowd behavior. "Burma is experiencing an ugly renaissance of genocidal propaganda," Matthew Smith, co-founder of human rights organization Fortify Rights, said in 2017. "And it spreads like wildfire on Facebook."

False stories about Myanmar’s Rohingya Muslims played on a decades-long pattern of discrimination against the ethnic group in the region. Rumors spread that they were rapists and thieves, and posts urged they be shot or exterminated.

The results have been devastating. These stories, many of which were pushed on Facebook by government officials, were used to justify driving hundreds of thousands of Rohingya from their homes, sparking a massive refugee crisis. Since then, Facebook has taken some steps to control the vitriol on its platform, but the efforts so far cannot undo the damage.

Facebook and other social media's true problem is a hellish combination of disinformation, an ever-weakening journalism industry, algorithmic clustering, and sophisticated dark advertising using psychographic research to bombard already-identified users with false or frightening imagery — all in the service of “engagement” revenue.

Which brings me to the most recent ban by Facebook of high-profile individuals that it says spread anti-Semitic content. While I applaud moderation of all corrosive content and other consequences for spreading hateful speech, banning a few people would not have been a solution even if the network had implemented it years ago.

You can already see that those same individuals are able to parlay this ban into charges of personal censorship that are ludicrous, reactionary — and incredibly effective in certain quarters.

Free speech doesn't force others into silence

What is particularly insidious is how this all relies on a fundamental misunderstanding of free speech. When someone speaks in a way that intimidates another individual or group into silence, then that speech ceases to be free. Social media hasn’t learned that yet.

That Facebook has taken any action speaks volumes about how far we have come. But if we don’t keep pushing, it will end here and we’ll all be stuck in a toxic soup of hoaxes and fake stories in a dystopian alternate universe created by algorithms. This will permanently destroy democracies around the world.

In the meantime, if Facebook (and Twitter, and YouTube, and others) truly wish to change for the better, here is what they must do.

Hire more moderators, ethicists and historians. Train them to be ruthless about pruning back disinformation and propaganda. Supporting corrosive disinformation that silences others is not supporting free speech. Make users opt-in to social media algorithms and show us exactly why we see the stories and posts that we see, and give us the power to adjust them.

Facebook must atone for its sins toward journalism. Even before its “pivot to video” metrics fraud — hugely destructive, whether it was intentional or not — Facebook devastated small news organizations that rely on ad revenue and goodwill to survive.

My suggestion: Stop paying fact-checkers directly, which has helped effectively politicize fact-checking. Instead, put money into an independent and transparent foundation to be distributed to newsrooms as annual grants. While $100,000 a year might not mean much to Facebook, that could be everything to a small-town newspaper.

And Facebook must advocate publicly for journalists jailed around the world for investigating human rights abuses justified or sparked by stories pushed on its platform.

These steps would go further to protect global free speech than anything else Facebook has done so far. To support a free press is to support free speech. To use free speech instead as an excuse for spreading false accusations about individuals and groups to make money from “engagement” — the social media version of the worst parts about television ratings and viewership — is Orwellian doublespeak.

This will require soul-searching and a large cultural shift in Silicon Valley. But humans aren’t faceless meat-sacks who exist solely as moneymakers for Big Tech. Every person deserves a basic modicum of human dignity, not rank exploitation.

If this isn’t enough to encourage social media platforms to live up to their tremendous moral obligation, I’ll pass along something I learned during decades of covering human rights issues in repressive regimes: Impunity never lasts.

Brooke Binkowski is a longtime journalist and managing editor of TruthOrFiction.com. Follow her on Twitter @brooklynmarie.