Facebook’s recent announcement should scare anyone who values independent and nonprofit media.

In response to mounting criticism over its role in spreading “fake news”, as well as research showing that social media has been making people less happy, Mark Zuckerberg has revealed that the company is overhauling its news feed algorithms to de-emphasize “passive content” from brands and publishers and promote “meaningful interaction” with friends and family instead. Facebook says it wants its users to feel positive after using its service, and will display content accordingly. So more pictures of adorable dogs, fewer links to news sites.

Everyone can support efforts to eliminate fake news and improve user experiences. But for media organizations, Facebook’s approach is troubling. Many outlets depend on traffic from platforms like Facebook and Twitter, since social media has become the main gateway through which people access content on the internet. If news organizations are “blacklisted”, or buried at the bottom of news feeds, they could see a significant chunk of their audience evaporate overnight.

There is already evidence of what the changes might do. The rollout of algorithm tweaks in other countries has seriously hurt certain organizations: at least one subscription news service lost nearly one-third of its Facebook engagement after the change, and Cambodian NGOs complained that the delivery of public service information was disrupted.

Similar changes by other tech companies have also financially damaged small content providers. Last year, when YouTube adjusted its algorithms, ostensibly to combat hate speech, a progressive political commentary show suddenly saw its revenue plummet and had to launch a crowdfunding campaign to survive. Changes made by Google News may have caused some sites to experience drops in traffic as high as 70%.

For independent and nonprofit outlets, these kinds of shifts can be an existential threat. As the editor of a small nonprofit political magazine that depends on social media for its audience, I’m worried about the future of my publication and my livelihood.

In the two years of our magazine’s life, we have managed to build a sizable readership and become financially sustainable, almost entirely thanks to the sharing of our content on Facebook and Twitter. We make our money from print subscriptions, but with newsstands a thing of the past, the only way new potential subscribers find our site is when our articles show up in their feeds.

If our content stops appearing, people will stop visiting our site, and revenue will collapse. Even small drops in traffic can be significant when you’re a tiny organization paying your expenses month to month.

Some have praised the changes, citing the urgent necessity of combating “fake news”. The former New Republic editor Franklin Foer, a strong critic of Silicon Valley, said that while media organizations would feel a “sting”, Zuckerberg’s decision was “for the best”.

I understand why fear of misinformation and Russian propaganda might lead some to want to return to a world of cute animals and vacation photos. But reducing people’s exposure to journalism is a drastic solution to the fake news problem, and promoting the posts of “friends and family” over “media” can even worsen the situation by prioritizing rumors over genuine news.

By now, it’s a commonplace to say that monopolistic tech corporations like Facebook and Google have amassed too much power. It’s important to realize, though, what this power actually means: these companies hold the fate of media organizations in their hands. Mark Zuckerberg’s choices have serious financial ramifications for thousands of content providers. Yet this power is entirely unregulated and unaccountable.

We don’t yet know how serious the consequences of Facebook’s news feed changes will be, because these decisions are made without any transparency. That lack of disclosure is a problem in itself. Facebook keeps its algorithms completely secret, and has been criticized for refusing to explain how it determines what information we see. When one company has so much control over the information that reaches us, we ought to be able to know how that information is sorted and selected.