Although there’s no shortage of damning reports about Facebook, one aspect of the platform seems rarely mentioned: Our favorite organizations—nonprofits, businesses, advocacy groups, and others—continue to actively support Facebook by pumping ad money into it. It’s not just Trump, Beto, political action committees, and big marketing agencies that sustain Zuckerberg’s creation. The platform’s main lifeline, its ad revenue, depends on the organizations we support every day and their social media experts, who essentially act as liaisons between Facebook and millions of marketing departments around the world.

As more young people, investors, and concerned citizens leave Facebook, it makes sense that the corporate marketing industry is sustaining its myth—especially considering that Facebook’s core development strategy is based on embedding its services into a diverse set of marketing applications. Now we’re learning, through extensive interviews with Facebook’s former and current staff, how the platform is attempting to monopolize the social media space and attack its critics and competitors through lobbying, opposition research, and other D.C.-based tools of the trade.

These practices might make more sense if Facebook didn’t have a bummer product. The company’s own studies suggest that using the platform is bad for users’ mental health, and key engineers, investors, and its ex-president have admitted Facebook is psychologically manipulative. But organizations continue to pay social media staffers to create and share content on the platform.

The main reason we can’t seem to break up with Facebook, besides our low-key addiction to it, is that it is still perceived as the premier digital communications hub, partly due to its ability to schmooze stakeholders as high-ranking as Democratic leader Chuck Schumer.

But the kids are catching on to Facebook’s PR tricks and apology tours. (Although young people’s increased use of Instagram, a platform owned by Facebook, doesn’t really help the issue.) Facebook is old news to the younger generation. It’s where older folks get high blood pressure.

According to a 2018 Pew Research Center poll, “around four-in-ten (42 percent) of those 18 and older say they have taken a break from checking the platform for a period of several weeks or more, while around a quarter (26 percent) say they have deleted the Facebook app from their cellphone.”

Pew found age differences in the share of Facebook users who have recently taken some of these actions. “Most notably, 44 percent of younger users (those ages 18 to 29) say they have deleted the Facebook app from their phone in the past year, nearly four times the share of users ages 65 and older (12 percent) who have done so.”

With such a steep decline in use, it’s easy to imagine a world without the platform. Yes, there will always be people willing to serve Facebook’s interests, and there will always be unexplored markets abroad where Facebook can make up its losses. However, the recent mass exodus from the platform signifies that, unless it changes its profit model, Facebook’s days as a respected social media platform might be numbered.

But Facebook won’t just disappear quietly into the night. Social media gurus will keep churning out and boosting Facebook posts, despite the fact they’re spending time in a psychologically manipulative environment — and benefiting from others who spend time there as well. In many ways, these experts are Facebook’s last line of defense—without their cooperation, focus, trust, and money, there’s no future for the platform.

When Facebook Called Me

About a month ago, I received a call from a Facebook representative trying to get me to invest in Facebook ads. The rep reached out to me because I had previously used those services.

The rep was excited to sell me Facebook’s “custom audience” feature, which allows companies to target “lookalike” audiences—Facebook users who share attributes with people who already engage with certain content.

“When you create a Lookalike Audience, you choose a source audience (a Custom Audience created with your pixel data, your mobile app data or fans of your Page) and we identify the common qualities of the people in it (ex: demographic information or interests),” states a Facebook tutorial on Lookalike Audiences. “Then we find people who are similar to (or ‘look like’) them.”

The key phrase here is “source audience”: data that can be imported to Facebook in order to target similar users. After the second or third call, the Facebook sales rep stopped calling me, perhaps sensing I wasn’t willing to spend money on reaching “lookalike” audiences.

What I was being offered was basically what Brad Parscale, digital media director for Trump’s 2016 campaign, used to target people on Facebook.

“Facebook’s advertising tool called Custom Audiences allowed the campaign to microtarget the people who were most likely to show up to vote,” Parscale said in an interview for PBS. In the interview, Parscale explained how he used Facebook to target his desired audience:

When you decide you’re going to run for president of the United States, now you have hard-matched data with consumer data, matched with voter history, matched with very comprehensive polling data from all over the country. When you do that, you put that into a machine and then you start to learn… how people react in areas, people, and individualize what they call hard ID’d. Hard IDs are then matched with phone numbers, email addresses and everything. By the time all those pieces are put together, then you can actually pull out an audience. You can say I want to find everybody in this portion of Ohio that believes that the wall needs to be built, that thinks that possibly trade reform needs to happen, and so we want to show them a job on trade and immigration. Then you take that, you export it out into Excel file—very simple. You import it into Facebook with PI in it, which is personal information that’s matchable data—you know, addresses, phone numbers, whatever you have—and Facebook has been in the job of scraping that all from you, and then it just matches them. Then you have a little button on your computer that says that audience, and you can use that for all your ads.
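The matching step Parscale describes—exporting a spreadsheet of personal information and letting Facebook match it against user accounts—relies on hashed identifiers: Facebook’s Custom Audience upload expects PII fields like emails and phone numbers to be normalized and SHA-256 hashed before matching. The sketch below is a rough, hypothetical illustration of that preparation step (the column names and the exact normalization rules are assumptions, not Facebook’s actual specification):

```python
import csv
import hashlib


def normalize_and_hash(value: str) -> str:
    """Lowercase and trim a PII field, then SHA-256 hash it.

    Custom Audience uploads match on hashed identifiers in roughly
    this form; the precise normalization rules vary by field type.
    """
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()


def build_audience_rows(path: str):
    """Read a voter-file-style CSV (hypothetical columns 'email' and
    'phone') and yield hashed rows ready for an audience upload."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {
                "email_hash": normalize_and_hash(row["email"]),
                "phone_hash": normalize_and_hash(row["phone"]),
            }


# A single record, hashed the same way regardless of capitalization
# or stray whitespace in the source file:
print(normalize_and_hash("  Jane.Doe@Example.com "))
```

The point of the hashing is deduplication and matching, not privacy in any strong sense: anyone holding the same email address can compute the same hash, which is exactly what makes the “hard ID” matching Parscale describes possible.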

One would think that the most obvious solution to this problem would be to address Facebook’s targeting mechanism. Unfortunately, Facebook’s fix to prevent “Russians” from doing what Trump’s team did—announced two years after the 2016 elections—was to label political and social ads as “paid for,” a measure that hardly gets at the heart of the issue. A couple of months after Facebook announced the change, VICE reporters were still able to place “Russian” ads on Facebook by posing as U.S. senators, Mike Pence, and ISIS.

Facebook profits from providing the highest bidder with a shortcut to its users’ minds. Unless it addresses this mechanism, its users will continue to be targeted by anyone with a basic knowledge of the platform.

I understand why someone in Parscale’s position would be excited about reaching millions of people—be it to sell fascism-light to the American public or to simply move up the ladder. After all, more reactions and engagement on Facebook means that digital media employees are doing their jobs well.

At the same time, it’s clear that the physical, social, and mental damage done through Facebook can far outweigh its perceived benefits. Forcing organizations to compete in a pay-to-play environment by sneaking ads in-between public and private communication only benefits the oligarchs and multinational companies that can afford to stand above the rest. This is the definition of elitism.

But it doesn’t seem like Zuckerberg is reconsidering his company’s exploitative profit model. Quite the opposite—in the past couple of years, the reach of organic (meaning non-paid) marketing content has plummeted in favor of more personal news from friends and family. While Facebook’s justification for this change sounds altruistic—“marketing should take a backseat to content by friends and family”—the tilt toward more “personal news” has made it difficult for smaller companies and nonprofits to organically reach their followers.

Without ad money, small organizations are throwing messages in bottles and hoping Facebook’s mysterious algorithm will carry them beyond five likes, even if they already have thousands of followers. I know this because I’ve been on both sides of the coin. I’ve worked for organizations that can’t afford to invest in Facebook and for organizations that can dedicate cash to push content further. While putting money behind content does work, the cost of sustaining an elitist platform that perpetuates unethical behavior and manipulates its numbers doesn’t seem worth it.

Structural Issues and “Russiagate”

As users abandon it, the platform’s structural and ethical issues keep piling up. Just recently, Facebook was in the news for exposing its users’ photos (even ones that weren’t posted), buying back $9 billion of its own shares (the stock tumbled more than 22 percent in 2018), and working with right-wing opposition researchers to smear George Soros. As I write this, news is breaking that Facebook provided Netflix and Spotify with access to users’ private messages.

But the narrative that seems to appear most in mainstream media is how Russia used Facebook to interfere in the 2016 U.S. election. Of course, if political commentators presented “Russian interference” in the context of Facebook’s inherently manipulative services, it would become clear that it’s not just Russia interfering in U.S. elections. Anyone with enough money and power (Trump, the Mercer family, etc.) can hijack Facebook’s services for private gain.

While commentators continue to focus on troll farms in Russia and North Macedonia and insinuate they influenced the 2016 U.S. elections—by allegedly suppressing the African-American vote, promoting Green Party candidate Jill Stein, and recruiting “assets”—available data and studies presented by Aaron Maté at The Nation magazine suggest Russian-related efforts during the 2016 U.S. elections were “microscopic in reach, engagement, and spending; and juvenile or absurd in its content.”

We should be wary of efforts to present Facebook as compromised by Russia but otherwise good overall. Those who bifurcate Facebook’s structural issues from its alleged use by the Russians are simply cherry-picking data to bolster their own political positions. Simply put, manipulation is built into the platform: Facebook actively enforces arbitrary political speech rules through thousands of contracted moderators. All in all, “Russian” masturbation-shaming memes are just the tip of the iceberg.

The Medium Is the Message

Facebook’s dilemma validates Marshall McLuhan’s popular sentiment: “The medium is the message.” The medium, Facebook, has literally embedded its ad services into its messaging mechanism (newsfeeds, FB messenger, etc.), which has influenced how those messages are perceived by its users and the public.

All that was needed to popularize Facebook’s Black Mirror-ish qualities was a big enough example of how destructive it can be. And that example came through the bravery of Christopher Wylie, a Canadian whistleblower and ex-employee of the now defunct Cambridge Analytica, who provided documents to The Guardian that described the secret workings of Cambridge Analytica’s relationship with Facebook.

The documents revealed that Cambridge Analytica, a company partly funded by the Mercer family, improperly used Facebook data to build tools that aided Trump’s 2016 campaign.

“I didn’t set out to attack Facebook. Facebook has just been incredibly uncooperative,” Wylie said in one interview. “It hasn’t respected the role of the media and scrutiny and embraced this scrutiny and worked to improve itself.” Wylie’s revelations jump-started a number of official investigations, which continue to erode public trust in the platform.

Any talk of Facebook’s dirty laundry has to address its profit scheme. It made $39.9 billion in ad revenue in 2017—more than every other tech company that’s not Google. People need to start asking why our favorite organizations—Russian, American, and otherwise—are still pumping money into Facebook to reach their audience.

Facebook execs might scoff at the idea of people leaving the platform and the #DeleteFacebook movement, but they’d hear the message loud and clear if people started questioning why their favorite organizations are still investing time and money in an unethical, unhealthy platform.

The mightiest allies in this endeavor are social media gurus who understand Facebook from the inside out and are willing to promote ways to escape its grip on society. Now is the time for those who understand the exploitative practices of social media—as well as the importance of the current political moment—to organize and share knowledge about how the “sausage is made.”

An organized and persistent critique from its most knowledgeable users, combined with support from a rainbow coalition of concerned activists, could nix what little credibility Facebook has left and remove its remaining hold on our lives for good.