Facebook CEO Mark Zuckerberg departs after testifying before a House Energy and Commerce hearing on Capitol Hill in Washington, Wednesday, April 11, 2018, about the use of Facebook data to target American voters in the 2016 election and data privacy. Photo: Andrew Harnik (AP)

I think we can all safely agree that last year was not great for Facebook. User trust plummeted to record lows as the company faced scandal after scandal. At this point, there are countless reasons users can and even should be wary of Facebook—and yet.


On Thursday, the Wall Street Journal published a 1,000-word screed by Mark Zuckerberg about the company’s data collecting practices titled “The Facts About Facebook.” In it, Zuckerberg makes noise about the company being about “people,” and insists—as he has for the majority of his company’s 15-year history—that we should trust it. Zuckerberg appears to think the primary reason users have little faith in the company’s ability to responsibly or ethically handle their data is its targeted advertising practices, about which he writes: “This model can feel opaque, and we’re all distrustful of systems we don’t understand.” He continues:

Sometimes this means people assume we do things that we don’t do. For example, we don’t sell people’s data, even though it’s often reported that we do. In fact, selling people’s information to advertisers would be counter to our business interests, because it would reduce the unique value of our service to advertisers. We have a strong incentive to protect people’s information from being accessed by anyone else.


So, sure. Let’s start with the ads.



Earlier this month, a Pew Research Center survey found that users do indeed remain largely in the dark about how Facebook tracks their information in order to feed them relevant ads (from which it makes heaping piles of money). Of the nearly 1,000 U.S. adults polled for the survey, some 74 percent of those who use Facebook said they had no idea about the site’s “ad preferences” section where activity-based “interests” appear. Fifty-one percent of users said they were “not very or not at all comfortable” with Facebook amassing this information about them.

This data shows that the company has a lot of work to do when it comes to transparency. But additional data indicates that, in fact, the more we know about how Facebook works, the less trustworthy it becomes.

Annual surveys from the Ponemon Institute show that user trust in the social media giant tumbled significantly in the wake of the Cambridge Analytica incident, when it emerged that Facebook had known the research firm obtained the personal data of tens of millions of Facebook users and mostly did nothing. Reporting on the survey in April, the Financial Times said that user trust in Facebook had actually been on the rise before the scandal, but that user confidence in the company to protect their information fell from nearly 80 percent in 2017 to 27 percent last year. And that was toward the beginning of the year—then the rest of it happened.


In 2018, we learned that Facebook was sharing data with companies like Microsoft’s Bing, Spotify, and Netflix in exchange for more information about its users. There were also the revelations that the Cambridge Analytica data-scraping was worse than we thought; that Facebook was sharing shadow contact information with advertisers; and that turning off Facebook location-sharing doesn’t stop it from tracking you. That’s obviously totally aside from the George Soros conspiracy theory fiasco; its mishandling of the genocide in Myanmar; and its standing as a hotbed for rampant misinformation.

As with his year-end Facebook post—which I’ll note here also largely ignored the tsunami of public relations problems the company faced last year—Zuckerberg appears to remain bafflingly optimistic about the function of his company. To be clear, this is the same founder of Facebook who once called users of his product “dumb fucks” for trusting him with their sensitive information.


If users don’t trust Facebook, it’s not because they don’t understand it. It’s because they do.

[Wall Street Journal]