Less than a week ago, Facebook published an extraordinary statement unlike anything in its history. The company acknowledged for the first time that ordinary use of its product could be harmful. "The bad: In general, when people spend a lot of time passively consuming information — reading but not interacting with people — they report feeling worse afterward," wrote the authors, who work on Facebook's internal research team. The company added that more active use of social media, in which users trade messages and comments, "was linked to" improvements in well-being.

2017 was a bruising year for Facebook’s reputation

The post arrived unexpectedly, but it was a long time coming. 2017 was a bruising year for Facebook’s reputation. The closest comparison would be 2007, when the company faced a public backlash and advertiser revolt over its controversial Beacon ad tool. But Facebook was then at a fraction of its current size and power. As it reckoned with its immense responsibilities this year, the company was dealt another blow: a handful of high-profile former employees became vocal critics of what the company had created.

For most of their history, the big social networks have been dismissed as toys. Even as the cultural influence of Facebook and Twitter has grown — and you might throw in YouTube, Snapchat, and Facebook-owned Instagram as well — their core mechanics of posting, liking, and sharing have remained a background concern, despite the rise of eerily effective advertising tools.

That all began to change at the end of 2016, as the world witnessed the results of the US presidential election. Russia-linked groups allegedly exploited social platforms to inflame social divisions and promote the candidacy of Donald Trump, earning millions of impressions for paltry sums spent on advertising. Facebook, Twitter, and Google-owned YouTube have been under a microscope ever since — somewhat unfairly, current and former employees there have argued, since the conversation tends to conveniently ignore the effect of fake news and outright propaganda posted elsewhere. (The country seems particularly overdue for a referendum on the influence of our hysterical cable news networks, starting with Fox News.)

But the monomaniacal focus on social platforms' influence on the election had the effect of drawing out some of Facebook's earliest champions and builders. They criticized what they themselves had built.

They criticized what they themselves had built

The first was Justin Rosenstein, now the co-founder of Asana, a collaboration software company. Rosenstein helped lead development on Facebook's like button, but this year he complained about the psychological effects of social media, and the "bright dings of pseudo-pleasure" that came from friends liking his posts. "It is very common," Rosenstein told the Guardian, "for humans to develop things with the best of intentions and for them to have unintended, negative consequences."

Sean Parker, who was Facebook's first president, seemed to echo Rosenstein's comments at an Axios event last month where he called himself "something of a conscientious objector." "I don't know if I really understood the consequences of what I was saying, because [of] the unintended consequences of a network when it grows to a billion or 2 billion people and... it literally changes your relationship with society, with each other," Parker said. "It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains.”

Roger McNamee, an early Facebook investor, said the company is directly responsible for the misuse of its platform by Russians. "Facebook did not set out to increase political polarization and empower bad actors to undermine democracy, but this outcome was inevitable," he wrote in an October op-ed in USA Today. "It was the result of countless Facebook decisions, all made in pursuit of greater profits. In order to maximize its share of human attention, Facebook employed techniques designed to create an addiction to its platform."

The company avoided getting into a public fight with its defectors until this month. Then Chamath Palihapitiya, who once led Facebook’s user growth team, encouraged his audience at the Stanford Graduate School of Business to take "a hard break" from social media. “I think we have created tools that are ripping apart the social fabric of how society works,” he said. He added that he felt "tremendous guilt" over his time at the company, which made him enormously wealthy. (He then tried to walk back some of his comments the next day, leaving what he actually thinks a mystery.)

Facebook was finally moved to respond. It noted that Palihapitiya had not worked at the company in six years. "Facebook was a very different company back then and as we have grown we have realized how our responsibilities have grown too," it said. "We take our role very seriously and we are working hard to improve. We’ve done a lot of work and research with outside experts and academics to understand the effects of our service on well-being, and we’re using it to inform our product development."

He felt "tremendous guilt" over his time at the company

You could read this statement as an earnest, good-faith response to the company's defectors, or look at it as a deflection. Is Facebook ripping apart the social fabric, to use Palihapitiya's words, or isn't it?

It’s fair to ask why these former employees are speaking out only now, after they have reaped millions helping bring Facebook to a position of global dominance. To some, it feels self-serving. They have little to risk now in complaining about their former employer, but could stand to gain if public opinion turns against Facebook further. None have said what they would have done differently at Facebook, knowing what they know now. Even Palihapitiya said later that the company “overwhelmingly does good in the world.”

Still, the volleys of criticism all set the stage for the company's December 15th blog post, in which it tiptoed into waters previously reserved only for academics, journalists, and Facebook's critics. It laid out, in an admirably straightforward way, a number of studies that had shown News Feed consumption could make people feel worse about their own lives. It also presented research suggesting that Facebook could strengthen the bonds between friends and family, and make them feel better.

Almost as important as what the company's researchers said is what they only suggested: that Facebook itself cannot predict the effects it will have on us, either at the individual or societal level. This explains why it would respond to Palihapitiya not by attempting to refute his fears but instead by pledging to work with outside researchers and use that research to "inform our product development."

News Feed consumption could make people feel worse about their own lives

For all the trouble this year, Facebook remains a dominant company. It grew in revenue, profits, and number of users. But as a survey commissioned this year by The Verge illustrated, Facebook also has a trust problem.

Every company is an experiment. And yet even among tech giants, few experiments seem as emotionally laden as Facebook's. The company inserted itself between us and our friends — and between us and the news — and the ultimate result is anyone's guess. What you may have experienced in past years as a kind of vague unease about the company had acquired, by the end of 2017, a credibility and a shape.

It was bad enough for Facebook to be hauled before Congress and scolded for its inaction in the face of Russian meddling. ("Do something, or we will," Sen. Dianne Feinstein told the company's general counsel in November.) But it was worse when the people who brought Facebook to life embarked on a public campaign to distance themselves from it.