Sometimes, being wrong on the Internet means having to say you’re sorry.

And by now, Facebook is very, very good at saying sorry.

Facebook offered up an apology to its users on Sunday, after it came to light that the company had manipulated the news feeds of more than half a million people so it could change the number of positive and negative posts that appear from their friends. Facebook’s in-house data science team carried out the project, it said, as a way to examine the “emotional impact of Facebook” on its users. Along with two university researchers, the team published the results of the study in an academic journal.

Furor over the project broke out on the Internet over the course of a few days. Adam Kramer, the data scientist in charge of the effort, posted a lengthy apology to his Facebook page shortly thereafter.

But this is hardly the first time Facebook has apologized for its behavior. Over its 10-year history, the company has repeatedly pushed its users to share more information, then publicly conceded it overstepped if an upset public pushed back.

Take, for example, when Facebook first introduced the news feed to the public in 2006. It was the first time a running stream of the actions users took on the site was visible to their friends. Users were alarmed, and Mark Zuckerberg, Facebook’s chief executive, took to his profile page to personally apologize.

“We really messed this one up,” he wrote. The company introduced a new set of privacy controls to go with Mr. Zuckerberg’s apology.

Little more than a year later, Facebook was at it again. The company introduced a new product, Beacon, that, when connected to partner web sites like eBay or Fandango, would publish actions taken on those third-party sites back to Facebook for friends to see. Some Facebook users said this violated their privacy, and were irate enough to eventually file a class-action lawsuit.

Again, Mr. Zuckerberg was sorry.

“We simply did a bad job with this release, and I apologize for it,” he wrote on his personal Facebook page. Facebook introduced a way to opt out of Beacon soon after, and eventually dropped the service entirely.

Then in 2009, Facebook changed its privacy settings for users, in what the company characterized as an effort to simplify a set of complicated controls. Some digital rights advocacy groups, however, claimed that the simpler controls tacitly pushed users to share even more information about themselves than before. Users were forced to share their information, for instance, with apps connected to their Facebook accounts.

Six months and many complaints later, Mr. Zuckerberg said he was sorry (sort of) — this time on the editorial page of The Washington Post.

“Sometimes we move too fast,” Mr. Zuckerberg wrote. “We just missed the mark.” Facebook introduced another set of privacy changes to remedy the older, unpopular set.

One of the company’s biggest concessions came in 2011, in the form of a settlement with the Federal Trade Commission, after the agency said Facebook had deceived consumers on its privacy practices.

“I’m the first to admit that we’ve made a bunch of mistakes,” Mr. Zuckerberg wrote, while also noting a batch of privacy “improvements” Facebook had introduced over a period of two years. “We can also always do better.”

Other technology companies also like to seek forgiveness after their errors. In 2012, Tim Cook, Apple’s chief executive, wrote a public apology after the company’s newly introduced Maps product was largely panned by consumers. Google, too, is familiar with issuing apologies, not least its admission that it had inadvertently collected personal user information from unlocked wireless networks.

But if, or when, Facebook, which declined to comment for this post, has to publicly apologize again, perhaps the company should first make sure that pushing the limits of what people are comfortable with is worth it.

“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Mr. Kramer, Facebook’s data scientist, wrote.