The furor surrounding Facebook's decision to conduct an experiment that secretly manipulated the News Feed of some users to study emotion contagion reached a peak this weekend, with many calling the act creepy at best, and downright unethical at worst.

Although the editor of the study recently admitted to being "a little creeped out" by the way in which the study was conducted, Facebook itself had not offered any detailed comment on the matter — until now.

In a public post on Facebook, one of the co-authors of the study, Adam D. I. Kramer, a member of Facebook's Core Data Science Team, finally responded to the study's critics.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," wrote Kramer. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper."

But while Kramer's explanation of the company's reasoning will be a welcome clarification for some, the issue of most concern to the study's critics remains: involving users in a psychological experiment without their consent (lengthy and sometimes vague Terms of Service agreements aside).

After summarizing the study's methodology and emphasizing that "Nobody's posts were 'hidden,' they just didn't show up on some loads of Feed," Kramer wades, in indirect fashion, into the delicate territory of how Facebook views user experiments on the site.

"[A]t the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it…" That's about as close as Kramer comes to directly acknowledging that Facebook covertly manipulated its users for an experiment.

Later in the statement, he does offer a bit of contrition, writing, "[O]ur goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused."

However, given the vague wording of the statement, it's unclear exactly what Kramer "understands" with regard to user concerns and anxieties. Similarly, rather than directly address the widely voiced concern that users were enrolled in an experiment without their knowledge, Kramer apologizes only for the paper's "description" of the experiment.

Reactions to the statement were swift, with commenters on the post almost evenly divided between those supporting Facebook and those who remain troubled by the company's actions.

Facebook study author Adam "Tad Disingenuous" Kramer: "Nobody's posts were "hidden," they just didn't show up on some...Feeds." viz. hidden. — David Auerbach (@AuerbachKeller) June 29, 2014

The comments section has also opened a direct line to Kramer, giving one notable commenter, NYU journalism professor Jay Rosen, the opportunity to question him directly about the reported involvement of the Army Research Office as one of the study's backers.

"Thanks for the explanation, Adam," wrote Rosen. "Could you also explain — in your words, I mean — what interest the United States military had in the research?"

Facebook did not immediately respond to a request from Mashable for comment on the military's reported involvement in the study or the company's stance with regard to user concerns about being unknowing participants in a psychological experiment.

Altering the tenor and mood of your social circle—of your friends—to manipulate emotions is *really different* from A/B testing page design. — Erin Kissane (@kissane) June 28, 2014

It will be difficult, in the immediate aftermath, to assess the brand damage and loss of trust this episode has caused Facebook. But if the voices on Twitter are any indication, for some users the news simply confirmed their biggest fears about how the site treats the people who use it.

But what is clear is that Facebook, or at least the team behind the study, isn't happy with the negative reactions to the experiment.

"In hindsight," wrote Kramer, "the research benefits of the paper may not have justified all of this anxiety."