Back in late November, the New York Times revealed that Facebook had paid a corporate PR firm called Definers Public Affairs to develop and peddle a smear campaign aimed at some of its Open Society Foundations-funded critics, including members of the Freedom From Facebook coalition.

In response, we asked three basic questions of Facebook, all aimed at the same issue: what did Facebook do with the smear campaign information on the Facebook platform itself? Did Facebook promote the smears on its platform? Did Facebook develop different versions to target different audiences, including Congressional staffers and other influencers, as it does for key advertising customers? And, most important, what is the boundary between Facebook’s own policy interests and the operation of the platform?

Just before the holiday break, Facebook answered our questions in a telephone call with two of its legal and communications staff. The short answer: Facebook asserts that it did not help promote Definers' messages on its own platforms. Facebook said it does not allow its own policy work to be promoted on its platforms (for example, through the ads you see or the posts that show up in your News Feed) without clear and unequivocal notice to its users.

This is good as far as it goes. But Facebook must do much more if it wants to regain any of the trust it lost from this episode, especially given the dangerous waters that it chose to swim in.
First, while Facebook said this time that it did not use its platform to promote its own policy positions, Facebook needs to state publicly that it will not use its own platform to, for example, secretly further attacks against critics. This should take the form of a clear, written, publicly available policy that Facebook will not use the Facebook platform for its own policy purposes without clear notice. Facebook is of course entitled to take policy positions and even to use its platform to promote them, but it must be crystal clear when it’s doing so.

Facebook has publicly said it was reviewing its policies and procedures concerning its communications work, including with external firms, and that this process is being led by Nick Clegg. A rule ensuring strong separation and transparency requirements—with serious consequences for violations—should be part of that process.

Second, Facebook still needs to find out what actually happened with this information. We know that Facebook’s algorithms often promote controversial and divisive content, and that both Definers and its affiliate the NTK Network have Facebook presences. Facebook might not have needed to intentionally promote this material for it to have circulated widely on Facebook. Facebook needs to investigate how and where this content spread on the platform, and tell its user base.

Finally, Facebook must take steps to ensure that it does not participate in efforts to undermine civil society groups around the world. It is certainly reasonable for Facebook to do research into its political opponents, including the Freedom from Facebook coalition. But Facebook went over the line when it tried to push the story that these groups' funding from George Soros meant they were not really grassroots, and that, through the philanthropies he funds, Soros might have been engaged in financial manipulation aimed at Facebook's stock price.

At best, this betrays a fundamental misunderstanding at Facebook about how nonprofit funding and philanthropy work. More likely, since we know that folks at Facebook know better than this, this was a cynical play into larger, ugly efforts to undermine nonprofit advocacy and the role of civil society in public debate.