Today, a Facebook spokesperson reached out to Wired to reverse the company's previous stance on imagery that promoted violence toward women, stating that a photo it had previously deemed acceptable for the social networking site “should have been taken down when it was reported to us and we apologize for the mistake.”

Last week, Wired posted a story about an Icelandic woman named Thorlaug Agustsdottir, whose photo had been altered by a user on an anti-woman Facebook page to make it look as if she had been beaten, and posted with a caption that read: “Women are like grass, they need to be beaten/cut regularly.” Despite numerous reports by Agustsdottir and other users, Facebook determined that the image did not violate its Statement of Rights and Responsibilities, and previously told Wired when asked for comment:

“We take our Statement of Rights and Responsibilities very seriously and react quickly to remove reported content that violates our policies. In general, attempts at humor, even disgusting and distasteful ones, do not violate our policies. When real threats or statements of hate are made, however, we will remove them. We encourage people to report anything they feel violates our policies using the report links located throughout the site.”

The apology is a positive step for the social media site, which has taken flak in the past for the application of its Statement of Rights and Responsibilities, particularly in regard to content that promotes rape of and violence against women, which is considered acceptable if it can be deemed “humor.” This position stands in unfortunate contrast to Facebook's handling of other content, where the same laissez-faire attitude toward free speech often does not apply: political pages that discuss pornography (but contain none) have been removed, as have, even more ironically, non-sexual images of women breastfeeding their babies — if the feeding is not considered “actively engaged.”

While judging what is and isn't acceptable, obscene, bullying, or threatening can be murky territory, the inconsistent application of the rules forbidding such content hints at a lack of clear guidelines or litmus tests for Facebook moderators, who appear to be making their best guess at what falls inside or outside those categories.

Agustsdottir told Wired that despite the apology, she still wants to know “how they’re going to correct the problem and make sure that this isn’t going to happen again. I just want some clarification. These errors are going to manifest again if there isn’t clear enough policy.”