Earlier this week, an administrator of a private Facebook group called “Winchester, MA Residents” received a notification from Facebook that a comment made in the group had been removed.

The comment was made beneath a controversial post about a local high school not reciting the Pledge of Allegiance. What was unusual was that the comment in question neither incited violence nor harassed anyone; in fact, it seemed quite measured in tone.

“Yeah that’s an unfortunate conflation of government and religion,” the commenter wrote. “I’m in favor of removing all references to god from all governmental documents and instruments, including our legal tender.”

In the notification to the group administrator, Facebook said only that the comment had been removed because it didn’t “follow the Facebook Community Standards.”

The "Winchester, MA Residents” group has only 3,784 members, but a post like the one that was removed this week would likely have come to the attention of Facebook’s moderators if another user reported it to be harassing.

Still, Facebook says in its Community Standards that real humans assess whether a reported post merits removal. “The number of reports does not impact whether something will be removed,” the company writes. “We never remove content simply because it has been reported a number of times.”

In an e-mail to Ars this afternoon, a Facebook spokesperson said, “As our teams review millions of reports a week, they occasionally make a mistake. This content was removed in error, and we apologize for any inconvenience.” As of this afternoon, the comment has not been reinstated.

When Facebook removes a page or a profile, its notice to the owner includes a link to appeal the removal. But it’s not clear that the removal of individual posts, comments, or images can be easily contested unless the content was erroneously flagged as a copyright violation.

Facebook has taken flak in recent years for its failure to curb genuinely abusive and harassing behavior while simultaneously pearl-clutching over images of women’s naked breasts (including breastfeeding, for which Facebook only made an exception in 2014). Recently, a German photographer posted a series of portraits demonstrating that Facebook will remove a photo of a naked woman faster than it will remove photos containing racial slurs or other hate speech directed at refugees and immigrants.

The confusion over what does and doesn’t deserve moderation deepened on Friday, when the BBC reported that Facebook has repeatedly refused a parent’s request to remove an image, posted by the far-right UK group Britain First, showing her 13-year-old daughter and another girl, aged 12. The mother of the 13-year-old told the BBC that the girls did not know who the men who took the photo were, nor that the men would use it for Britain First propaganda. Facebook, however, apparently told the mother that it could not remove the post because her child was between 13 and 17 years old. The parents of the 12-year-old do not use Facebook, so they have not reported the photo themselves.

In all of these cases, Facebook hedges considerably against any obligation to remove a post. “Because of the diversity of our global community,” its Community Standards read, “please keep in mind that something that may be disagreeable or disturbing to you may not violate our Community Standards.” It seems, however, that this guideline does not always align with the first reactions of Facebook’s moderators.