The company confirmed that it had blacklisted posts by ARSA in a message to Myanmar government official Zaw Htay, who posted a screenshot of the exchange to his Facebook page. The statement says that "dangerous organizations are not allowed to use our services and we also remove content that supports or praises such groups." It's believed that the site has also suppressed images and video of Rohingya Muslims being tortured and killed.

Rohingya Muslims have a contentious status in Myanmar, where they are considered a stateless minority in the mostly Buddhist country. They face deep persecution, which prompted a Rohingya militia group to attack police posts in late August, killing 12. Myanmar's government retaliated with a program of violence that Zeid Ra'ad Al Hussein, UN High Commissioner for Human Rights, describes as "clearly disproportionate." More than 300,000 Rohingya have now fled their homes after their villages were burned, and fleeing civilians have allegedly been gunned down. Al Hussein describes Myanmar's actions as a "textbook example of ethnic cleansing."

In a statement, Facebook said that it is "only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action." Spokesperson Ruchika Budhraja added that the company is "reviewing content against our Community Standards, and, when alerted to errors, quickly resolving them and working to prevent them from happening again." The company has also affirmed that it chose to mark ARSA as a dangerous organization without the intervention of Myanmar's government.

Subsequently, a Facebook spokesperson sent Engadget the following statement:

"We allow people to use Facebook to challenge ideas and raise awareness about important issues, but we will remove content that violates our Community Standards. These include hate speech, fake accounts, and dangerous organizations. Anyone can report content to us if they think it violates our standards and it doesn't matter how many times a piece of content is reported, it will be treated the same. Sometimes we will allow content if newsworthy, significant or important to the public interest - even if it might otherwise violate our standards. In response to the situation in Myanmar, we are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action. We are carefully reviewing content against our Community Standards and, when alerted to errors quickly resolving them and working to prevent them from happening again."

Facebook has often struggled to balance its desire not to become a hotbed for graphic content with political and cultural sensibilities. The site came under fire in 2016 for blocking a Pulitzer Prize-winning photo from the Vietnam War that featured a naked child fleeing from soldiers. The image breached the site's rules on underage nudity, but was reinstated following an international outcry. At the time, the site explained that its rules were evolving and that, while mistakes would be made, it would always work to improve.

Unfortunately, the role of monolithic media broadcaster with the attention of a billion people comes with responsibilities. Facebook has often publicly rebutted the idea that it holds any such power, but it's clear that the issue is coming to a head, and fast.