Report says social media giant failed to prevent its platform from being used to incite hatred and fuel violence.

Facebook said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent the social network from being used to incite violence.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” Alex Warofka, a Facebook product policy manager, wrote in a blog post.

“We agree that we can and should do more.”

The report by San Francisco-based Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups, and release additional data about its progress in the country.

Activists and human rights groups say Facebook has allowed people to use its platform to incite hatred and violence, particularly against minority groups such as the Rohingya.

In August, Facebook admitted it had been “too slow” to remove anti-Rohingya hate speech, and banned a number of users from the site.

The BSR report also said Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar’s 2020 elections, and new problems related to the growing use of its WhatsApp messaging service in Myanmar.

Correcting shortcomings

Facebook said it now has 99 Myanmar-language specialists reviewing potentially questionable content. In addition, it has expanded its use of automated tools to reduce the distribution of violent and dehumanising posts while they are reviewed.

In the third quarter, the company said it “took action” on about 64,000 pieces of content that violated its hate speech policies. About 63 percent of those were identified by automated software, up from 52 percent in the previous quarter.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned Facebook faces several unresolved challenges in the country.

BSR said that locating staff in Myanmar, for example, could improve Facebook’s understanding of how its services are used locally, but warned those workers could be targeted by the country’s military, which has been accused of ethnic cleansing of the Rohingya.

Some 700,000 Rohingya fled their homes last year after a brutal army crackdown.