Groups that have worked with Facebook to flag dangerous content reveal it took more than four days for it to respond when messages started circulating

A consortium of civil society, human rights and monitoring groups in Myanmar have criticised Mark Zuckerberg’s response to the spread of hate speech on the platform and accused the social media giant of failing to act quickly enough to curtail dangerous messages that incited violence inside the country.

Earlier this week, Zuckerberg told Vox that Facebook’s systems had detected a pair of chain letters spreading around Myanmar on Facebook Messenger last year. One warned of an imminent attack by Muslims on 11 September.


“That’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm,” Zuckerberg said. “Now, in that case, our systems detect that that’s going on. We stop those messages from going through.”

However, the groups, which have worked with Facebook to flag dangerous content, have revealed it took more than four days for the company to respond when the messages started circulating online during the Rohingya crisis.

In an open letter addressed to Facebook’s chief executive, they accused the company of being ill-equipped to deal with the risks posed to society by the network.

The six organisations who wrote the letter said they did the heavy lifting during the emergency escalation of concerns over the messages.

“We believe your system, in this case, was us – and we were far from systematic,” they said. “We identified the messages and escalated them to your team via email on Saturday the 9th September, Myanmar time. By then, the messages had already been circulating for three days.”

One message claimed that the Rohingya, referred to by the racist term Kalar, “are planning to launch a Jihad on 11 September. Warn your friends. The order to get ready with guns has already been issued in the army.” The post, shared with hundreds of thousands of people across the country, urged recipients to forward the message on to friends and family.

At the same time, messages targeting the Muslim community warned: “On 11 Sept in Yangon, MaBaTha and extremist nationalists will collaborate and they will launch an anti kalar movement.”


The group said Facebook failed to stop the dissemination of the messages. “Far from being stopped, they spread in an unprecedented way, reaching country-wide and causing widespread fear and at least three violent incidents in the process,” the letter said.

The rumours of an imminent attack spread on Facebook Messenger and were “felt across the country”, internet freedom activist Htaike Htaike, director of the Myanmar ICT for Development Organisation, told the Guardian.

“As a result, you could feel it on the streets, as there were less people commuting, parents pulling their children out of school as well.”

The six organisations that signed the letter said they were surprised that Zuckerberg had praised the effectiveness of Facebook’s “systems”.

Nearly 700,000 Rohingya Muslims fled Myanmar for Bangladesh in the second half of 2017. Photograph: Bernat Armangue/AP

“Your team did not seem to have picked up on the pattern,” they stated. “For all of your data, it would seem that it was our personal connection with senior members of your team which led to the issue being dealt with.”

The group said a major obstacle was the lack of Burmese-speaking Facebook staff. “We were lucky to have an English-speaking foreigner who was confident and connected enough to escalate the issue. This is not a viable or sustainable system, and is one which will inherently be subject to delays,” the letter said.

The group said Facebook was also reluctant to engage with local stakeholders. Despite informal briefings held when Facebook representatives visited Myanmar, the groups claim the company has so far been unwilling to address problems with its emergency response processes.

“Your engineering team should be able to detect duplicate posts and ensure that identified hate content gets comprehensively removed from your platform,” the groups wrote. “We’ve not seen this materialise.”

A Facebook spokesperson said: “We don’t want Facebook to be used to spread hatred and incite violence, and we are very grateful to the civil society groups in Myanmar who have been helping us over the past several years to combat this type of content.

“We are sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages. We took their reports very seriously and immediately investigated ways to help prevent the spread of this content. We should have been faster and are working hard to improve our technology and tools to detect and prevent abusive, hateful or false content.”