Misinformation has led to violent attacks against Rohingya but report says company has been slow to respond

Facebook’s efforts to crack down on hate speech in Myanmar, which has contributed to violent attacks against the minority Muslim population, have been inadequate, according to a Reuters investigation.

The social media company has faced warnings since 2013 from human rights groups and researchers that its platform was being used to spread misinformation and promote hatred of Muslims, particularly the Rohingya. As Facebook has grown its user base in the country to 18 million, hate speech has exploded, but the company has been slow to respond to the growing crisis.

Reuters and the Human Rights Center at UC Berkeley School of Law found more than 1,000 examples of posts, comments, images and videos attacking Myanmar’s Muslims – including some material that had been on the site for six years – still live on the platform until Reuters reported them to Facebook last week.


One post, published in December 2013, featured a picture of Rohingya-style food and the message “We must fight them the way Hitler did the Jews, damn kalars!”, using a derogatory term for the Rohingya. Another user commented on a blogpost depicting a boat full of Rohingya refugees arriving in Indonesia: “Pour fuel and set fire so that they can meet Allah faster.”

Other posts used dehumanising language, describing Rohingya or other Muslims as dogs, rapists and maggots and calling for them to be shot or exterminated. There were also pornographic anti-Muslim images. Facebook’s community standards prohibit pornography and posts that attack ethnic groups with violent or dehumanising speech or compare them to animals.

In April, shortly after United Nations investigators condemned Facebook’s role as a vehicle for “acrimony, dissension and conflict” in Myanmar, Mark Zuckerberg told US senators that the company was hiring dozens more Burmese-speaking content moderators to review hate speech.

For many people in Myanmar, Facebook is the internet. It is one of the primary ways people get news and entertainment online, as well as a key messaging service. Its growth has been fuelled by the fact that it is zero-rated by some of the country’s mobile phone operators, meaning people don’t have to pay data charges to use it.

For many people in Myanmar, Facebook is the internet. Photograph: Leah Millis/Reuters

Facebook typically relies on users reporting hate speech, but due to a technical quirk in the way most Burmese websites render their fonts, the company’s systems have struggled to interpret Burmese text. Also, Facebook’s reporting tools – including the text in drop-down menus on specific posts – were only translated into Burmese in late April and early May this year. Until that happened, anyone wanting to report a post would have had to do so in English.

Facebook doesn’t have a single employee in Myanmar, where it has 18 million users – roughly the same number as in Spain. The company monitors hate speech through a contractor in Kuala Lumpur, in a secret operation codenamed “Project Honey Badger”, the Reuters investigation revealed. The operation has about 60 people reviewing reported content posted in Myanmar.

In July the company announced a new policy to remove misinformation used to incite physical harm, starting with Sri Lanka where mob violence against Muslims has been sparked by posts on Facebook. The company said it would roll this out to Myanmar as well.


Facebook has identified and removed several hate figures and groups from the platform, including the extremist Buddhist monks Ashin Wirathu, Parmaukkha and Thuseitta, known for hate speech against Rohingya. It has also deleted pages linked to the monk-led nationalist group Ma Ba Tha – the Association for the Protection of Race and Religion.

In a blogpost published 12 hours after the Reuters investigation, Facebook revealed that in the second quarter of 2018 the company proactively identified (rather than relying on user reports) about 52% of the content it removed for hate speech in Myanmar – up from 13% in the last quarter of 2017.

The same blogpost said the company would add 40 more Myanmar-language experts to its existing 60 by the end of the year.

“We have a responsibility to fight abuse on our products. This is especially true in countries like Myanmar where many people are using the internet for the first time, and Facebook can be used to spread hate and incite violence,” said a Facebook spokeswoman. “It’s a problem we were too slow to spot – and why we’re now working hard to ensure we’re doing all we can to prevent the spread of misinformation and hate.”