On January 16th, Facebook users received an error message when posting in Jinghpaw, a language spoken by Myanmar’s ethnic Kachin and written with a Roman alphabet. “We couldn’t post this. Tap for more info,” the message said. When users tapped, a second message appeared: “Your request couldn’t be processed. There was a problem with this request. We’re working on getting it fixed as soon as we can.”

A Facebook representative told The Verge that the issue was caused by “a bug in our language infrastructure,” and coincided with the launch, the same day, of an updated language identification model supporting ten new languages, including Jinghpaw. The representative said Facebook fixed the issue within hours of receiving reports on January 17th.

“For the Kachin, there are forces looking to monitor any expressions of resistance”

But while the disabling of Jinghpaw was not an active move of censorship, it alerted many Kachin people that Facebook had the capability to identify their language, an alarming thought for the embattled minority group. That realization has evoked a visceral reaction from the Kachin, and brought forth new calls for the company to be more transparent about its technology and the ways it will be used.

“For the Kachin, there are forces looking to monitor any expressions of resistance against the state’s narratives,” said Kachin resident Ja Htoi Pan. “I would like to see more transparency regarding Facebook’s social media monitoring process in Myanmar.”

For the Kachin, a predominantly Christian minority living in the northernmost state of the Buddhist-majority country, feelings of oppression are strong. The Myanmar government does not officially recognize the Kachin national flag or allow it to be raised at state-sponsored events, replacing it with their own version. In the past two years, five Kachin activists were jailed in cases related to freedom of expression, and in September, the Myanmar military threatened charges, later dropped, against an influential Kachin Baptist reverend after he told President Trump that Christians in Myanmar were being “oppressed and tortured by the Myanmar military government.”

In the third quarter of 2019, Facebook removed 7 million pieces of hate speech content

That history of marginalization and civil war has left many Kachin wary of even the smallest hints of surveillance. Experts estimate that there are roughly 1.1 million mother-tongue Jinghpaw speakers — a tiny sliver in a nation of over 50 million — and crackdowns on speech have sometimes been a prelude to more troubling measures. Institutionalized monitoring and censorship were the norm in Myanmar under the military regime, which transitioned to a more democratic rule in 2011. Online defamation is still criminalized and the law is often used by government officials and the military to silence dissent. The government has also recently targeted mobile internet access at large, with one still-expanding shutdown in place since June 2019 in a conflict-torn area on the opposite side of the country from Kachin.

Facebook gained widespread attention in relation to Myanmar following mass violence against Rohingya Muslims in late 2017. The company has since invested heavily in its Myanmar-focused operations, and now has more than 100 Myanmar-speaking content reviewers and the ability to review content in some of the country’s ethnic languages, including Jinghpaw. Since that time, Facebook has also removed hundreds of pages that violate its community standards, and banned the Myanmar military’s commander-in-chief from the site.

Today, Facebook’s content moderation is increasingly done through automated flagging and removal. In the third quarter of 2019, Facebook removed 7 million pieces of hate speech content from its global platform, with 80 percent caught proactively using artificial intelligence. During congressional hearings in April 2018, when Mark Zuckerberg was questioned on Facebook’s failure to effectively stop hate speech in relation to the Rohingya crisis, he suggested that artificial intelligence would “be the scalable way to identify and root out most of this harmful content.”

“Sooner or later, Facebook is going to systematically monitor ethnic users”

Yet many Kachin are concerned about the ways the technology will be used. When they found they could not post in Jinghpaw, without knowing the reason, many were outraged. “Sooner or later, Facebook is going to systematically monitor ethnic users, to see whether our posts are politically sensitive,” said one member of the Kachin public named Labya La Doi.

Facebook plans to launch a new oversight board this summer, with provisions to include a member from the affected region in each moderation decision. But it remains to be seen how well versed those board members will be in the concerns of smaller minority groups such as the Kachin.

Ms. Yin Yadanar Thein, director of the rights group Free Expression Myanmar, said much of Facebook’s data and activities in the country remain opaque. “We encourage Facebook to be more transparent. Facebook has so much power in Myanmar, but the public knows almost nothing about the decisions it is making. As a result, nobody really knows how much content is removed or why.”

When asked for Myanmar-specific data on content removal, Facebook told The Verge that Myanmar’s users have historically shown lower levels of content violation reporting than other countries, leading the company to invest in proactive detection technology, including “hate speech classifiers,” which are now fully operational in the language of Myanmar. In the third quarter of 2018, this technology identified 63 percent of approximately 64,000 pieces of content in Myanmar which were removed for violating hate speech policies; in the fourth quarter of 2018, 68 percent of 57,000 pieces of hate speech were proactively identified and removed. Facebook’s transparency page reports that the Myanmar government made five requests for user data since 2013, but Facebook did not provide any data in those cases.

Another concern raised by some Kachin is the extent to which Facebook’s Myanmar-focused team represents the country’s diverse minorities. Facebook’s decision to ban the pages of four ethnic armed groups, including the Kachin Independence Army, and all related praise, support, and representation in February 2019 still holds, despite local outcry. Facebook’s representative told The Verge the company has made major efforts to ensure a positive experience for ethnic minorities from Myanmar, and to maintain a two-way dialogue on these issues.

As part of those efforts, Facebook has held a series of eight workshops across the country in recent months, including one in Kachin on January 18th — but local experience of the workshop was not entirely positive. Kachin workshop attendee Naw Htoi told The Verge that the workshop’s facilitators deferred detailed questions about the Jinghpaw incident, and he left feeling that “Facebook started investing in Jinghpaw language to exert more control and watch over us.”

Saijai Liangpunsakul, social impact director at Phandeeyar, a Yangon-based organization focused on using technology for social change, said that work remains to strengthen trust and bridge gaps between civil society and Facebook. “Facebook shouldn’t see civil society on the other side of the table … Facebook is putting in all these resources, but the problem is we need to find ways to meaningfully work together.”

“Social media brought opportunities and hope, but it also brought harm to the country,” she continued. “Myanmar had been isolated for many years. People were excited about having access to information and being connected to the rest of the world ... Looking back, there was a big change, and as a country, [Myanmar] wasn’t really prepared for the harm and opportunities that would come with it.”