Facebook last week held a two-day meeting with academics, researchers, and civil society organizations from Myanmar, the Philippines, Sri Lanka, and elsewhere to discuss misinformation and propaganda, three sources told BuzzFeed News. Dubbed an “Integrity, Safety, and Conflict Roundtable” and held at the company’s Menlo Park headquarters in California, the meeting included significant discussion of how Facebook can do a better job of monitoring its platform for the misinformation and inflammatory rhetoric that has been linked to violence and social discord in those countries. Another key topic: individualized content policies designed to address cultural nuance in regions far from the Silicon Valley offices where Facebook is built and managed.
Facebook confirmed the meeting to BuzzFeed News, describing it as “part of our work to better understand the challenges in these countries and improve our policies, products, and programs.”

As part of the two-day summit, attendees discussed the platform's effects on their respective countries with Facebook executives, engineers, and policy wonks. The hope, according to one attendee, was to bridge the gap between the way Facebook's policies are designed and how they affect real users in countries where the social network has been weaponized.

The meeting was held amid mounting concern over Facebook’s influence and ill effects in countries particularly vulnerable to the misinformation and inflammatory rhetoric that is so easily circulated on the platform. In Myanmar, hate speech amplified via Facebook incited violence against Rohingya Muslims. Similarly, in Sri Lanka, Facebook posts spreading blatant anti-Muslim messages fomented ethnic violence. And in the Philippines, advocates for President Rodrigo Duterte stoked anger and fear on Facebook to provide cover for a state-sanctioned war on drugs.

Attendees of the Sept. 17–18 meeting had differing opinions about its outcome. One described it as “productive,” while another bemoaned a lack of solutions and specific plans of action.

One attendee, however, told BuzzFeed News that there was a tangible shift in Facebook’s approach to these countries. Facebook, this person said, indicated it was willing to discuss and consider country-specific policies and community standards tailored to crucial cultural nuances of the regions, rather than blanket policies for its 2.23 billion users worldwide.

For instance, hypothetically, a piece of inflammatory or violent content that would typically be quickly contextualized or deemed newsworthy in the United States and parts of Europe might be allowed to remain on the platform, while it would be removed in other countries where it is more likely to be quickly decontextualized, weaponized, and reposted.

Facebook’s roundtable, and the idea of country-specific policies for misinformation and inflammatory rhetoric, seemed a small but reasonable step in grappling with the social discord problems plaguing the platform, sources told BuzzFeed News.

But others were more dubious, despite the meeting’s idealistic aspirations. "Facebook has known about [these issues] for years," said Siva Vaidhyanathan, a professor of media studies at the University of Virginia, who was not present at the roundtable. "None of this is a 2018 revelation."

"Facebook knows it’s been part of the problem," Vaidhyanathan continued. "Yet all it does is hold meetings."