TUNIS, Tunisia — Facebook failed to delete hundreds of memes, images, and posts targeting caste, LGBT, and religious minorities in India that human rights researchers reported over a yearlong period.

Facebook has come under heavy scrutiny in the US and Europe for the way in which it has handled political misinformation and user privacy. But elsewhere in the world, it has faced even tougher criticism for doing too little to moderate non-English-language content that has demonized minority groups and, in many cases, fanned the flames of communal violence. In Myanmar, Facebook admitted shortcomings — as it has elsewhere — after its policies were cited for exacerbating ethnic cleansing, and promised to reform its processes, including by hiring more content moderators.

But Equality Labs, a South Asian American advocacy group focusing on technology and human rights, said in a new report that Facebook had made little progress on these issues in India — home to some 300 million Facebook users — including during India’s 2019 general election. The report’s authors, who studied 1,000 posts over the past year, stated that widespread doxxing of activists and journalists takes place on the platform.

“Without urgent intervention, we fear we will see hate speech weaponized into a trigger for large-scale communal violence,” the report, being launched here at the RightsCon conference, says. “After a year of advocacy with Facebook, we are deeply concerned that there has been little to no response from the company.”

In a statement, a spokesperson for Facebook said the company respects and seeks to protect the rights of marginalized communities in India and elsewhere, and pointed to its rules against hate speech.

“We take this extremely seriously and remove this content as soon as we become aware of it,” the spokesperson said. “To do this, we have invested in staff in India, including content reviewers, with local language capabilities and an understanding of the country’s longstanding historical and social tensions.” The company has made “significant progress” in proactively detecting hate speech on its platform before it’s reported, the spokesperson added.

At the heart of the issue is Facebook’s approach to policing problematic content on its site, especially targeted harassment and calls for violence against minority groups. Civil society groups have repeatedly called for the company to invest more in hiring moderators proficient in local languages and to be more transparent in its process. Despite months of criticism, activists say it is still clunky and difficult to report problematic content on Facebook, and it’s usually unclear why some posts are deleted and others left up.

India is the largest market for Facebook in the world by number of users, and the social network serves as a primary source of news and information for many there.

The report highlights a meme featuring Pepe the Frog depicted as a Hindu nationalist and standing approvingly in front of a centuries-old mosque demolished by a Hindu mob in 1992, as well as posts containing anti-Muslim and anti-Dalit slurs. Dalits are at the bottom of the Hindu caste system and face heavy discrimination in India despite laws intended to protect their rights. Another post, on an Indian meme-swapping Facebook group, called a baseball bat an “educational tool” for wives.