This article originally appeared on VICE US.

At the end of February, Facebook launched an ad campaign in the Indian state of Maharashtra that was designed to inform users about resources available to protect against harassment and hate speech.

But instead of informing the public, it enraged them.

Facebook’s mistake: portraying a troll as a member of the lower Dalit caste, an oversight that essentially reinforced ugly stereotypes against the very group that is most discriminated against on its platform. Hundreds of users reported the ad as hate speech, and within a day the ad was removed. But the incident symbolized Facebook’s mounting failures in its biggest market, particularly when it comes to the spread of harassment and hate speech.

In fact, despite Facebook’s efforts, it has barely made a dent in that department: 93 percent of the hate speech posts reported to Facebook by the monitoring group Equality Labs remain on the platform, including content advocating violence, bullying, and the use of offensive slurs, according to a new report from the South Asian advocacy group, which is dedicated to ending caste-based discrimination, Islamophobia, and religious intolerance.

Facebook's inability to curb hate speech is disproportionately harming India's Muslim minorities and at times spilling over into real-world violence, according to the report, which draws worrying comparisons between the situation in India and the platform's failures in Myanmar, where it was used to fuel violence against the Rohingya Muslim minority.

“Facebook has failed its caste, gender, and religious minority users,” Thenmozhi Soundararajan, one of the authors of the report, told VICE News. “By its own community standards, it has not fulfilled the bare minimum required to ensure that hate speech and disinformation does not become normalized in the platform.”

Overrun by Islamophobia

Facebook has faced near ceaseless criticism at home and abroad for the often-unchecked megaphone it provides to hate mongers and merchants of disinformation. In India, those flaws appear super-charged and directed primarily at one community: Muslims. According to the report:

Islamophobic content was the biggest source of hate speech on Facebook in India, accounting for 37 percent of the content reported by Equality Labs. Fake news (16 percent), casteism (13 percent), and gender/sexuality hate speech (13 percent) were the next biggest categories.

43 percent of the hate speech Facebook initially removed was restored within 90 days, and 100 percent of these restored posts were Islamophobic in nature.

Facebook repeatedly states that it responds to the majority of reports in under 24 hours, but Equality Labs found that the median response time in India was 48 hours.


Facebook said it has removed some of the content Equality Labs flagged as breaching its Community Standards, though it has not seen the full report. But the company did not respond to a question about why so much of the content that was removed later reappeared on the platform.

Overall, the researchers pinned the blame squarely on Facebook, which they described as ill-equipped and unprepared to deal with the torrent of hate speech on its platform. With almost 300 million active accounts and potentially hundreds of millions more still to join, India is Facebook’s biggest market, and its most challenging, with unique obstacles to overcome. “Indian religious and socio-political contexts are complex enough to require their own review and co-design process to adequately address safety,” the report said.

But instead of tailoring a solution to India’s specific challenges, the company continues to rely on community standards and practices designed for Western markets that don’t reflect Indian realities, Equality Labs says.

The problem is two-fold.

First, Facebook’s moderators have not been trained to properly understand the nuance and cultural context of posts in dozens of languages, Equality Labs said.

Second, Facebook supports only eight of India’s 22 official languages, so its community standards and reporting mechanisms are often available only in English, leaving many users with no way to flag hate speech at all. To paper over the cracks, Facebook continues to rely on an army of volunteer translators to handle issues in the languages it doesn’t support.

“If they have enough money to enter the market shouldn't they have enough money to protect the users in those markets, particularly as they make money off the violence they face?” Soundararajan said.

The rise of Islamophobic hate speech on Facebook has coincided with a rise in real-world violence against Muslims in India, which has been fomented in part by increasingly divisive national politics. According to a recent study, Muslims were the victims of 59 percent of cases of religiously motivated violence — even though they make up less than 15 percent of the population.

Considering the current environment in India, Facebook has no excuse not to have had a better response plan in place to address Islamophobia, said Soundararajan, nor should it have been surprised, particularly in the wake of the atrocities in Myanmar.

“As early as 2013, Facebook knew the content on its platform could lead to large-scale communal riots,” Soundararajan said. She points to Facebook’s role in helping to instigate the Muzaffarnagar riots, which left more than 50 people dead and over 75,000 displaced from their homes. “Many say these riots were sparked by videos which were spread in part on Facebook.”

Pepe the Frog travels to Uttar Pradesh

The report highlights a range of hate speech that circulates on Facebook in India. Among the most surprising was the proliferation of Pepe the Frog, the image favored among American white supremacists. In India, the internet meme was used to glorify the 1992 desecration of the Babri Masjid mosque in the Ayodhya district of Uttar Pradesh state by Hindu nationalist mobs, an act that triggered riots across India and the killing of hundreds of innocent Muslims.


The use of Pepe the Frog, considered an anti-Semitic hate symbol by the Anti-Defamation League, shows the common language of hate speech across the globe. Facebook knows this too: documents uncovered by Motherboard a year ago show the company has a specific policy for Pepe that doesn’t ban the image outright but deletes it when it appears “in the context of hate, endorsed by hate groups to convey hateful messages.”

The report also reveals a worrying crossover with the hate speech problems Facebook encountered in Myanmar. According to Equality Labs, 6 percent of all Islamophobic posts researchers examined were anti-Rohingya posts. Facebook users labeled Rohingya “cockroaches” and posted screenshots from a debunked video claiming to show Rohingya slaughtering and cannibalizing Hindus.

When the video was removed from Facebook and WhatsApp, users got around the ban by posting graphic screenshots from the video, some as recently as during last month’s Lok Sabha elections.

“Clearly something is wrong with Facebook moderation when it comes to Rohingya centered hate speech and given the precarious conditions Rohingya face in India and across South Asia, this issue must be dealt with immediately,” the report says.

Ultimately, the problems facing Facebook in India stem from its failure to engage with local activists and advocacy groups, Equality Labs said. And simply hiring more staff won’t solve the problem.

“Facebook staff lacks the cultural competency needed to recognize, respect, and serve caste, religious, gender, and queer minorities,” the report says. “The hiring of Indian staff alone does not ensure cultural competence across India’s multitude of marginalized communities.”

Facebook did engage to some extent with activists in India; at the company’s South Asian Safety Summit, held in Delhi last fall, Equality Labs presented an early draft of its findings. But the process was “slow and often times did not address the structural problems our report outlines,” Soundararajan said.

The activists are now calling on Facebook to conduct an independent, third-party human rights audit on the problems in India, similar to the civil rights audit it is conducting in the U.S.

“Facebook is complicit with the extremism that is pulling apart Indian society and it must act before it is too late,” Soundararajan said.