Facebook refused for more than a year to remove a page featuring images of children taken in public under which users posted graphic descriptions of sexual abuse, according to submissions made to India’s supreme court.

A Facebook post advertising rape videos was also permitted to stay online despite being reported several times, the court heard, while police in the western state of Kerala allege another page was being used to run a child-sex ring.

The three cases were raised during an Indian supreme court inquiry into how technology giants including Facebook and Google handle abusive content from India on their platforms.

Last week the supreme court ordered the companies and the Indian government to overhaul their processes for dealing with child-abuse materials and videos depicting rape or gang rape.

The ongoing investigation has raised questions over the adequacy of the moderation systems used by Facebook and others vying to tap a fast-growing and hugely diverse Indian market with users posting in dozens of languages and hundreds of dialects.

Affidavits submitted by Facebook indicated the company does not automatically report the existence of child-abuse material to Indian police – but to American authorities instead – despite a legal requirement to do so under local child-protection laws.

Facebook is aggressively seeking to expand its user base in India, already one of the world’s largest online markets and one forecast to more than double to 829m people by 2021.

But the extent to which the company, and others that trade on the openness of their platforms, have the capacity to monitor how these hundreds of millions of new members will use their services remains to be seen.

Facebook told the court that between March 2016 and August this year it received 7,802 user complaints from India about possible child-abuse material. It said in an affidavit it “investigated all reports and took appropriate action”.

But in the course of hearings, the court heard that Facebook had declined to take down a page whose name, written in Roman script, translates from the southern Indian language Telugu as “little vagina”.

The page featured pictures of women and girls taken in public, apparently without consent, under which users left graphic comments describing sexual acts they wanted to commit.

According to screenshots tendered in court, a user who reported the page was told it did not violate Facebook’s community standards.

Only when the court asked Facebook to remove some of the posts, more than a year after the page was reported, were they finally taken down, according to Aparna Bhat, a lawyer arguing the case for the Indian anti-trafficking NGO Prajwala.

The court also heard the company declined to remove a post listing a mobile phone number users were told they could call to access a video depicting the sexual assault of an actor who was kidnapped in Kerala state in February.

Yet another page, which the court heard was reported but not initially judged to violate community standards, was being used to run a child-sex ring in Kerala state, police allege. More than 40 children were rescued and at least 39 people arrested after the page was reported to Indian authorities.

Facebook said in a statement it was “committed to providing a service where people feel safe”.

“There is no place on Facebook for content that threatens or promotes sexual violence or exploitation, and we work hard to keep it off our platform,” a spokeswoman said.

“We respond to valid law enforcement requests and report apparent child exploitation content to the National Center for Missing and Exploited Children (NCMEC).

“We will continue to work with safety experts and the Supreme Court Committee in India to help combat this abhorrent activity.”

The supreme court last week ordered the technology companies involved in the probe, which also include Yahoo, WhatsApp, Microsoft and Google, to work with the Indian government to expand their list of key words associated with child-abuse material.

Key words in Indian languages and slang must also be added, as should terms associated with rape and gang-rape imagery, the court said.

India’s elite Central Bureau of Investigation has been asked to set up a special unit dedicated solely to handling reports of abusive material circulated on social media or elsewhere.

The Indian government was also ordered to subscribe to the database of the US-based NCMEC, to which Facebook and other companies are required to report online child-abuse material under US law.

NCMEC told the court it has received more than 100,000 reports of abusive material relating to India. But because India is not signed up to NCMEC’s reporting system it is unclear how many have been seen by local police. India relies on Interpol to relay relevant NCMEC reports.

The supreme court’s orders were based on recommendations agreed by a committee that included representatives from Facebook, Yahoo, WhatsApp, Microsoft and Google, as well as lawyers and police.

Other changes sought by child-protection advocates, but opposed by the technology companies, are currently the subject of reporting restrictions but will be considered in private by the supreme court in December.

The supreme court took up the investigation after Prajwala highlighted the existence of hundreds of videos and images of rape and child abuse being shared across social media platforms.

Bhat, who represented Prajwala in the proceedings, questioned why Facebook and other companies complied with US requirements to report child-abuse material, but not those of India.

“If there is reporting of these instances of child pornography to authorities in one country, why can’t they do the same in India?” she said.

She called on web companies to take greater steps to protect vulnerable women and children, even if it meant fundamentally changing the way their services operate.

“If a person is being violated because of the design of your service, and you are hosting a crime on your portal, you need to take another look at it.”

The next hearing in the case is scheduled for 11 December.