Facebook failed to prevent its platform from being used to auction off a 16-year-old girl for marriage in South Sudan.

Child early and forced marriage (CEFM) is the most commonly reported form of gender-based violence in South Sudan, according to a recent Plan International report on the myriad risks for adolescent girls living in the war-torn region.

Now it seems girls in that part of the world have to worry about social media too.

Vice reported on the story in detail yesterday, noting that Facebook took down the auction post, but only after the girl had already been married off, and more than two weeks after the family first announced its intention to sell the child via the platform, on October 25.

Facebook said it first learned about the auction post on November 9, and that it took the post down within 24 hours of that. It’s not clear how many of those 24 hours Facebook needed to decide to remove the post.

A multimillionaire businessman from South Sudan’s capital city reportedly won the auction after offering a record “price” — of 530 cows, three Land Cruiser V8 cars and $10,000 — to marry the child, Nyalong Ngong Deng Jalang.

Plan International told Vice it’s the first known incident of Facebook being used to auction a child bride.

“It is really concerning because, as it was such a lucrative transaction and it attracted so much attention, we are worried that this could act as an incentive for others to follow suit,” the development organization told Vice.

A different human rights NGO, H4Human (@h4humanrights), posted to Twitter a screengrab of the deleted auction post, writing: “Despite various appeals made by human rights group, a 16 year old girl child became a victim to an online marriage auction post, which was not taken down by Facebook in South Sudan.”


We asked Facebook to explain how it failed to act in time to prevent the auction and it sent us the following statement, attributed to a spokesperson:

Any form of human trafficking — whether posts, pages, ads or groups is not allowed on Facebook. We removed the post and permanently disabled the account belonging to the person who posted this to Facebook. We’re always improving the methods we use to identify content that breaks our policies, including doubling our safety and security team to more than 30,000 and investing in technology.

The more than two-week delay between the auction post going live and the auction post being removed by Facebook raises serious questions about its claims to have made substantial investments in improving its moderation processes.

Human rights groups had directly tried to flag the post to Facebook. The auction had also reportedly attracted heavy local media attention. Yet it still failed to notice and act until weeks later — by which time it was too late because the girl had been sold and married off.

Facebook does not release country-level data about its platform so it’s not clear how many users it has in the South Sudan region.

Nor does it offer a breakdown of the locations of the circa 15,000 people it employs or contracts to carry out content review duties across its global platform (which has more than 2 billion users).

Facebook admits that its content reviewers do not speak every language spoken where its platform is used, nor even every widely spoken language in the world. So it’s highly unlikely the company has any reviewers at all with a strong grasp of the indigenous languages spoken in the South Sudan region.

We asked Facebook how many moderators it employs who speak any of the languages in the South Sudan region (which is multilingual). A spokeswoman was unable to provide an immediate answer.

The upshot of Facebook carrying out retrospective content moderation from afar, relying on a tiny number of reviewers (relative to its total users), is that the company is failing to respond to human rights risks as it should.

Facebook has not established on-the-ground teams across its international business with the linguistic and cultural sensitivity needed to respond directly, or even quickly, to risks created by its platform in every market where it operates. (A large proportion of its reviewers are based in Germany, which passed a social media hate speech law a year ago.)

AI is not going to fix that very hard problem either, not on any human timescale. In the meantime, Facebook is letting actual humans take the strain.

But two weeks to notice and take down a child bride auction is not the kind of metric any business wants to be measured by.

It’s increasingly clear that Facebook’s failure to invest adequately across its international business to oversee and manage the human rights impacts of its technology tools can have a very high cost indeed.

In South Sudan a lack of adequate oversight has resulted in its platform being repurposed as the equivalent of a high-tech slave market.

Facebook also continues to be on the hook for serious failings in Myanmar, where its platform has been blamed for spreading hate speech and accelerating ethnic violence.

You don’t have to look far to see other human rights abuses being aided and abetted by access to unchecked social media tools.