Via The Anti Media,

In a move that’s baffling at best and rather appalling at worst, Facebook has been busted asking users if they think it’s alright for adults to solicit “sexual pictures” from minors on its platform.

While this may sound ridiculous on the surface — because it is — nevertheless, it happened.

On Sunday, the social media behemoth sent surveys out to a group of its users with questions on the issue of child grooming, the process of adults befriending children for the purposes of sexual abuse or other nefarious ends like trafficking and prostitution.

And asked this … and I’m like, er, wait, is making it secret the best Facebook can offer here? Not, y’know, calling the police? pic.twitter.com/t2UZuKalfk — Jonathan Haynes (@JonathanHaynes) March 4, 2018

“There are a wide range of topics and behaviours that appear on Facebook,” began one of the questions. “In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”

Respondents’ answer options ranged from “this content should be allowed on Facebook, and I would not mind seeing it” to “this content should not be allowed on Facebook, and no one should be able to see it.” Survey takers were also allowed to select that they have “no preference” on the subject.

In a follow-up question, the tech company asked users who the arbiter of such content and behavior should be. Answer options ranged from “Facebook decides the rules on its own” to “Facebook users decide the rules by voting and tell Facebook.” Others involved getting input from outside experts.

Strangely, neither of the two questions gave survey takers the choice to suggest that law enforcement should be alerted to the situation.

It didn’t take long for the media to catch on. The digital editor for the Guardian, Jonathan Haynes, flagged the issue on Twitter. He got a response from Facebook’s VP of Product, Guy Rosen, who called the inclusion of such questions a “mistake” that shouldn’t have happened:

“We run surveys to understand how the community thinks about how we set policies. But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn’t have been part of this survey. That was a mistake.”

A statement from Facebook shared with the media struck a similarly apologetic tone but also contained some defensiveness:

“We sometimes ask for feedback from people about our community standards and the types of content they would find most concerning on Facebook. We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey. We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”

Speaking to the Guardian, British Parliament member Yvette Cooper, chair of the Home Affairs Select Committee, roundly condemned Facebook’s move:

“This is a stupid and irresponsible survey. Adult men asking 14-year-olds to send sexual images is not only against the law, it is completely wrong and an appalling abuse and exploitation of children. I cannot imagine that Facebook executives ever want it on their platform but they also should not send out surveys that suggest they might tolerate it or suggest to Facebook users that this might ever be acceptable.”

Andy Burrows, associate head of child safety for the National Society for the Prevention of Cruelty to Children, told Newsweek that “Facebook’s decision to crowdsource views on how to deal with a criminal offence is hugely concerning.”

The move, and the backlash, come as social media companies face increased pressure to moderate the content on their platforms. Given that context, TechCrunch notes that it’s “hard to fathom” what Facebook was thinking with such a survey.

Further, the outlet highlights, the incident shows that the company would much rather lay the responsibility of content moderation on its users: