A mom in Brazil became alarmed as she watched the view count on an innocent backyard clip her daughter posted to YouTube suddenly climb into the hundreds of thousands. The child had uploaded a video of herself and a friend playing in the family pool. YouTube's recommendation engine had been suggesting the video to viewers who'd just watched other sexually oriented content. YouTube's AI sexualized her kid and pushed her image to pedophiles. This happens a lot, apparently.

"YouTube's algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids," tweeted Max Fisher at the New York Times.

"YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation."

"I asked YouTube— why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically," says Fisher.

"The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe."

YouTube's CEO is Susan Wojcicki.

From Max Fisher and Amanda Taub at the New York Times:

YouTube's automated recommendation system — which drives most of the platform's billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.

YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content. The result was a catalog of videos that experts say sexualizes children.

"It's YouTube's algorithm that connects these channels," said Jonas Kaiser, one of three researchers at Harvard's Berkman Klein Center for Internet and Society who stumbled onto the videos while looking into YouTube's impact in Brazil. "That's the scary thing."

The video of Christiane's daughter was promoted by YouTube's systems months after the company was alerted that it had a pedophile problem. In February, Wired and other news outlets reported that predators were using the comment section of YouTube videos with children to guide other pedophiles. That month, calling the problem "deeply concerning," YouTube disabled comments on many videos with children in them.

But the recommendation system, which remains in place, has gathered dozens of such videos into a new and easily viewable repository, and pushed them out to a vast audience. YouTube never set out to serve users with sexual interests in children — but in the end, Mr. Kaiser said, its automated system managed to keep them watching with recommendations that he called "disturbingly on point."

And here is YouTube's response today: An update on our efforts to protect minors and families.

There's a lot of yada yada in there. They're pushing an update on the same day the New York Times is publishing this story. Here's a snip from the YouTube blog post about the changes they say they're making to fix this horrible oversight:

Over the last 2+ years, we've been making regular improvements to the machine learning classifier that helps us protect minors and families. We rolled out our most recent improvement earlier this month. With this update, we'll be able to better identify videos that may put minors at risk and apply our protections, including those described above, across even more videos.

More from observers on Twitter and reporters, below.

We talked to child psychologists, sexual trauma specialists, psychologists who work with pedophiles, academic experts on pedophilia, network analysts. They all said YouTube has built a vast audience — maybe unprecedented — for child sexual exploitation, with grave risks for kids. — Max Fisher (@Max_Fisher) June 3, 2019

YouTube, to its credit, said it has been working nonstop on this issue since a similar issue was first reported in February. YT also removed some of the videos immediately after we alerted the company, though not others that we did not specifically flag. — Max Fisher (@Max_Fisher) June 3, 2019

YouTube's algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together. Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn't say it was. — Max Fisher (@Max_Fisher) June 3, 2019

I asked YouTube— why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically. The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe. — Max Fisher (@Max_Fisher) June 3, 2019

Initially, YouTube gave me comment saying that they were trending in that direction. Experts were thrilled, calling it potentially a hugely positive step. Then YouTube "clarified" their comment. Creators rely on recommendations to drive traffic, they said, so would stay on. ? — Max Fisher (@Max_Fisher) June 3, 2019

On a personal note, I found reporting this emotionally straining, far more so than I'd anticipated. Watching the videos made me physically ill and I've been having regular nightmares. I only mention it because I cannot fathom what this is like for parents whose kids are swept up. — Max Fisher (@Max_Fisher) June 3, 2019

As I reported last year with @kbennhold, YouTube's algorithm does something similar with politics. We found it directing large numbers of Germans news consumers toward far-right extremist videos, with real-world implications. https://t.co/f2e61KIO6p — Max Fisher (@Max_Fisher) June 3, 2019

Advertisers saying this kind of stuff is unacceptable and then continuing to buy ads unchanged is the "thoughts and prayers" of digital media https://t.co/Z5XaYYVa93 — Brian Morrissey (@bmorrissey) June 3, 2019

Congratulations to all executives software developers at @YouTube. Your recommender algorithm has figured out how to algorithmically curate sexualized videos of children and to progressively recommend them people who watched other erotic content. https://t.co/eOHScAWomz pic.twitter.com/5oY9ol4Kau — zeynep tufekci (@zeynep) June 3, 2019

YouTube isn't just monetizing the malicious slander of women.

They're not just radicalizing a generation of young men into extremism. They are literally turning innocent kids into fodder for pedophiles. https://t.co/0o6olStmli — Brianna Wu (@BriannaWu) June 3, 2019

"Best minds of our generation" have developed machine learning algorithms that connect pedophiles and amplify white-supremacists, misogynists and anti-vaxxers. Piece says YouTube wouldn't turn off recs because the company said "recommendations are the biggest traffic driver." pic.twitter.com/mrCnkTaHjA — zeynep tufekci (@zeynep) June 3, 2019

Here's my just out piece in @Sciam on YouTube's recommendation algorithms. Chromebooks are about 50 percent of the K-12 market and they come with YouTube. https://t.co/BNXlcaSTRR — zeynep tufekci (@zeynep) June 3, 2019