YouTube’s recommendation algorithm has been encouraging pedophiles to watch home videos that families upload showing their children playing.

A New York Times report details how YouTube has been exploiting minors through this automated system. According to the Times, researchers at Harvard’s Berkman Klein Center for Internet and Society were studying YouTube’s influence in Brazil when they noticed the alarming effect. The team’s experiment involved a server that followed YouTube recommendations thousands of times, building a sort of map in the process, which showed how YouTube guides users.

When the experiment went down paths of recommendations stemming from videos with sexual themes, the researchers noticed the system served up videos that were “more bizarre or extreme, and placed greater emphasis on youth,” according to the Times. “Videos of women discussing sex, for example, sometimes led to videos of women in underwear or breast-feeding, sometimes mentioning their age: 19, 18, even 16.”

Deeper down the path of this experiment, YouTube reportedly began recommending videos of adults wearing children’s clothing and soliciting payments from “sugar daddies.”

After such softcore fetish recommendations, YouTube then showed actual videos of “partially clothed children,” many of them based in Eastern Europe and Latin America, according to the Times.

These videos often seemed to be home videos uploaded by parents who likely wanted an easy way to share footage of their children with friends and family. But, as the Times suggests, YouTube’s algorithm might have learned from people who look at children in sexually exploitative ways and steered those viewers to the family videos.

The Times interviewed the mother of a 10-year-old child who uploaded a video of herself and a friend playing in a pool. Within several days the video had been viewed 400,000 times.

The mother, Christine C. (last name withheld for privacy), told the Times that when her daughter boasted about the view count, she “got scared by the number of views.”

According to the Times, this incident unfolded after YouTube had to publicly confront its pedophile issues earlier this year. In February, YouTube disabled comments on many videos of minors following reports that pedophiles were commenting on videos of children as a signal to other predators.

While studies have shown that the YouTube recommendation system can create a “rabbit hole effect,” through which the algorithm recommends increasingly extreme content, the company has skirted the topic or denied that it’s real. In May, YouTube’s chief product officer Neal Mohan told the Times, “It is not the case that ‘extreme’ content drives a higher version of engagement or watch time than content of other types.” The company had struck the same note in April, responding to a Bloomberg investigation by claiming that “generally extreme content does not perform well on the platform.”

YouTube did not answer Gizmodo’s request for comment on whether it maintains that the recommendation system doesn’t create a rabbit hole effect. Instead, the company referred Gizmodo to a blog post published today about its “efforts to protect minors” following the Times report on videos that “do not violate our policies and are innocently posted.”

The announcement highlighted YouTube’s recent steps to disable comments on videos featuring minors; restrict minors from livestreaming unless an adult is clearly present; and reduce recommendations of videos that show “minors in risky situations.”

According to YouTube, the company recently improved its machine learning to “better identify videos that may put minors at risk.”

According to the Times, researchers have said that blocking videos of children from being used in the recommendation system would be the best way to protect children. But YouTube told the Times it had no plans to do so anytime soon, since the automated system is the largest driver of traffic and the move would harm creators.