
But zoomed out, at the level YouTube is probably more interested in, the rough edges get ironed out. Among the top 50 recommendations, 43 of them were music videos (14), kids’ stuff (11), TV competitions (11), or life hacks (7).

YouTube wants to recommend things people will like, and the clearest signal of that is whether other people liked them. Pew found that 64 percent of recommendations went to videos with more than a million views. The 50 videos that YouTube recommended most often had been viewed an average of 456 million times each. Popularity begets popularity, at least in the case of users (or bots, as here) that YouTube doesn’t know much about.

On the other hand, YouTube has said in previous work describing its algorithm that users like fresher content, all else being equal. But it takes time for a video to build huge numbers of views and signal to the algorithm that it’s worth promoting. So the challenge becomes how to recommend “new videos that users want to watch” when those videos are new to the system and low in views. (Finding fresh, potentially hot videos is important, YouTube researchers have written, for “propagating viral content.”)
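One common way to resolve that tension, in recommender systems generally, is to blend a popularity signal with a freshness bonus that decays as a video ages. The sketch below is purely illustrative: the function name, weights, and half-life are invented, not anything YouTube has disclosed.

```python
import math

def score(views, age_days, freshness_weight=2.0, half_life_days=7.0):
    """Toy ranking score: log-scaled popularity plus an exponentially
    decaying freshness bonus. All weights here are made up for
    illustration; they are not YouTube's."""
    popularity = math.log10(views + 1)
    freshness = freshness_weight * math.exp(-age_days / half_life_days)
    return popularity + freshness

# Between two videos with equal views, the newer one scores higher,
# giving fresh uploads a window to prove themselves.
day_old = score(views=50_000, age_days=1)
year_old = score(views=50_000, age_days=365)
```

The log scale keeps a 456-million-view hit from drowning out everything else, while the decaying bonus gives a brand-new video a temporary boost that expires if early viewers don’t respond.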

Pew’s research reflects this: About 5 percent of the recommendations went to videos with fewer than 50,000 views. The system learns from a video’s early performance, and if it does well, views can grow rapidly. In one case, a highly recommended kids’ video went from 34,000 views when Pew first encountered it in July to 30 million in August.

The behavior of the system was explicable in a few other ways, too, especially as Pew’s software clicked deeper into YouTube’s recommendations. First, as the software made successive choices, the system selected longer videos, as if it recognized that the user was going to be around for a while and started to serve up longer fare. Second, the system began to recommend more popular videos regardless of how popular the starting video was.

These conditions were almost certainly not hard-coded into the algorithmic decision making. Like most Google products, YouTube uses deep-learning neural networks, a kind of software that retunes its outputs based on the data fed into it. It’s not that a YouTube engineer said, “Show people kids’ videos that are progressively longer and more popular,” but rather that the system statistically deduced that doing so would optimize along the dimensions YouTube desires.
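That “retuning from data” can be shown at toy scale: a single weight nudged by gradient descent until its predictions match observed behavior. The data and numbers below are synthetic, and a real system has millions of parameters, but the mechanism is the same — no engineer sets the weight; the data does.

```python
def train(examples, lr=0.001, epochs=50):
    """Fit one weight so that (weight * video length) predicts watch time,
    using plain stochastic gradient descent on squared error."""
    w = 0.0  # weight on video length, in minutes; starts knowing nothing
    for _ in range(epochs):
        for length, watch_time in examples:
            error = w * length - watch_time   # prediction minus observation
            w -= lr * error * length          # gradient step on squared error
    return w

# Synthetic viewing logs: (video length, minutes actually watched).
# In this made-up data, viewers watch roughly half of each video.
data = [(4, 2.1), (10, 4.8), (20, 10.3), (30, 14.9)]
w = train(data)  # converges near 0.5, recovered from the data alone
```

The learned weight is never written into the code; it emerges from the examples. Scale that idea up across many features and outcomes, and you get behavior — longer, more popular videos for engaged users — that no one explicitly programmed.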

Pew’s work has important limitations. YouTube heavily personalizes recommendations based on a user’s history, which is impossible to simulate across the board. What Pew tested for are the recommendations YouTube would serve to an anonymous user. Most YouTube users, though, are logged in and receive recommendations based on their viewing history. Nick Seaver, an anthropologist who studies recommender systems at Tufts University, said that the study assumes that an anonymous user generates a kind of “baseline” that personalization would merely modify around the edges.