Sociologist Zeynep Tufekci wrote about experiments she performed on YouTube during the 2016 election in an essay published in The New York Times on March 10. She noticed that no matter what kind of political content she searched for, the recommended videos were always more extreme and inflammatory, whether politically or socially. This is a vicious circle, she writes:

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

Tufekci mentions research done by former YouTube engineer Guillaume Chaslot, who worked on the video platform’s recommendation algorithm and recently spoke to CJR about his conclusions. Like Tufekci, he found that the videos being recommended on the site were overwhelmingly contentious and inflammatory, including many that promoted conspiracy theories. That kind of content makes people click and spend more time on the site, which serves Google’s business interests.


Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.