Since YouTube keeps its recommendation algorithm under wraps, it has been impossible for the regular YouTube user to understand whether, and how, the platform funnels our collective attention toward certain talking points and perspectives — until now.

Enter AlgoTransparency, a program led by Guillaume Chaslot, an ex-Google engineer who was fired after “agitating for change within the company,” according to The Guardian.

After his departure, Chaslot, who spent time working on YouTube’s recommendation engine and on Google’s display advertising, developed AlgoTransparency as a way to extract data from the video-sharing site and provide a snapshot of its recommended content.

For reference, recommended videos on YouTube appear around the content you are viewing; their purpose is to get users to click and watch additional videos, keeping them on the platform longer.

Why is understanding YouTube’s recommendation algorithm critical for all of us? Because recommended videos account for 70% of what we watch on YouTube, according to the company’s product chief, Neal Mohan.

“We focused a lot in the last several years on machine learning and artificial intelligence to learn what our users like and make,” Mohan said. “Our job is to give the user a steady stream, almost a synthetic or personalized channel.”

This means that the content viewed by YouTube’s nearly 2 billion monthly users is inherently connected to artificial intelligence that routinely prioritizes “watch time” and clicks over credible information, as this essay will illustrate.