Jair Bolsonaro is a YouTuber with 2.5 million subscribers. He joined the platform in 2009 and used his channel to espouse his right-wing views and discuss various supposed hoaxes and conspiracy theories, including the claim that the Nazi movement was the work of leftists and liberals. At first, his videos drew only a small audience. But as YouTube grew, as its algorithm was tweaked, and as right-wing ideologies gained traction in Brazil, his channel began to draw as many as 30 million views per month.

This past January, Bolsonaro was sworn in as the 38th president of Brazil.

A longtime representative of the country’s Social Liberal Party, he’s a self-described “proud” homophobe with racist and misogynist ideals who once told a fellow Brazilian policymaker, “I’m not going to rape you, because you’re very ugly.” (For clarification, the Social Liberal Party is in fact conservative, and Brazil’s conservative politicians generally hold views similar to those of their U.S. counterparts.)

As president, he still uploads to his channel several times a week, and is ultimately just one of thousands of right-wing and far-right Brazilian vloggers whose content is propagated by YouTube’s algorithm, according to a study from the Federal University of Minas Gerais, The New York Times reports.

For the study, researchers interested in YouTube’s role in the political radicalization of Brazil programmed a server to go to YouTube and enter a random search term, then watch a video that search term brought up. Then, the server would watch a recommended video based on that video. Rinse and repeat.
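The procedure the researchers describe amounts to a random walk over YouTube’s recommendation graph. A minimal sketch of that idea is below; the video IDs, graph edges, and helper names are invented for illustration and are not the study’s actual data or code:

```python
import random

# Toy stand-in for YouTube's recommendation graph: each video ID maps to the
# videos recommended after it. The "fringe_*" nodes model radical content that
# mostly recommends more of itself (invented data, for illustration only).
RECOMMENDATIONS = {
    "news_clip": ["talk_show", "fringe_a"],
    "talk_show": ["news_clip", "fringe_a"],
    "fringe_a": ["fringe_b", "fringe_c"],
    "fringe_b": ["fringe_a", "fringe_c"],
    "fringe_c": ["fringe_a", "fringe_b"],
}

def crawl(start: str, steps: int, rng: random.Random) -> list[str]:
    """Start from one video, then repeatedly watch a recommended video,
    mimicking the server's watch-then-follow-a-recommendation loop."""
    path = [start]
    current = start
    for _ in range(steps):
        current = rng.choice(RECOMMENDATIONS[current])
        path.append(current)
    return path

path = crawl("news_clip", steps=20, rng=random.Random(0))
fringe_share = sum(v.startswith("fringe") for v in path) / len(path)
```

In this toy graph the fringe cluster is absorbing: once the walk enters it, every subsequent recommendation stays inside it, which is one simple way to picture the “wormhole” dynamic the study and the Times describe.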

The server did this thousands of times. In the end, based on what kinds of videos the server was recommended, researchers got a window into how far-right content amasses an audience more quickly than other types of content. When a video with popular radical or conspiracy-themed ideologies (the earth is flat, the government is purposefully creating transgender people) begins to get views, YouTube’s algorithm begins to recommend that video. Even when the server watched videos about entertainment or politics in general, it was recommended trending radical videos, researchers concluded. The study also showed that once the server watched a single far-right video, it was recommended more, and more, and more — reminiscent of the platform’s ongoing problem with a “wormhole” of videos fetishizing young children.

Along with reporting the study’s findings, the Times also spoke to a number of Brazilians who said YouTube played a part in turning them on to far-right views. Those people included Maurício Martins, an official in Bolsonaro’s party in Niterói, a city in the Rio de Janeiro metropolitan area in southeastern Brazil.

Martins told the Times he was surfing YouTube when it randomly recommended a video by a right-wing vlogger. He was curious, so he watched it. Then, like the Federal University’s server, he was recommended more right-wing videos.

“Before that, I didn’t have an ideological political background,” he told the Times, explaining that YouTube’s recommendations were “my political education,” and that “it was like that with everyone.”

YouTube representatives questioned the Federal University’s study’s methodology and told the Times that YouTube’s algorithm doesn’t give more weight to any one viewpoint. YouTube rep Farshad Shadloo added that the platform has “invested heavily in the policies, resources and products” to stop the spread of misinformation and prioritize what it calls “authoritative content.”

Shadloo added that “we’ve seen that authoritative content is thriving in Brazil and is some of the most recommended content on the site.”