Teachers describe classrooms made unruly by students who quote from YouTube conspiracy videos or who, encouraged by right-wing YouTube stars, secretly record their instructors.

Some parents look to “Dr. YouTube” for health advice but get dangerous misinformation instead, hampering the nation’s efforts to fight diseases like Zika. Viral videos have incited death threats against public health advocates.

And in politics, a wave of right-wing YouTube stars ran for office alongside Mr. Bolsonaro, some winning by historic margins. Most still use the platform, governing the world’s fourth-largest democracy through internet-honed trolling and provocation.

YouTube’s recommendation system is engineered to maximize watchtime, among other factors, the company says, but not to favor any political ideology. The system suggests what to watch next, often playing the videos automatically, in a never-ending quest to keep us glued to our screens.

But the emotions that draw people in — like fear, doubt and anger — are often central features of conspiracy theories, and in particular, experts say, of right-wing extremism.

As the system suggests more provocative videos to keep users watching, it can direct them toward extreme content they might otherwise never find. And it is designed to lead users to new topics to pique their interest — a boon for channels like Mr. Moura’s that use pop culture as a gateway to far-right ideas.
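To make that dynamic concrete, here is a minimal, invented sketch — not YouTube’s actual system — of a greedy recommender that always serves whichever candidate has the highest predicted watch time. It assumes, purely for illustration, that viewers stay longest on content slightly more intense than what they have already acclimated to; every title, field and number below is made up.

```python
# Illustrative toy only: a watch-time-maximizing recommender that escalates.
candidates = [
    {"title": "Pop culture recap",    "intensity": 0.2},
    {"title": "Edgy commentary",      "intensity": 0.5},
    {"title": "Outrage monologue",    "intensity": 0.7},
    {"title": "Conspiracy deep dive", "intensity": 0.9},
]

def predicted_watch_time(video, viewer_level):
    # Assumed engagement model (invented numbers): viewers watch longest when a
    # video is a bit more intense than their current baseline, and drop off if
    # the jump is too large.
    gap = video["intensity"] - viewer_level
    return 1.0 - abs(gap - 0.2)

viewer_level = 0.1   # a viewer arriving via a mild pop-culture video
pool = list(candidates)
session = []
while pool:
    # Greedy choice: whatever keeps the viewer watching longest wins,
    # regardless of what the content actually says.
    pick = max(pool, key=lambda v: predicted_watch_time(v, viewer_level))
    session.append(pick["title"])
    viewer_level = pick["intensity"]   # acclimation raises the baseline
    pool.remove(pick)

print(session)
# ['Pop culture recap', 'Edgy commentary', 'Outrage monologue', 'Conspiracy deep dive']
```

Under those assumptions, maximizing watch time at each step produces the escalating sequence researchers describe, even though no step in the code prefers any ideology.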

The system now drives 70 percent of total time on the platform, the company says. As viewership skyrockets globally, YouTube is bringing in over $1 billion a month, some analysts believe.