YouTube is adjusting its recommendation engine in an effort to lower the reach of videos that misinform users.

This type of content can't be removed outright because it doesn't technically violate YouTube's community guidelines.

YouTube will instead reduce the visibility of borderline content by not displaying the videos as recommendations.

“… we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

Users will still be able to find these videos by searching for them directly. The change affects only recommendations, such as those that appear on the home page.

The one exception is for subscribers: YouTube may still recommend these videos to users who subscribe to channels that publish borderline content.

According to YouTube, less than one percent of its videos qualify as borderline content.

“While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community.”

YouTube will use a combination of machine learning and human evaluators to determine which videos should not appear as recommendations.

The rollout will be gradual, initially affecting only a small set of videos in the US.

YouTube will roll out the change to more countries as its recommendation system becomes more accurate.