YouTube is changing its video recommendation algorithm to stop it from promoting conspiracy theories and false information, according to a post on the company’s official blog on Friday.

The company said it would target content that “comes close to – but doesn’t quite cross the line” of violating its rules. Critics contend that YouTube’s rankings have helped conspiracy theorists and fake news propagators surface when users are searching for legitimate news.

YouTube said the change would affect less than one percent of its content, and only English-language videos for now. The targeted videos won’t be deleted, but finding them will require a deeper search or a subscription to the specific conspiracy channels that host them.

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the blog post said.

Conspiracy theories are not banned from YouTube. The company does ban so-called “hate speech,” which it defines as videos that call for violence or hatred of specific groups.

The current recommendation feature suggests videos based on what a user previously watched and how long they spent watching it; likes, dislikes and other signals also factor into the system. Human evaluators are used to review content as well.

“While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community,” said the blog post.