YouTube has already employed a number of strategies to suppress content its owners have branded "conspiratorial", "hateful" or otherwise contrary to the company's Silicon Valley value system - including demonetizing videos and deplatforming controversial content creators like InfoWars. Yet criticism that YouTube is a breeding ground for these non-mainstream, Overton-window-expanding ideas has persisted. Now, the company is trying something that could harm its treasured watch-time metric: tweaking its algorithm to stop users from being ushered into conspiratorial "rabbit holes" on everything from flat-Earth theories to the 9/11 truther movement to purveyors of miracle cures.

According to The Daily Beast, the streaming video website is tweaking its recommendations algorithm to pass over content that "comes close" to violating - but doesn't explicitly violate - YouTube's "community guidelines." The company estimates that this will affect fewer than 1% of all videos posted on the site.

To be clear, these videos will still appear on YouTube, and they can still be displayed in search results. The only thing that will change, according to the company, is their placement in the recommendations bar or queue.

This could be a huge blow to Flat Earthers and others who count YouTube as their biggest recruitment tool. But then again, that's the whole point: YouTube said the policy strikes an adequate balance between free speech and the public interest.

At the second annual Flat Earth International Conference in November, most participants told The Daily Beast they'd converted to Flat Earth belief after watching YouTube videos on the topic. Some said they'd started watching videos on conspiracies like 9/11, and eventually saw Flat Earth videos recommended in their YouTube feeds; others said they went looking for Flat Earth videos and were recommended a stream of new ones. Guillaume Chaslot, a former YouTube employee who worked on the site's recommendation algorithm in 2010, previously told The Daily Beast that the algorithm can push people down conspiratorial rabbit holes. "I realized really fast that YouTube's recommendation was putting people into filter bubbles," Chaslot said last year. "There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won't even recommend them."

YouTube has also faced criticism for the prevalence of far-right videos in its recommendations. A BuzzFeed investigation on Thursday found that, over the course of nine recommendations, YouTube took a viewer from a non-partisan clip about Congress to an anti-immigrant video uploaded by a hate group. (When The Daily Beast tried a similar experiment in a cookie-free Incognito browser last month, it took four clicks to travel from a recommended video on YouTube's homepage to a video on the far-right "red pill" theory.)

However, there's at least one group that disagrees: The Flat Earth Society.

The Flat Earth Society condemned YouTube's decision. "While it's unfortunate that this will no doubt affect some of the most prominent Flat Earth content creators, the Flat Earth Society has been prepared for years. Any social network can pull the rug from under your feet if it decides that your content is no longer welcome - which is why we've never relied on these businesses too much," the group told The Daily Beast. "Who knows - perhaps it's time to start looking into a video sharing service of our own."

But don't worry: We're sure YouTube's latest crackdown won't arbitrarily and unintentionally punish conservative voices and other content creators falsely labeled as "Nazis" and "conspiracy theorists".