
YouTube has announced plans to "reduce" the number of conspiracy theory and "misinformation" videos users are exposed to through its algorithmic recommendations.

While the Google-owned video giant has often courted controversy over some of the content that finds its way onto its platform, the company does have policies in place that serve as a guide to what is, and isn't, allowed. Some of these videos are eventually taken down. But then there is content that YouTube refers to as "borderline": it doesn't breach any policies, per se, but at the same time many people would rather not see it. And that is the content YouTube is now looking to scrub from users' "up next" queue.

Anyone who's spent even a short time on YouTube will know its addictive nature: what begins as an innocent 30-second session to watch a prank skit sent by a friend descends into a rabbit hole of never-ending autoplay "recommendations" served up by the data-powered internet gods.

It's in YouTube's interests to keep you there, of course, as the more time you spend on its platform, the more ads you're likely to view. The company also recently added swiping to its mobile app, making it easier to skip to the next recommended video.

"We'll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to - but doesn't quite cross the line of - violating our Community Guidelines," YouTube said in a blog post.

Today's news comes just a week after YouTube won back one of the biggest advertisers in the U.S. AT&T had pulled its ads from YouTube in 2017 after they were displayed alongside extremist content, but it said it was now satisfied that YouTube had sorted out its programmatic advertising systems.

The latest changes will apply only to viewers in the U.S. at first; the company said it's combining human evaluators, subject experts, and machine learning to make these tweaks. More countries will receive the update in the future, according to YouTube.