Susan Wojcicki, chief executive officer of YouTube Inc., introduces the company's new television subscription service at the YouTube Space LA venue in Los Angeles, California, U.S., on Tuesday, Feb. 28, 2017.

In coming months, Wojcicki said, conspiracy videos will start including text boxes, dubbed "information cues," that link to third-party sources that debunk the hoaxes in question.

YouTube's recommendation engine and autocomplete feature have come under fire this year for pushing users toward conspiracy theories and other divisive content.

Google-owned YouTube is trying to combat the spread of misinformation on its site, announcing that it will link videos promoting conspiracy theories to "fact-based" sources such as Wikipedia pages.

Wojcicki said the new feature will be used only on conspiracy theories causing "significant debate" on YouTube, such as those about chemtrails or the moon landing. After the school shooting in Parkland, Florida, last month, a video theorizing that one of the survivors was a crisis actor made it into YouTube's trending section. Because that conspiracy theory took off in a matter of days, it isn't clear whether a Wikipedia page disputing it would even have been available yet.

Also, because Wikipedia pages are crowdsourced, a page for a given event may not necessarily be accurate.

Wojcicki was also asked why YouTube can place an outright ban on hateful content, such as videos published by neo-Nazi groups, but not on videos that are untrue.

Wojcicki said that hatefulness is "more clear" than if something is true or false, and that YouTube doesn't want to be an arbiter of truth.

YouTube, like fellow tech giants Facebook and Twitter, has long maintained that it is not a media organization and thus bears less responsibility for the content on its platform.

A YouTube spokesperson told CNBC that the information cues initiative is part of broader efforts to tackle the proliferation of misinformation on the site.