Video hosting giant YouTube says it will take action against a rise in predatory content on its platform showing children in distressing situations.

YouTube, which is owned by Google, currently has more than one billion users across the world and claims that roughly "a billion hours of video" is watched every day.


But a BuzzFeed report published Wednesday (22 November) showed there is an underground faction on the platform that is eagerly watching videos showing children in vulnerable situations – including being tied up or made to "play doctor" with adults.

Some videos racked up "tens of millions" of views and featured children – some in revealing clothing – being put into "gross out" scenarios or kidnap roleplays. The phenomenon has been dubbed "Elsagate" by some media outlets because Elsa, a character from the Disney film Frozen, frequently appears in the videos.

"While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them," YouTube acknowledged in a blog post.

The BuzzFeed report emerged after an activist called Matan Uziel complained about a severe lack of action from the platform after reporting "tens of thousands of videos available on YouTube that we know are crafted to serve as eye candy for perverted, creepy adults".

Last week, the firm deleted a channel with 8 million subscribers called ToyFreaks, where a man had been uploading videos showing his children dressed as babies and allegedly being force-fed.

BuzzFeed's report shone a light on this shadowy world, where material is hosted on YouTube, often in plain sight, and recirculated automatically by its recommendation algorithms.

The video hosting service said in a statement Wednesday that it had terminated more than 50 channels and removed "thousands of videos" under a set of newly updated guidelines.

It also said that it had removed ads from 3 million exploitative videos since June 2017.

"In the last couple of weeks we expanded our enforcement guidelines around removing content featuring minors that may be endangering a child, even if that was not the uploader's intent," said Johanna Wright, YouTube's vice president of product management.

"We will continue to work quickly to remove more every day.

"We also implemented policies to age-restrict (only available to people over 18 and logged in) content with family [...] characters but containing mature themes or adult humour.

"To help surface potentially violative content, we are applying machine learning technology and automated tools to quickly find and escalate for human review."

After days of screening these awful videos this is what my personalized 'recommended' feed looks like this very minute (creepy little girl webcam vids): pic.twitter.com/s9fJ2wlwNA — Charlie Warzel (@cwarzel) November 22, 2017

The blog post stressed that the platform will get better at removing ads from inappropriate videos targeting families, blocking inappropriate comments on videos featuring minors, and providing guidance for creators who make family-friendly content.

Wright added: "We're wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right.

"As a parent and as a leader in this organisation, I'm determined that we do."