

An app for teens to share fun videos has become such a “hunting ground” for paedophiles that schools have issued alerts directly to parents.

TikTok – one of the world’s most downloaded apps – is exposing ­children as young as five to leering sex pests, cruel taunts and the glorifying of anorexia and self-harm.

One school head even warned: “There has been suspicion the app is being used to stalk teenage girls.”

The app is a huge hit with under-16s, and head teachers from dozens of schools – including early years primaries – have sent letters to thousands of parents across the UK.

TikTok says users must be at least 13 years old – but asks for no proof.

It has recently added the ability to live-stream, making it even more of a threat.

That exposes youngsters to suggestions from strangers in real time, with just seconds to decide on a response – a pressure that risks leading to a spiral of abuse and exploitation.

Our reporting team spent this week investigating the app, which kids on half-term would have been using.

One video showed a teenage girl dancing while ogling users urged her: “Take off your clothes.”

Another innocent video featuring a girl of 15 drew a string of crude comments from men about sex acts.

Others show teens in school uniform being punched in the head and users glorifying eating disorders.

One disturbing clip showed a teenage boy exposing his skeletal ribs. He tagged his post “#thinspo”, for “thin inspiration” – used to seek approval for body image issues.

Our findings come amid growing calls for tighter social media regulation.

The Chinese-owned app, worth up to £55billion, claims it is “raw, real, and without boundaries”.

But its dark side is concerning teachers nationwide.

One primary school in Cornwall said: “Parents of children in Year 3 to Year 6 have been horrified by what children are exposed to.”

Parents in Hounslow, London, were alerted to the hashtag #tradefortrade, a sign of wanting to trade illicit content.

As well as live-streaming and public comments, TikTok lets users send direct messages.

One North Yorkshire school advised: “If the profile is open, strangers can comment on your child’s videos.

“While this isn’t always sinister, it lets potential predators contact your child.”

In Stockport, Gtr Manchester, parents were alerted to the #takeitoff challenge, in which young girls are urged to video themselves removing their school shirts.

The school told parents: “There has been some suspicion the app is being used to stalk and court teenage girls.”

Chris Keates, of teaching union the NASUWT, said: “Today it’s TikTok, tomorrow it will be another site.

“Young people will be at risk until those who develop the sites, and governments who regulate them, make safety and welfare their overriding priority.”

TikTok – which used to run in the UK as musical.ly – is one of the top 10 apps globally, vying with Netflix and Snapchat.

In September, US police arrested adults in New Jersey allegedly grooming children on the app.

The NSPCC found one in 20 children on live-streaming sites have been asked to strip by a stranger.


The charity said: “We know a significant number of children are contacted via live-streaming apps like TikTok; abusers use them as a hunting ground. It’s alarming how little progress social networks have made over grooming.”

Charity Barnardo’s said its child sexual exploitation team had helped victims as young as eight – younger than any it had seen before.

The Department for Digital, Culture, Media and Sport said: “We expect tech companies to remove child sexual abuse content, stop online grooming, and have robust age verification.”

TikTok carries an age rating on app stores, so parents can use parental controls to block it from being downloaded.

A TikTok spokesman said: “We have a number of protective measures in place and are committed to enhancing them. We remove content and terminate accounts that violate our guidelines.

“We also have protections like restricted viewing mode, filters, in-app reporting, and our moderation team removes inappropriate content and terminates accounts that violate our Terms of Service.

“We provide more info to assist parents and users in our Safety Centre.”