Social media firms face huge fines if they fail to keep children safe online under a new, legally enforceable code that could come into force as early as this autumn.

In what campaigners say will be a sea change in online safety, the tech giants will be required by law to enforce their terms and conditions to protect children from harmful content such as cyberbullying, self-harm, sexual content and abuse – and prevent under-age children from joining their sites.

Companies such as Facebook, Instagram and Snapchat will be expected to ensure that only children aged 13 and over are on their platforms and that the content is appropriate for their age.

Those that breach the “age appropriate” code face fines of up to 4 per cent of global turnover, imposed by Elizabeth Denham, the Information Commissioner, who will police the new regime.

She said: “Our code will clearly outline what is required of developers at the design stage so that children are protected in the first place. Safeguards must be built in, not bolted on.

“We will not hesitate to use our considerable powers to enforce the law.”

Just as a parent who buys their child a cuddly toy should be confident that it has no sharp edges or loose fixings, she said, so they should be confident that online games, websites and new technologies will be safe.