
Social media firms could be banned if they fail to remove harmful content, the health secretary has warned.

Speaking on the BBC's Andrew Marr show, Matt Hancock said: "If we think they need to do things they are refusing to do, then we can and we must legislate."

But he said it would be better to work jointly with social media companies.

The minister earlier called on social media giants to "purge" material promoting self-harm and suicide in the wake of links to a teenager's suicide.

Asked if social media could be banned, Mr Hancock said: "Ultimately parliament does have that sanction, yes," but added: "It's not where I'd like to end up."

Molly Russell, 14, took her own life in 2017 after viewing disturbing content about suicide on social media.

Speaking to the BBC, her father said he believed Instagram "helped kill my daughter".

Mr Russell also criticised the online scrapbook Pinterest, telling the Sunday Times: "Pinterest has a huge amount to answer for."

Instagram responded by saying it works with expert groups who advise it on the "complex and nuanced" issues of mental health and self-harm.

Based on their advice that sharing stories and connecting with others could be helpful for recovery, the company said they "don't remove certain content".

"Instead (we) offer people looking at, or posting it, support messaging that directs them to groups that can help."

But Instagram added it is undertaking a full review of its enforcement policies and technologies.

A Pinterest spokesman said: "We have a policy against harmful content and take numerous proactive measures to try to prevent it from coming and spreading on our platform.

"But we know we can do more, which is why we've been working to update our self-harm policy and enforcement guidelines over the last few months."

Facebook, which owns Instagram, said earlier it was "deeply sorry".

The internet giant said graphic content which sensationalises self-harm and suicide "has no place on our platform".

Papyrus, a charity that works to prevent youth suicide, said it has been contacted by around 30 families in the past week who believe social media had a part to play in their children's suicides.

"We've had a spike in calls to our UK helpline since the BBC first reported this six days ago, all saying the same thing," said a spokeswoman for the charity.

Mr Hancock said he was "horrified" to learn of Molly's death and feels "desperately concerned to ensure young people are protected".


In a letter sent to Twitter, Snapchat, Pinterest, Apple, Google and Facebook (which owns Instagram), the minister "welcomed" steps already taken by firms but said "more action is urgently needed".

He wrote: "It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

"It is time for internet and social media providers to step up and purge this content once and for all."

He added that the government is developing a white paper addressing "online harms", and said it will look at content on suicide and self-harm.

Mr Hancock explained: "Lots of parents feel powerless in the face of social media. But we are not powerless. Both government and social media providers have a duty to act.

"I want to make the UK the safest place to be online for everyone - and ensure that no other family has to endure the torment that Molly's parents have had to go through."

Molly was found dead in her bedroom in November 2017 after showing "no obvious signs" of severe mental health issues.

Her family later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.

Mr Russell told the BBC: "Some of that content is shocking in that it encourages self harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter."

Solicitor Merry Varney, who represents the Russell family, said Molly's case "and the examples of how algorithms push negative material" show a need to investigate online platforms, and how they could be "contributing to suicides and self-harm".

If you’ve been affected by self-harm, or emotional distress, help and support is available via the BBC Action Line.

How could social media sites and apps be blocked?

Preventing the British public from visiting some of the most popular internet platforms would be a drastic step. But it is not outside the realms of possibility.

The most obvious approach would be to make the country's internet service providers (ISPs) block access to certain internet protocol (IP) addresses directly.

The risk with this is that IP addresses can sometimes be shared between services. In the past, an effort to block one illegal site led to users being unable to visit the Radio Times. It also doesn't address the fact that banned services could start using alternative IP addresses.
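The shared-address risk can be illustrated with a short, hypothetical Python sketch. The hostnames and addresses below are invented (the IPs come from the RFC 5737 documentation ranges), but they show how blocking one shared IP address also cuts off an unrelated site hosted at the same address:

```python
# Hypothetical sketch: made-up hostnames and documentation-range IPs,
# showing the collateral damage of blocking a shared IP address.
blocked_ips = {"203.0.113.7"}

hosted_sites = {
    "banned-site.example": "203.0.113.7",
    "unrelated-magazine.example": "203.0.113.7",  # shares the same server
    "news-site.example": "198.51.100.4",
}

# Decide each site's fate purely by its IP address, as an ISP-level
# block would.
results = {
    host: ("BLOCKED" if ip in blocked_ips else "allowed")
    for host, ip in hosted_sites.items()
}

for host, status in sorted(results.items()):
    print(f"{host}: {status}")
```

The unrelated magazine is blocked along with the target, which is exactly what happened in the Radio Times incident described above.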

So another approach would be to order ISPs to make adjustments to their domain name system (DNS) servers, which act as a sort of internet address book, translating easy-to-type web addresses into the long strings of numbers that actually identify the services involved.

The aim would be to prevent requests being routed to the correct IP addresses, effectively blocking access to the platforms.
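A minimal, hypothetical Python sketch of this DNS-level approach (all hostnames and addresses below are made up, using documentation-range IPs): the resolver's "address book" maps names to numbers, and refusing to answer for a blocklisted name means the browser never learns where the service lives.

```python
# Hypothetical sketch of DNS-level blocking. A resolver that declines
# to answer for blocklisted names makes the service unreachable by name.
BLOCKLIST = {"banned-platform.example"}

ADDRESS_BOOK = {
    "banned-platform.example": "203.0.113.7",
    "news-site.example": "198.51.100.4",
}

def resolve(hostname):
    """Return the IP for hostname, or None, mimicking an NXDOMAIN-style
    refusal for names on the blocklist."""
    if hostname in BLOCKLIST:
        return None
    return ADDRESS_BOOK.get(hostname)

print(resolve("news-site.example"))
print(resolve("banned-platform.example"))
```

A device configured to ask a different resolver, rather than its ISP's, would still get an answer, which is the loophole that third-party DNS services create.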

But this alone would not be enough, since some people route their traffic via DNS servers belonging to Google or Cloudflare - a company that optimises web traffic performance and also protects against cyber-attacks - rather than their ISP. These firms are based overseas.

It is possible for them to detect DNS requests that originated in the UK and refuse them when appropriate, but there would likely need to be considerable arm-twisting to get them to agree to set such a precedent.

But to complicate matters, users could still try to circumvent a ban by using tools to anonymise their location and identity, such as a virtual private network (VPN) or the Tor browser.

Prof Alan Woodward from the University of Surrey said: "Many of the ways around such bans are relatively simple unless you also ban all those methods as well, such as happens in countries like China.

"For a ban to be truly effective, the UK government [would have to take steps] that I can't imagine users or ISPs would be happy with."

Matters would become further convoluted if the social media firms decided to help users bypass a ban, as has been the case with some piracy sites.

The Pirate Bay, for example, regularly updates a list of proxy sites that act as middle men, sending data to and from the banned site via intermediary servers. The government would therefore need to keep on top of such proxies and ensure they were blocked too.

This story was updated on 28 January to add the analysis "How could social media sites and apps be blocked?"