“Deepfake” porn perpetrators are being given a free hand to “shatter lives” because the law is unfit for purpose, a study has found.

Fake porn – where videos and images are digitally altered to make them sexual or pornographic – is a “growing and harmful” problem, according to law professors from Durham and Kent universities.

In a new report on image-based sexual abuse, researchers argue that the phenomenon has taken a more sinister turn due to modern technology.

The use of artificial intelligence and “deepfake” – a technique for human image synthesis that uses machine learning – makes altering videos “much more straightforward and sophisticated”, the report says.

“To the untrained eye, it is very difficult to tell the difference between the fake and real images, and so the harm and harassment felt by victim-survivors is just as significant,” it explains.

The report comes after a Government announcement that the Law Commission has been asked to consider whether existing laws are sufficient.

Clare McGlynn, professor of law at Durham University and one of the report’s authors, said: “Due to the serious legal and policy failings identified in this report, we are effectively gambling with people’s lives.

“We found that image-based sexual abuse can shatter lives, often experienced as an entire ‘social rupture’ of their world.”