Online disinformation is only going to get more sophisticated, the chair of the committee investigating disinformation and fake news, Damian Collins, has warned.

In a report released on Monday, the Digital, Culture, Media and Sport (DCMS) select committee said Facebook had in effect put democracy at risk by allowing voters to be targeted with disinformation and personalised “dark adverts” from anonymous actors. It called for the company to be regulated.

“Where we can see lies being spread, particularly in election periods, we should have the ability to say to the tech companies: we want you to act against that content,” Collins told BBC Radio 4’s Today programme. “It’s not an opinion, it’s a clear lie. It’s being spread maliciously and you should stop it.”

After an 18-month investigation, the DCMS found that British election laws were not fit for purpose and were vulnerable to interference by hostile foreign actors.

Although he stopped short of saying that companies such as Facebook, Twitter and YouTube were breaking the law, Collins said the legislation was not robust enough and needed to be made clearer.

Citing evidence of agencies working from Russia, as well as an unidentifiable organisation called the Mainstream Network that urged voters to lobby their MP to support a no-deal Brexit, Collins criticised the fact that the law did not require such actors to identify themselves.

“No one knows who this organisation is, and I think in a democracy, citizens need to be informed … and the law doesn’t require that.”

He predicted false information would become more convincing, saying that “deepfake films” featuring politicians giving inflammatory speeches they never gave could circulate on social media in the near future.

The interviewer, John Humphrys, interjected: “So it looks like you’re saying, ‘sack Theresa May’, but in fact it’s somebody else with your face superimposed?”

Collins said: “In a situation like that we are going to want to be able to go to companies like Facebook and say this is clearly fake, it’s being released maliciously to try to influence people’s opinion, to spread anger and hate, and it should be taken down because it’s not true. That’s the power we believe we need.”

He added: “We should have a proper code of ethics, set in statute with an independent regulator to oversee whether the tech companies are complying or not.”

The approach taken by other European countries could serve as an example for the UK, he suggested. In France, judges can order fake news to be taken down, while in Germany the tech companies take responsibility for taking down hate speech from their platforms. However, social media companies “could invest more to deal with this and proactively identify this content for themselves”, he said.

The government has so far been reluctant to endorse the committee’s findings, with Collins previously complaining that ministers had been hesitant to support many of the conclusions contained in the preliminary report.

Elsewhere on social media, Collins said, young people were being exposed to harmful content and trapped within feedback loops: if they engaged with this material, they were served more of it. “What you are seeing is not an organic feed,” he said. “We should also question the ethics of a company that would create a tool like that.”

Full Fact, a UK fact-checking charity, said it welcomed the DCMS committee’s recommendations and that the government should commit to making these changes before the next election.

