German politicians have voted in favour of fines of up to €50m for social media companies that regularly fail to swiftly remove illegal content from their platforms.

The new law comes into force in October and compels firms such as Twitter, YouTube and Facebook to take down obviously criminal material within 24 hours and to assess content that is not clearly unlawful within seven days.

But it has already come under fire from human rights campaigners, industry and the UK’s terror watchdog, at a time when Theresa May is drafting similar measures for the UK.

Critics claim the German penalties fail to strike the right balance between protecting freedom of expression and ensuring firms take their legal responsibilities seriously.

Ed Johnson-Williams, a campaigner for the Open Rights Group, told New Statesman Tech that the deadlines and fines could pressure companies into removing content “when its legality is unclear”.

“Meanwhile there are no incentives to ensure the companies properly analyse the context of content and leave legal content online,” added Johnson-Williams.

Alexander Rabe, a board member of Eco, the association representing the German internet industry, is also concerned about the deadlines. “It takes time to define if a complaint’s content is really illegal or not,” Rabe told the BBC.

The necessity of financial penalties was called into question by the UK’s terror laws watchdog Max Hill earlier this month after May announced that the UK and France would follow Germany’s lead by introducing fines for social media firms.

“In Germany, there was a proposal for very heavy fines to be levied against tech companies whenever they fail to take down extreme content. Is that absolutely necessary? I’m not sure that it is,” Hill, formerly one of the country’s leading prosecutors of terrorists, told the Today programme.

“I’ve sat with the relevant police unit when they identify extreme content. I’ve seen them communicating with tech companies and I’ve seen the cooperation that flows from that. It’s a question of the bulk of the material rather than a lack of cooperation in dealing with it.”

Following May’s announcement, Facebook, Google and Twitter issued statements defending their record on tackling terrorist content.

Google, which owns YouTube, said it already spends hundreds of millions of pounds on fighting abuse. Twitter, meanwhile, revealed that it had removed more than 376,000 accounts for violations related to promoting terror.

In a statement responding to today’s vote, Facebook, which has recently hired a further 3,000 staff to review flagged content, said it shared the goal of the German government to fight hate speech, but added: “We believe the best solutions will be found when government, civil society and industry work together and that this law as it stands now will not improve efforts to tackle this important societal problem.”