Facebook, YouTube, and other social media firms will be fined up to €50 million (£43.9 million) by Germany if they fail to remove hate speech and other criminal content within 24 hours, BBC News reports.

The new legislation, which applies to platforms with more than 2 million users, will come into effect in October after German MPs voted in favour of the Netzwerkdurchsetzungsgesetz (NetzDG) law.

"Our experience has shown that unfortunately, social media companies do not improve their procedures without political pressure," said Justice Minister Heiko Maas, who oversaw the legislation, according to BBC News.

Failing to comply with the NetzDG could result in a fine of up to €5 million (£4.4 million) for the individual deemed responsible for the company in Germany and up to €50 million (£43.9 million) for the organisation itself.

The law comes after several high-profile incidents of fake news and terrorist content spreading on Facebook and YouTube in Germany. Syrian refugee Anas Modamani is suing Facebook after his selfie with Angela Merkel became the target of far-right conspiracy theories.


Organisations representing digital companies, consumers, and journalists accused the government of rushing to parliament a law that could damage free speech.

"It is the wrong approach to make social networks into a content police," said Volker Tripp, head of the Digital Society Association consumer group.

Germany has some of the world's toughest hate speech laws covering defamation, public incitement to commit crimes and threats of violence, backed up by prison sentences for Holocaust denial or inciting hatred against minorities.

In 2015, Germany pressed Facebook, Twitter and Google's YouTube to sign up to a code of conduct, which included a pledge to delete hate speech from their websites within 24 hours.

The NetzDG turns these pledges into legal obligations: platforms must delete or remove illegal content, report regularly on the volume of complaints filed, and make it easier for users to complain about offensive content.

Facebook has taken some early steps to satisfy Germany's regulators. In January, the company announced that it would start filtering fake news for users in Germany. It was the first overseas expansion of an initiative that launched in the US in December. The company is doing this by working with a number of fact-checking partners, including the non-profit Correctiv.

At the end of May, Facebook criticised the new law, saying: "The draft law provides an incentive to delete content that is not clearly illegal when social networks face such a disproportionate threat of fines."

The company added: "It would have the effect of transferring responsibility for complex legal decisions from public authorities to private companies. And several legal experts have assessed the draft law as being against the German constitution and non-compliant with EU law. Facebook is committed to working in partnership with governments and civil society on solutions that will make this draft law unnecessary."

Stephen Deadman, Facebook's global deputy chief privacy officer, said at a conference in Berlin in March that the social media giant's scale makes it hard to monitor and filter everything that gets published and that it had hundreds of staff working on the issue.

"When it comes to managing content, we have almost 1.9 billion people on the platform," Deadman said at the G20 Consumer Summit. "It's a pretty unique situation to be in. Managing content is one of our biggest priorities. I don't want to give any impression that it's something that doesn't matter to us: it's absolutely a top priority."

Deadman added: "The issues we're discussing today and the issues that were raised here in Germany this week are what we call 'hard problems'. They're 'hard problems' because often they involve dilemmas. Take, for example, the issue around illegal content. We want everybody to be safe. We also want an open and free internet with a variety of content. We also don't want companies to become the censors of the internet, or governments for that matter."