
Theresa May has been compared to a Chinese dictator by her own counterterrorism watchdog over her crackdown on online extremism.

The Prime Minister plans to fine technology firms including Google and Facebook if they ‘don’t do enough’ to prevent extremist content being shared on their platforms.

Max Hill QC was appointed by the government as independent reviewer of terrorism legislation in February.

But he hit out at May's policy of imposing financial penalties on tech companies if they don't do enough to combat online hate.

On Wednesday, he told a conference on Terrorism and Social Media in Swansea: "I struggle to see how it would help if our parliament were to criminalise tech company bosses who ‘don’t do enough’.

"How do we measure ‘enough’? What is the appropriate sanction?

"We do not live in China, where the internet simply goes dark for millions when government so decides.

"Our democratic society cannot be treated that way."


It comes after Mrs May and Home Secretary Amber Rudd were mocked for demanding tech firms like WhatsApp and Facebook give security services a ‘back door’ into the ‘end-to-end’ encryption used by their messaging apps.

End-to-end encryption uses an algorithm to scramble and unscramble messages on the devices themselves, rather than the encryption being handled by the tech firm's servers.

The upshot of this is that Apple, WhatsApp, Telegram and other services that use the technology could not decrypt messages sent through their services if they wanted to.

Moreover, the technology is used by banks, credit card companies and healthcare providers to protect users from hackers. Demanding a ‘back door’ to such encryption would necessarily make it more vulnerable to attack from hackers.
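The point about providers being unable to decrypt can be illustrated with a deliberately simplified sketch. This is a toy construction, not a real cipher and not how WhatsApp or Apple actually implement encryption (they use vetted schemes such as the Signal protocol): the key exists only on the two phones, and the server relays ciphertext it cannot read.

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key (toy construction,
    # NOT secure -- real messaging apps use vetted ciphers like AES-GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the keystream on the device itself.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

# The key is generated on, and never leaves, the two phones.
device_key = os.urandom(32)
ciphertext = encrypt(device_key, b"meet at noon")

# The relay server sees only ciphertext; without the key it cannot
# recover the message -- which is why a 'back door' requires weakening
# the scheme itself.
assert decrypt(device_key, ciphertext) == b"meet at noon"
```

Any "back door" added for security services would have to break exactly this property, which is why critics argue it would also expose banks and healthcare systems that rely on the same technique.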

Google last week ramped up its efforts to tackle online terrorism with the introduction of four new steps to address the problem.

The internet giant acknowledged that the threat poses a serious challenge and more immediate action needs to be taken.

Google pledged four additional steps in the fight against online terrorism: better detection and faster review of extremist content; more experts; tougher standards; and early intervention and expanded counter-extremism work.

"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all," said Kent Walker, senior vice president and general counsel at Google, in a blog post.


"Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online.

"There should be no place for terrorist content on our services.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now."

Google's engineers have developed technology to prevent re-uploads of known terrorist content using image-matching techniques.
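The idea behind blocking re-uploads can be sketched in a few lines. Google's production systems use perceptual image and video hashes that survive re-encoding and cropping; the version below is a minimal, hypothetical illustration using exact cryptographic hashes against a shared blocklist, which captures the workflow but not the robustness of the real matching.

```python
import hashlib

# Hypothetical blocklist of fingerprints of content already removed.
# Real systems use perceptual hashes robust to re-encoding; an exact
# SHA-256 match is the simplest possible sketch of the idea.
known_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    # Fingerprint the raw bytes of an uploaded file.
    return hashlib.sha256(data).hexdigest()

def register_removed_content(data: bytes) -> None:
    # When a video is taken down, remember its fingerprint.
    known_hashes.add(fingerprint(data))

def is_reupload(data: bytes) -> bool:
    # Check new uploads against the blocklist before they go live.
    return fingerprint(data) in known_hashes

register_removed_content(b"<bytes of a removed video>")
assert is_reupload(b"<bytes of a removed video>")       # exact re-upload caught
assert not is_reupload(b"<bytes of some other video>")  # novel content passes
```

Industry databases of shared hashes work on this principle: once one platform flags a file, the others can block identical copies automatically.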

Better detection of extremist content

Google will devote more engineering resources to apply its most advanced machine learning research to train new content classifiers to help identify and remove extremist and terrorism-related content more quickly.
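To give a flavour of what a "content classifier" is, here is a toy Naive Bayes text classifier built from scratch. It is a hedged sketch of the general technique, not Google's actual system, which operates on video, audio and far richer signals; in practice such classifiers flag content for human review rather than removing it outright.

```python
from collections import Counter
import math

# Toy Naive Bayes text classifier -- a minimal sketch of the kind of
# "content classifier" described, not Google's actual system.
class NaiveBayes:
    def __init__(self):
        self.word_counts = {"flag": Counter(), "ok": Counter()}
        self.doc_counts = {"flag": 0, "ok": 0}

    def train(self, text: str, label: str) -> None:
        # Count words per label from human-reviewed examples.
        self.doc_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def predict(self, text: str) -> str:
        # Score each label by log prior + smoothed log likelihoods.
        vocab = len(set(self.word_counts["flag"]) | set(self.word_counts["ok"]))
        scores = {}
        for label in ("flag", "ok"):
            total = sum(self.word_counts[label].values())
            score = math.log(self.doc_counts[label] / sum(self.doc_counts.values()))
            for word in text.lower().split():
                count = self.word_counts[label][word]
                score += math.log((count + 1) / (total + vocab))  # Laplace smoothing
            scores[label] = score
        return max(scores, key=scores.get)

# Train on a handful of made-up, human-labelled examples.
clf = NaiveBayes()
clf.train("join our violent extremist cause", "flag")
clf.train("violent propaganda recruitment footage", "flag")
clf.train("cute cat compilation video", "ok")
clf.train("cooking tutorial pasta recipe", "ok")

assert clf.predict("violent extremist recruitment") == "flag"
assert clf.predict("cat cooking video") == "ok"
```

The real engineering effort lies in the scale of labelled training data and the review pipeline around the model, not in the classifier itself.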

More experts

Google will increase the number of independent experts in YouTube's Trusted Flagger programme.

It will expand this programme by adding 50 expert NGOs that it will support with operational grants.

It will also expand its work with counter-extremist groups to help identify content that may be used to radicalise and recruit extremists.

Tougher standards

The company will take a tougher stance on videos that do not clearly violate its policies.

In the future, videos that contain inflammatory religious or supremacist content will appear behind an interstitial warning, and they will not be monetised, recommended or eligible for comments or user endorsements.


Early intervention and expanding counter-extremism work

Google-owned YouTube will expand its role in counter-radicalisation efforts. Its approach targets online advertising to reach potential Islamic State recruits, and redirects them towards anti-terrorist videos that can change their minds about joining.

Mr Walker said: "Collectively, these changes will make a difference. And we'll keep working on the problem until we get the balance right.

"Extremists and terrorists seek to attack and erode not just our security, but also our values, the very things that make our societies open and free. We must not let them.

"Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge. We are committed to playing our part."

Google is also working with Facebook, Microsoft and Twitter to establish an international forum to share and develop technology, support smaller companies and accelerate their joint efforts to tackle terrorism online.


Labour MP Yvette Cooper welcomed the pledges.

She said: "This is a very welcome step forward from Google after the Home Affairs Select Committee called on them to take more responsibility for searching for illegal content.

"The Select Committee recommended that they should be more proactive in searching for - and taking down - illegal and extremist content, and to invest more of their profits in moderation.

"News that Google will now proactively scan content and fund the trusted flaggers who were helping to moderate their own site is therefore important and welcome though there is still more to do."