YOUTUBE takes down less than half of the dangerous hate content that gets reported to it, a major study has revealed.

Even when the video sharing giant does act to remove Islamist extremist and far right films, it takes almost two weeks to do it.

3 YouTube is owned by Google, a company that has also come under fire for not removing radical videos quickly Credit: Alamy

The Henry Jackson Society think tank ran a three-month-long experiment to test the Google-owned site by reporting alarming material it found on it every week.

At the end of the period, moderators had removed just 47 of a total of 107 Islamist extremist postings that glorified terror acts.

For the Islamist hate videos it did act on, moderators took an average of 11 and a half days to take them down.

And just 33 out of 94 far right movies that promoted racial violence spotted by the think tank’s researchers were eventually taken down - and on average, after 13 and a half days.

3 YouTube decided not to remove controversial Adolf Hitler videos Credit: Getty - Contributor

The revelation comes despite almost every jihadi terrorist attack on British soil being linked to radicalisation online.

Hate content that YouTube refused to remove during the experiment included a video of a man filmed slapping a Muslim teenager with bacon and shouting ‘ISIS scum’.

Another, entitled ‘Adolf Hitler was right’, praised Hitler alongside images of Jewish families being marched off to concentration camps.

It also allowed a film to stay up of a child singing to images glorifying Islamist terrorism, as well as promotional material posted to support the Taliban.

3 It took moderators an average of 11 and a half days to take flagged videos down Credit: Getty - Contributor

By the end of the three months, 121 extremist videos that had been reported were still fully viewable.

The extensive study was commissioned by Commons Home Affairs Select Committee chair Yvette Cooper to test YouTube’s repeated pledges that it acts immediately on hate reporting.

Former Labour Cabinet minister Ms Cooper dubbed the findings “simply unacceptable”, adding: “We know social media can play a role in the radicalisation of young people, drawing them in with twisted and warped ideology.


“YouTube have promised to do more, but they just aren’t moving fast enough. Google, which owns YouTube, is one of the richest and most innovative companies on the planet. They have the resources and capability to sort this and they need to do so fast.”

Dr Alan Mendoza, Executive Director of the Henry Jackson Society, added: “These ideologies can be freely disseminated and amplified online, and there is room for improvement by technology firms to provide spaces to expose and debate their inconsistencies”.

The internet giants have also failed to deliver on a demand by PM Theresa May at the UN in September that they remove all extremist content within two hours of it being posted or face crippling fines.