Guides to mounting a car terror attack were available on Google and Twitter last night.

The vile manuals were online despite widespread warnings that UK jihadists use them for training.

Fanatics are urged to deploy large vehicles as ‘tools of war’ before going on a stabbing rampage – the template for Wednesday’s atrocity in Westminster. Boris Johnson accused social media websites of inciting terrorism.


And Google’s YouTube video platform was found to be raking in money from conspiracy theories saying the London outrage was a hoax.

As the maniac behind the attack was unmasked as 52-year-old Khalid Masood:

The security services faced questions because he was known to police and MI5;

Home Secretary Amber Rudd denied failures but admitted: ‘One got through’;

It emerged MPs had raised concerns about the Commons gates Masood waltzed through;

Officials revealed that he was shot dead by a ministerial bodyguard, rather than by armed police;

Islamic State claimed Masood was its ‘soldier’;

Officers made eight arrests around the country;

The death toll rose to five when a 75-year-old man died in hospital last night.

Masood, a bodybuilder and violent criminal who claimed to have been a teacher, raced across Westminster Bridge on Wednesday in a hire car, smashing into pedestrians at up to 70mph, killing three and injuring 28.

He then slipped through a gate into the precincts of Parliament where he hacked to death Keith Palmer, a 48-year-old constable.

Born Adrian Elms in Kent, Masood had converted to Islam and was ‘on the radar’ of MI5. Police said he had been ‘inspired’ by international terrorism.

But last night attention turned to whether the attacker, who is said to have acted as a ‘lone wolf’, could have been radicalised online.

Speaking at a security conference in the US, Foreign Secretary Mr Johnson called on internet giants to take action. He said: ‘We are going to have to engage not just militarily, but also to stop the stuff on the internet that is corrupting and polluting so many people.

‘This is something that the internet companies and social media companies need to think about.


‘They need to do more to take that stuff off their media, the incitements, the information about how to become a terrorist, the radicalising sermons and messages. That needs to come down.’

In the hours after the London attack, the Daily Mail found vile Islamic State terror manuals online through simple searches on Google and Twitter. One included a section on using vehicles as weapons.

It told jihadists in the West to learn from Palestinian terrorists who ‘have resorted to using cars as tools of war, also knives as weapons which are easily available from DIY stores’. The manual was published a year ago, before the vehicle attacks in Nice, Berlin and London, which have killed 102 people and injured more than 500. Another Islamic State publication was available through Google and Twitter with detailed instructions on how to cause mayhem.

It was written after the Bastille Day attack in Nice, when a truck was used to murder 86 people – ten of them children and teenagers – at a fireworks display. It said the Nice attack ‘superbly demonstrated’ how vehicles can be used for terror, having the effect of ‘smashing their bodies while crushing their heads, torsos and limbs under the vehicle’s wheels leaving behind a trail of carnage’.

It added: ‘Vehicles are like knives, as they are extremely easy to acquire. But unlike knives, which if found in one’s possession can be a cause for suspicion, vehicles arouse absolutely no doubts due to their widespread use throughout the world. It has been shown that smaller vehicles are incapable of granting the level of carnage that is sought. One of the main reasons for this is smaller vehicles lack the weight and wheel span required for crushing many victims.

‘The type of vehicle most appropriate for such an operation is a large load-bearing truck.’

The guide also gave instructions on where on the body to strike with a knife. The social media giants were criticised by MPs last week for failing to do enough to remove extremist content.

Last night, Google removed links to the manuals that were found by the Mail. A spokesman said: ‘We are deeply troubled by violence and acts of terrorism and our thoughts are with the victims of yesterday’s attack in London. We remove links to illegal content in search when reported to us.’


Links to the Islamic State manuals were available on Twitter, as well as pictures of pages with detailed instructions on how to kill innocent people.

Twitter removed one suspect user’s account after being contacted by the Mail. But other images were not removed because they had been posted by academics analysing the manuals rather than promoting them.

Twitter said that in the last six months of 2016 it suspended 376,890 accounts for violations related to promotion of terrorism.

A spokesman added: ‘We don’t comment on individual accounts for privacy and security reasons.’

Among those injured in Wednesday’s attack were nationals from France, Romania, South Korea, Germany, Poland, Ireland, China, Italy, the US and Greece. Twelve of the victims had to be treated in hospital for serious injuries.

UK students Travis Frain and Owen Lambert were among the injured, as were four South Korean tourists, including one with serious injuries.

YouTube makes money on sick hoax claim videos

KATHERINE RUSHTON, MEDIA AND TECHNOLOGY EDITOR FOR THE DAILY MAIL

Google is making money from vile claims that the London terror attack was a hoax, and that its victims were actors and mannequins.

Netflix, Guess, Trivago, Opodo, Asus and SunLife insurance have adverts alongside videos published by conspiracy theorists on Google’s YouTube platform.

Within hours of the attack, YouTube was hosting hundreds of videos claiming the atrocity was faked.

One user sang Allahu Akbar to the tune of London Bridge is Falling Down – a reference to Westminster Bridge where the attacker ploughed into pedestrians.

Google profits from the ads itself, but also hands a cut directly to those who post shocking videos.

Some of those posting about Wednesday’s attack claimed that those who were injured and killed were in fact actors using plastic limbs and that the emergency services were in on an elaborate plot to terrify ordinary citizens.


Ads from the online travel site Opodo appeared on posts by the user Russianvids, claiming that a woman trapped under a bus looked like a mannequin ‘they pulled out of Walmart’. One of the viewers said the ‘actors are terrible’.

Others claim that the atrocity was orchestrated by Jews trying to frame Islamists, Freemasons, or the ‘New World Order’ – a name used to describe a new, global totalitarian government which is allegedly taking over the world. They even drew links to the Prime Minister and the Queen.

Many of the wild theories also dwell on the date of the attack – 22 March, or 3/22 as it would be written in the US.

The fantasists claim that the number 322 has occult associations, and point out that it is shorthand for Skull and Bones, an American secret society.

One user, ScreamCrow Face, published a series of rants suggesting that London Mayor Sadiq Khan staged the attack to support his demands for more police on the streets. The video featured ads by Opodo, Asus, Guess, Trivago and Sleep mattresses.

Yesterday, Google had disabled ads on many but not all of the hateful videos.

The firm places the ads using automated technology rather than human judgment.

Those posting videos on the site receive up to £6.15 for every 1,000 views, and many are watched millions of times.

Many of the users who have posted hoax claim videos about the attack have made money from YouTube, though not necessarily from ads on those videos themselves, as ads run only on videos where monetisation is clearly indicated.

A YouTube spokesman said last night: ‘Videos threatening violence are against YouTube’s policies and we remove them quickly when they are flagged to us.

‘When it comes to advertising, we have strict guidelines that define where ads should appear.’