Google and YouTube spread fake news and propaganda about a Texas mass shooting suspect, one month after the video-sharing site adopted reforms meant to restrict the promotion of misinformation during breaking news events.

Search results on both platforms amplified false claims that Devin Kelley, the man accused of killing 26 people in a Sutherland Springs church on Sunday, was linked to anti-fascist and leftwing movements. Twitter also prominently promoted an article from a Russian state-funded news organization, which initially published unsubstantiated claims about Kelley’s political beliefs.

The rapid proliferation of fabricated politicized content has become a common feature of social media in the aftermath of mass shootings in the US. But the latest wave of propaganda in Texas was particularly significant given YouTube’s recent claims that it implemented changes to counter the problem and prioritize legitimate news during national tragedies.

Searches suggested Devin Kelley was linked to anti-fascist movements. Photograph: YouTube/Screengrab

After a Las Vegas gunman committed the deadliest US mass shooting in modern history last month, YouTube promoted a wide range of videos claiming the massacre was a hoax or a government-staged conspiracy featuring actors posing as victims. One day after survivors and victims’ relatives condemned the Google-owned platform, YouTube said it was making tweaks to its algorithm so that reliable content would appear more prominently.

While some of the offensive videos were taken down or removed in search results, others suggesting victims were liars or performers remained on the site. The conspiracy theories have had major consequences for victims, fueling death threats and aggressive online harassment that survivors and their relatives have continued to face weeks after the shooting.

Hours after the Texas shooting at a Baptist church on Sunday morning, viral claims spread across a number of social media platforms that Kelley was associated with “antifa”, the anti-fascist movement that is frequently the target of conspiracy theories promoted by far-right propagandists. Some fake news stories claimed the gunman carried an “antifa flag”, spoke of a “communist revolution” and was targeting conservative churches.

Officials have not commented on a motive for the 26-year-old, who was found dead in his car after the attack.

On Monday, more than 24 hours after the attack, both Google and YouTube search bars promoted “antifa” as the first autocomplete suggestion for users who typed the suspect’s name. The subsequent results included a number of propaganda videos, including one titled “DEVIN PATRICK KELLEY ANTIFA CONFIRMED *PROOF*” and another labeled “Devin Kelley is ANTIFA”. Some of those videos also appeared in top results for a simple search of his name.

Both Google and YouTube search bars suggested ‘antifa’ after users typed the suspect’s name. Photograph: Google/Screengrab

A search for “Sutherland Springs” on YouTube also produced “false flag” as an autocomplete suggestion, a term conspiracy theorists frequently apply to mass shootings, sometimes claiming that the government faked the deaths in an effort to push gun control policies. A video called “Sutherland Springs Church Shooting – Federal False Flag” had more than 15,000 views and another titled “ANTIFA Texas Church Shooter False Flag Fiasco” had more than 5,000.

A YouTube spokesperson acknowledged the problems, telling the Guardian that search results were not working as they should have. As soon as the news broke, the site had teams review searches and improve results, and while many queries produced appropriate videos, some did not, according to the spokesperson.

“There is still more work to do, but we’re making progress,” the company said in a statement, adding, “We’re continuing to invest in new features and changes to YouTube search that provide authoritative results.”

A YouTube spokesperson said the search results were not working as they should have. Photograph: YouTube/Screengrab

On Monday, a Twitter search for the suspect’s name also promoted as the top result an article from RT, the Russian news site formerly known as Russia Today. The piece included speculation about Kelley’s antifa flag, though the site later updated its article to note that the claims had been debunked and wrote on Facebook that it had made a “mistake” publishing “unverified information”.

The promotion of RT on the social media site came weeks after Twitter announced it would stop taking advertisements from all accounts owned by RT, a response to the disclosures that Russian trolls had widely infiltrated Twitter and Facebook in an apparent effort to interfere with the US presidential election.

Twitter did not immediately respond to a request for comment.

Google’s “Popular on Twitter” feature also appeared to promote a variety of false news about Kelley in the immediate aftermath of the shooting, including claims that he “identified as a radical Alt-left”, supported the former Democratic presidential candidate Bernie Sanders and was a “MUSLIM CONVERT”.

A Google spokesperson said that autocomplete predictions are “generated based on users’ search activity and interests” and “may be unexpected or unpleasant”. But the statement added: “In this case, our system did not work as intended. We’re currently working on our system for name detection to improve this process moving forward.”

Kelley was a white man with a history of domestic violence allegations. His political beliefs have not been disclosed.

Brooke Binkowski, managing editor of the fact-checking website Snopes.com, said she had increasingly seen rightwing trolls create and spread fake news designed to blame the left for violent incidents.

“They are rushing to politicize it and make it this ‘alt-left’ conspiracy when there is nothing,” she said in an interview, adding that it was frustrating to see social media sites continuing to rely on algorithmic tweaks to solve the problem: “Obviously algorithms are not going to change this.”

Contact the author: sam.levin@theguardian.com