“What are we in for next?” asks the narrator on the YouTube video.

“Will the temperature resume an upward trend? Will it remain flat for a lengthy period? Or will it begin to drop? No one knows, not even the biggest, fastest computers.”

The video — with the clickbait title “What They Haven’t Told You about Climate Change” — has been watched more than 2.5 million times on the Google-owned video platform.

Produced by the conservative group PragerU, the video sees Canadian lobbyist and fossil fuels advocate Patrick Moore run through a long-debunked argument that because the world’s climate has changed before, there’s no problem with burning record amounts of fossil fuels.

Moore claims, for example, there has been “no significant warming trend” in the 21st century — not mentioning that nine of the 10 warmest years on record have occurred since 2005, or that the world’s oceans have been heating rapidly.

Despite the clear errors, the video has gathered more views than any other climate science denial clip on YouTube. All up, PragerU claims the video has been watched 4.4 million times across all platforms.

A search on YouTube for the most viewed “climate change” videos has Moore’s effort ranked 13th — searching for “global warming” has it ranked 19th.

But the problems really start when YouTube’s “up next” algorithm takes a guess at what you might want to watch after seeing Moore’s video.



PragerU video on climate models.

Recommending Denial

When I viewed YouTube without signing in, almost all the videos suggested by the algorithm would sit firmly in the climate science denial folder. There’s so much of this material on YouTube that it’s not hard to find once the algorithm opens the door.

There’s a Nobel Laureate who apparently “Smashes the Global Warming Hoax” — just don’t mention the 76 other laureates asking for “rapid progress towards lowering current and future greenhouse gas emissions.”

Then there are two other videos, both titled “The Truth About Global Warming,” and both delivering the opposite of what their titles claim.

Before you know it, you’re in a world of “climate cults,” “global warming hysteria,” and claims of failed predictions and Al Gore getting “slammed.”

For an unsuspecting viewer, watching just one video can lead you quickly into an alternate universe where facts, physics, and real-world experiences are replaced by conspiracies, cherry-picking, and fossil fuel–backed propaganda.

All of this exists after YouTube declared in January 2019 that it had been working on its recommendations algorithm and making “hundreds of changes to improve the quality of recommendations for users on YouTube.”

Google White Paper

But could YouTube and its parent company Google finally be getting to grips with misinformation to marginalize, rather than ban, counter-factual content?

YouTube says it’s making changes with an eye on content that “comes close to — but doesn’t quite cross the line of — violating our Community Guidelines.”

“To that end,” the official blog post explained, “we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

At a conference in Munich on February 16, 2019, YouTube's owner Google released a “white paper” in which it defended its record in tackling disinformation on the video site, listing a series of steps it was taking to weed out disinformation. “We aim to provide content that lets users dive into topics they care about, broaden their perspective, and connect them to the current zeitgeist,” the paper said.

When it comes to factual claims, however, Google said: “But as we describe in our Search section, in verticals where veracity and credibility are key, including news, politics, medical, and scientific domains, we work hard to ensure our search and recommendation systems provide content from more authoritative sources.”

This included “information panels that contain additional contextual information and links to authoritative third-party sites” on topical content “that tends to be accompanied by disinformation online.”



Some YouTube users in the U.S. may see a link to the Wikipedia entry on “global warming” when searching for related videos.

Given that climate denial videos take positions contradicted by every major scientific academy in the world, many scientists would surely hope that YouTube takes “blatantly false claims” about climate change as seriously as it does flat earthers or 9/11 truthers.

Using machine learning and a team of human reviewers, YouTube has said it is rolling out the changes only in the U.S., where they would affect just a “small set of videos.”

In 2018, YouTube started adding pop-up links to Wikipedia, with brief factual descriptions, to some climate change videos. As reported by BuzzFeed, this angered some producers, including PragerU.

Craig Strazzeri, PragerU’s chief marketing officer, told BuzzFeed: “Despite claiming to be a public forum and a platform open to all, YouTube is clearly a left-wing organization.”

“This is just another mistake in a long line of giant missteps that erodes America’s trust in Big Tech, much like what has already happened with the mainstream news media.”

Clearly, for PragerU, it is more important to politicize YouTube’s mild attempts to correct misinformation with a tiny pop-up message than to get its facts straight.

Echo Chambers

“YouTube and other social media platforms have exacerbated the misinformation problem in a number of ways — whether it's creating echo chambers for science denial, making it easy for misinformers to micro-target audiences, or funneling its users to extremist content,” says Dr. John Cook of George Mason University’s Center for Climate Change Communication.

“In the case of YouTube, their algorithms result in extremist content like climate denial receiving millions of views. However, YouTube's response has been entirely inadequate. Adding a generic link to Wikipedia under denialist videos is like slapping a tiny bandaid on a large, open wound.”

While at the University of Queensland in Australia, Cook led a study showing that 97 percent of climate scientists agreed that global warming was caused by human activity. Cook also led the production of a free Massive Open Online Course, through the university, to explain the science of climate denial, producing many debunking videos that also appear on YouTube.

“It's a challenging problem,” says Cook. “Interventions like adding a ‘fake news’ warning on online misinformation can actually backfire and promote the myth.”

“Nevertheless, there is a great deal of research into how to inoculate the public against misinformation without triggering adverse effects.”

He says platforms like YouTube should be working with misinformation researchers to develop strategies that “reduce the negative impact on society” of climate denial videos.

For now though, YouTube has a serious climate science denial problem.

Main image: Screen shots of climate science denial videos on YouTube.