I let YouTube tell me what to watch. The results were out of this world.

Raphael · Apr 12, 2018

For the past four months, the world’s most popular video sharing site has been trying to get me to believe in aliens.

At least 200,000 of them are secretly walking among us, “getting married and producing alien-human hybrids,” says one rambling video recommended to me by YouTube in February. The aliens are running out of patience, the video warns, and unless Earth’s “super elites” change their corrupt ways, “a very powerful cosmic entity” is preparing “to deliver a massive worldwide final judgment.”

YouTube had been urging me to check out videos of this ilk for weeks. In January, it recommended a silent clip called “Alien Infestation of the Moon,” which purports to carry evidence of extraterrestrial spacecraft hiding in plain sight among the grainy black-and-white photos taken by NASA’s Apollo missions. YouTube also recommended videos posted by users with bizarre handles such as “thirdphaseofmoon” or “Disclose Screen The Grimreefar Filthy South,” where I could learn, for example, that the false alarm over a ballistic missile that spread panic across Hawaii was designed to hide evidence of alien activity, or that drones “are a conspiracy made by the government to cover up and discredit UFOs.”

One of the first push notifications delivered to my Android phone

The material being shared across YouTube is under new scrutiny amid a general reassessment of how companies such as Facebook and Google shape the way we see the world. As Facebook struggles to weather concerns over its relationship with unscrupulous advertisers who boast of manipulating users’ insecurities, YouTube stands accused of serving up incendiary content to keep audiences hooked. Writing last month in The New York Times, techno-sociologist Zeynep Tufekci accused YouTube’s algorithms of channeling her and others to progressively more extreme material, calling the site “one of the most powerful radicalizing instruments of the 21st century.”

My personal experience with YouTube has been, if anything, more disturbing than Tufekci’s. Sometime late last year, the YouTube app on my Android phone suddenly began sending me push notifications. I don’t recall enabling them, and they certainly weren’t wanted, but the alerts gave me an insight into YouTube’s recommendation algorithm so I kept them on. Every few days a new notification bar would pop up on my lock screen along with a link to a video and the word: “Recommended.”

Another early push notification

The recommendations were jarring. Among the first were clips published by an outfit called “Golden State Times,” which routinely circulates pro-Trump videos under over-the-top headlines such as “President Donald Trump gives EXPLOSIVE Speech on Tax Cuts and Reform VICTORY” or “MUST WATCH: President Donald Trump Tells CNN’s Jim Acosta to Get ‘Out.’” Next came the alien conspiracy videos, such as “NASA FILMS UFO FLEET ABOVE EARTH,” interspersed with a clip from the Asian version of “America’s Got Talent,” skits from “Saturday Night Live,” and — inexplicably — tedious fan-made videos related to “Star Trek” and “Game of Thrones.” Some of the recommendations made sense. For example, my only subscription was to the virtual reality news site VRFocus, and YouTube occasionally flagged new videos from that channel. But the majority, like a video purporting to show a perpetual motion machine that ran on magnets, were mystifying. It was almost as if the platform was throwing random content at me — no matter how dissonant or extreme — to see what would stick.

In a written statement, Google said that approximately 95 percent of YouTube’s recommendations were intended to advertise new videos from channels I already subscribed to or fresh content related to what I had recently watched.

“Our systems reflect what people subscribe to and select to watch on YouTube. We are constantly working to improve notifications and make sure they reflect viewer interest and don’t erroneously skew towards any particular content.”

Google wouldn’t say how many times the company had recommended alien conspiracy or Trump rally videos to its users, saying only that the more innocuous videos it had nudged me to watch — like the Boston Dynamics robot dog clip that went viral in February — were “more representative of the notifications most users receive.”

A social media expert I spoke with said he was unsettled by the idea that YouTube could be marketing conspiracy theories to people via their phones — even though he cautioned that a sample size of one was far too small to draw any conclusions about what was happening.

“Video has a huge impact on people’s perceptions of the world,” said Aviv Ovadya, the chief technologist at the Center for Social Media Responsibility at the University of Michigan’s School of Information.

“I’ve heard so many stories during my exploration of YouTube about parents, friends, siblings, etc. falling down the YouTube conspiracy theory rabbit hole and becoming deeply divorced from reality. This has generally been suggested as a result of YouTube’s recommendations, but these kinds of push notifications may be another crucial cause of that which needs to be investigated. We definitely need more data, analysis, and context though to learn more.”

More alien conspiracies

Ovadya, who reviewed a four-month extract of my YouTube viewing history (which I’ve made available here), said it was always possible that signals given by my search patterns — or activity elsewhere within Google’s digital empire — could be informing YouTube’s bizarre push notifications. It’s also true that journalists tend to have unusual patterns of online behavior. But I found it hard to reconcile my day-to-day viewing activity — Playmobil movies, a clip from Portishead, goofball Ukrainian covers of “Despacito” — with YouTube’s recommended mix of extraterrestrial intrigue and ideological extremism.

One YouTube suggestion, a clip entitled “Robert De Niro completely destroy Trump” [sic], appeared aimed at baiting audiences furious at America’s scandal-plagued president. I didn’t click on the link. Five days later, YouTube recommended a film from the opposite extreme of the ideological spectrum: a documentary by far-right provocateur Lauren Southern.

A push notification recommending Lauren Southern’s film

My eyes goggled when the film, whose protagonists warn of a coming “race-based civil war” in South Africa, began with an advertisement from South African Tourism.

Somewhere, someone was making money off of a film about an impending race war.

And YouTube was eager for me to watch.