A group of YouTubers said they have worked since June to compile evidence that certain words or phrases in video titles lead to automatic demonetization by the platform’s machine-learning system.

As a result, those YouTubers also claim the platform’s bots are routinely demonetizing LGBTQ+ content.

A day after the videos documenting this evidence were posted, YouTube directly responded to them and said that “the right teams are reviewing your concerns in detail,” also promising to follow up on the claims.

YouTubers Create Monetization/Demonetization Word List

In a series of videos released Sunday, a group of YouTubers detailed 15,000 keywords that they tested against YouTube bots and claimed many of those words—including some LGBTQ+ terms—lead to automatic demonetization.

Specifically, the project looks at those keywords and determines whether each caused a video to be demonetized when used in a video’s title. The research, which was conducted from June to July, was a collaboration between creators Nerd City, YouTube Analyzed (who does not work for YouTube), and Sealow.

“Robot law enforcement on YouTube just resulted in two years of gay people being treated like it’s the 1300s,” Nerd City said in his video.

The report, published as a Google spreadsheet, classifies words into one of two categories: green, meaning monetized, and yellow, meaning demonetized. However, YouTube Analyzed said the way monetization is decided is more like a 0-1 scale.

Thus, certain words near the middle of that scale might be green one day and yellow the next. To provide context, he placed an asterisk next to words that yielded mixed results.
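That color coding can be pictured as a simple aggregation over repeated checks. The sketch below is illustrative only: the green/yellow labels and the asterisk for mixed results come from the report, but the aggregation rule is an assumption, since YouTube’s actual scoring is not public.

```python
# Illustrative sketch of the report's color coding, based on YouTube
# Analyzed's description of monetization as a 0-1 scale rather than a
# binary flag. The aggregation rule is an assumption for illustration.

def label_keyword(runs: list[bool]) -> str:
    """Collapse repeated monetization checks for one keyword into the
    spreadsheet's labels: 'green' (always monetized), 'yellow' (always
    demonetized), or an asterisked label when runs disagree."""
    if all(runs):
        return "green"
    if not any(runs):
        return "yellow"
    # Words near the middle of the scale flip between runs, so the
    # report marks them with an asterisk.
    return "green*" if sum(runs) > len(runs) / 2 else "yellow*"
```

Under this reading, a word like “lesbian” that was “sometimes green” would come out of `label_keyword` with an asterisk rather than a stable color.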

To create the list, they uploaded two-second clips they said contained no demonetizable audio or video. Then, they experimented with keywords, replacing demonetized words with “happy” or “friend” to see if that would monetize the video.
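The substitution method described above amounts to an A/B test on titles. The following sketch shows the logic under stated assumptions: `is_monetized` is a hypothetical callable standing in for the real upload-and-check step against YouTube (no such public API exists), and the neutral substitutes are the ones named in the article.

```python
# Illustrative sketch of the creators' A/B approach: check a title
# containing a suspect keyword, then re-check with the keyword swapped
# for a neutral word ("happy" or "friend") to isolate the keyword's effect.
# `is_monetized` is a hypothetical stand-in for the upload-and-check step.

NEUTRAL_SUBSTITUTES = ["happy", "friend"]  # substitutes named in the article

def title_variants(title: str, keyword: str) -> list[str]:
    """Return the original title plus one control title per neutral word,
    with the suspect keyword replaced."""
    return [title] + [title.replace(keyword, neutral)
                      for neutral in NEUTRAL_SUBSTITUTES]

def classify_keyword(title: str, keyword: str, is_monetized) -> str:
    """Label a keyword 'yellow' (demonetized) if the original title is
    demonetized while every neutral control is monetized; else 'green'."""
    original, *controls = title_variants(title, keyword)
    if not is_monetized(original) and all(is_monetized(c) for c in controls):
        return "yellow"
    return "green"

# Simulated run: a fake checker that flags any title containing "lesbian",
# mirroring the "Lesbian princess" example from the report.
fake_checker = lambda title: "lesbian" not in title.lower()
print(classify_keyword("Lesbian princess", "Lesbian", fake_checker))  # yellow
```

The point of the neutral controls is to rule out the clip itself as the cause: only if the video monetizes fine under “happy” but not under the suspect word is the word blamed.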

Their experiments yielded a grab bag of results. For example, “antivaxx” sometimes resulted in demonetization, but “antivax” and “anti-vaxxer” never did.

Additionally, “North Carolina” was demonetizable but not “North Korea.” YouTube Analyzed explained that if a word carries too much negative association, the bot might be prone to flagging it. He argued “North Carolina” might have been flagged because news surrounding transgender bathroom laws made headlines in July as he was compiling the list.

Other words like “restaurant,” “you,” “sunglasses,” “photos,” “profit,” and even “Shrek” reportedly caused their videos to get demonetized.

While more expected terms like slurs, cuss words, and words like “Hitler” were also flagged, other controversial words like “incel” and phrases like “how to murder” weren’t demonetized. YouTube Analyzed suggested that, unlike in the “North Carolina” example, if the bots haven’t seen a word or phrase used enough, they might not catch it.

LGBTQ+ Video Demonetization

The creators also found that common LGBTQ+ terminology tended to be demonetized, and some media outlets have called this project the most conclusive evidence that YouTube is demonetizing LGBTQ+ videos.

Again, however, the system yielded highly variable results. For example, “gay” was demonetizable, but YouTube Analyzed noted the word is context-sensitive. The term “lesbian” was sometimes green but “lesbians” was always yellow. Also, “transgender” was monetizable but not always “trans.”

Additionally, the word “homophobia” was ad-friendly, but not “homosexual,” while terms like “straight” and “heterosexual” were both always green.

Some of the titles they tried included “Lesbian princess” and “Kids Explain Gay Marriage,” a reference to a Jimmy Kimmel skit posted on YouTube. Both were demonetized but later monetized when “lesbian” and “gay” were replaced with “happy.”

As to why these videos are being demonetized, Sealow posits a couple of possible reasons. The first is similar to the “North Carolina” example, where politics and negative press could influence how certain words are treated. In the case of LGBTQ+ content, bots could interpret certain terms negatively if they are moderating a high volume of homophobic or hateful content.

Sealow also worries that if videos with words like “gay” are manually demonetized by people with biases, then bots will also develop the tendency to demonetize those videos regardless of the content.

According to Nerd City, YouTube may be outsourcing review work to some 10,000 workers at a company called Lionbridge, which employs people in a number of countries that have anti-LGBTQ+ laws, including Somalia, Afghanistan, and Indonesia.

He then asks: if there’s no standardized policy in place for LGBTQ+ content, could reviewers keep a video demonetized based on their own bias?

It is unclear how many workers, if any, are from those countries or if such a bias is actually in play; however, former Lionbridge workers have reportedly complained of unclear guidelines.

Past Accusations Against LGBTQ+ Creators

Some YouTubers, like Petty Paige, have now resorted to censoring words like “trans” and “homosexual” to stay monetized, and a wide range of LGBTQ+ creators have called this trend an open secret.

In December, Mexican YouTuber Luisito Comunica asked YouTube Chief Product Officer Neal Mohan about this directly, saying three of his videos with LGBTQ+ titles were demonetized.

“I can just tell you categorically that there is no list of words or keywords or terms or anything like that that is going to go into our classifiers making an a priori decision on whether our videos are monetized or not,” Mohan said.

“There’s nothing in terms of how our monetization algorithms work that should be based on any kind of predescribed or predetermined list,” he continued.

In his video, Sealow refutes that point, saying, “Given our testing results, it’s made clear that these comments are not accurate.” He notes that while the current situation for LGBTQ+ creators may be improved from two years ago, most would still call it unacceptable.

He also said he finds Mohan’s comments troubling because as CPO, Mohan has the power to fix this problem.

Later, in August, Alfie Deyes posed a similar question to YouTube’s CEO Susan Wojcicki.

“We do not automatically demonetize LGBTQ content,” she said, later adding, “There’s no policies that say if you put certain words in the title that that will be demonetized.”

Deyes then reiterated his question, asking if any words specifically from the LGBTQ+ community are flagged, to which she responded, “There shouldn’t be.”

Nerd City then focused on the word “policy” in his video, saying Wojcicki lied by omission.

“It’s sneaky language from a very smart woman who talks to a lot of lawyers,” he said. “There’s no policy to demonetize gay words, but there is a protocol where bots are doing exactly that.”

Also in August, a group of YouTubers sued the platform and claimed, among other things, that YouTube is demonetizing their content.

In 2018, YouTube took steps to expand its reviewing process, adding those previously mentioned 10,000 workers to combat what Wojcicki called “bad actors,” or people who attempt to exploit the platform’s monetization system. Those “bad actors” are actually part of why YouTube says it hasn’t released its algorithm data.

YouTube’s Mystery Algorithm

The report represents an attempt to better warn creators about why their videos may be demonetized, but demonetization involves other factors as well. As the creators continue to learn more about the mysterious algorithm, the list changes every day.

Because of that, all of them note that the information they presented is not necessarily complete. Nerd City has argued that YouTube should publish details on how its algorithm works, saying more openness could allow creators to make more money because they would be able to see what does and does not get monetized.

He also deconstructs the “bad actors” argument, saying people would just report misleading content anyway.

Notably, the FairTube campaign is urging YouTube to at least send creators a reason why their specific videos were demonetized so that they can learn and take steps to make sure future videos are ad-friendly.

YouTube Responds

On Monday, September 30, the TeamYouTube Twitter account responded to the series of videos, saying, “Wanted to let you know that we’ve watched your video and the right teams are reviewing your concerns in detail. We want to make sure that we give you some clear answers, so we’ll follow back up when the teams have been able to take a good, hard look.”

Later, a YouTube spokesperson released a statement saying there is no list of words that deem a video not ad-friendly.

“We’re proud of the incredible LGBTQ+ voices on our platform and take concerns like these very seriously,” the spokesperson said. “We do not have a list of LGBTQ+ related words that trigger demonetization and we are constantly evaluating our systems to help ensure that they are reflecting our policies without unfair bias.”

That spokesperson also said YouTube tests samples of LGBTQ+ content against new monetization classifiers to make sure LGBTQ+ videos aren’t more likely to be demonetized.

See what others are saying: (The Verge) (INSIDER) (Mashable)