A new report from Bloomberg's Mark Bergen details with damning specificity how YouTube has allowed extremist content to run rampant on its site. According to the report, YouTube executives, including CEO Susan Wojcicki, repeatedly ignored warnings from YouTube employees about extreme and misleading videos gaining popularity on the site, reportedly out of "fear of throttling engagement."

Reportedly, "scores" of YouTube and Google employees raised concerns about incendiary content on YouTube. Some also offered solutions—one engineer suggested removing videos from recommendations that were "close to the line" of the company's takedown policy, while another employee wanted to track toxic videos in a spreadsheet to monitor how popular they became over time. YouTube did not take these employees up on their suggestions and continued to turn a blind eye to many types of extreme content.

While none of these refusals appears to have been documented as official policy, employees were reportedly discouraged from being proactive: YouTube lawyers told employees who were not assigned to moderation tasks not to research toxic content on their own.

A YouTube spokesperson disputed the claim that the company focuses, first and foremost, on engagement. However, engagement has been a huge part of YouTube's bottom line for years. Much to creators' dismay, the company consistently changes its algorithm to prioritize some videos over others on the site's home page and in its recommended lists. Engagement (the number of views a video has, how long a viewer spends watching it, and other interactions with the site as a whole) continues to shape which channels rise in popularity over others.
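To make that idea concrete, here is a minimal, purely illustrative sketch of how an engagement score might weight the signals mentioned above. The field names, weights, and formula are assumptions for illustration only; Bloomberg's report does not describe YouTube's actual ranking code.

```python
from dataclasses import dataclass

@dataclass
class VideoStats:
    views: int
    avg_watch_seconds: float  # average time a viewer spends on the video
    likes: int
    comments: int
    shares: int

def engagement_score(v: VideoStats,
                     w_views: float = 1.0,
                     w_watch: float = 2.0,
                     w_interact: float = 0.5) -> float:
    """Hypothetical engagement score: a weighted blend of views,
    watch time, and other interactions. Weights are illustrative,
    not YouTube's actual values."""
    interactions = v.likes + v.comments + v.shares
    return (w_views * v.views
            + w_watch * v.views * v.avg_watch_seconds
            + w_interact * interactions)

# A recommender optimizing only for a score like this would tend to surface
# whatever keeps people watching, regardless of what the content actually is.
video = VideoStats(views=120_000, avg_watch_seconds=310.0,
                   likes=8_500, comments=1_200, shares=600)
print(engagement_score(video))
```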

What has changed, and what hasn’t

But now, YouTube says it's more focused on "responsibility." Since 2017, YouTube has reportedly recommended videos based in part on a "responsibility metric," which is hard to quantify. The company told Bloomberg only that the metric is informed by satisfaction surveys shown to viewers after they watch a video. It's unclear what else, if anything, contributes to it.
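For illustration only, here is one way a "responsibility" signal derived from post-watch satisfaction surveys could be folded into a ranking score alongside engagement. The survey scale, blending formula, and weight are assumptions; YouTube has not disclosed how its responsibility metric is actually computed or applied.

```python
def responsibility_score(survey_ratings: list[int], scale_max: int = 5) -> float:
    """Hypothetical: average post-watch satisfaction, normalized to 0..1.
    YouTube has not said what else feeds its responsibility metric."""
    if not survey_ratings:
        return 0.5  # assumed neutral prior when no survey data exists
    return sum(survey_ratings) / (len(survey_ratings) * scale_max)

def ranking_score(engagement: float, responsibility: float,
                  responsibility_weight: float = 0.3) -> float:
    """Illustrative blend: scale raw engagement by a responsibility factor.
    The weight is an assumption, not a disclosed value."""
    return engagement * ((1 - responsibility_weight)
                         + responsibility_weight * responsibility)

# Example: a highly engaging video with poor survey satisfaction ends up
# ranked lower than engagement alone would suggest.
print(ranking_score(engagement=1_000_000.0,
                    responsibility=responsibility_score([1, 2, 2, 3])))
```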

"Our primary focus has been tackling some of the platform’s toughest content challenges," a spokeswoman said in a statement to Bloomberg. "We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies—we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority."

YouTube has made a number of policy changes in the past few years. Ever since the ad-pocalypse of 2017, the company has changed many of its rules and regulations around the types of content that can be monetized on the platform, who can get paid from YouTube, and what content is explicitly banned from the site.

All of those changes have pros and cons. While they show that YouTube has taken a stronger stance against certain types of extremist content, some creators have found the new rules confusing and have been vocally unhappy with how YouTube carries out its flagging and demonetization penalties.

YouTube also hasn't totally prevented toxic content from finding a home on its platform—just recently, YouTube had to disable comments on most videos featuring children because of lurking pedophiles on the site. It's also continuously dealing with adult content masked as children's content that has infiltrated YouTube Kids, a version of the site that supposedly only shows child-friendly content.

While Bloomberg's report offers a frightening look into YouTube's alleged negligence, the information in the story isn't surprising. Much like Facebook and other social media sites, YouTube has spent years focused on grabbing users' attention and keeping it. It's now the top site for video streaming, and while specific revenue numbers aren't public, YouTube is estimated to bring in billions each year. YouTube wouldn't be the first company to ignore or overlook toxic content if it meant bringing more people onto its platform.

It's unclear how YouTube will respond to this new report, if it responds at all. After facing intense criticism for its failure to catch and remove extreme content on its platform, Facebook recently announced a shift in strategy toward "privacy-focused communications." We have yet to see how Facebook will deliver on that promise or whether YouTube will do anything more to address similar problems on its own platform. History suggests YouTube responds to controversy only when the criticism becomes too loud to ignore, but that strategy may not work for much longer.