Firefox maker Mozilla is trying to shame YouTube into “fixing” its recommendation algorithm, soliciting horror stories from users sent down radicalizing “rabbit holes.” Trouble is, most users don’t want more censorship.

“Once, at 2 a.m., you searched YouTube for ‘Did aliens build Stonehenge?’ Ever since, your YouTube recommendations have been a mess: Roswell, wormholes, Illuminati,” Mozilla laments in its call for submissions, asking users for their “YouTube regret” so that they might “put pressure on YouTube to do better.”

“YouTube’s recommendation engine can lead users down bizarre rabbit holes — and they’re not always harmless,” the company warns.

Sometimes they drive people toward misinformation and extreme viewpoints.

Putting aside the inanity of blaming YouTube for its users’ regrettable viewing choices – no one forces a user to click on the platform’s “recommended” videos – Mozilla seems confident there is an army of YouTube users out there itching for stricter censorship on the platform. The media establishment, after all, has been screaming for months that YouTube is radicalizing people, and no one wants to be radicalized.

Except there’s little indication that this silent majority of YouTubers offended by recommendations for “misinformation and extreme viewpoints” exists, outside of what has been insinuated by the mainstream media.

PewDiePie, the YouTube star who recently hit 100 million subscribers, actually had to withdraw a planned $50,000 contribution to the Anti-Defamation League – an “anti-hate” organization that has demanded more stringent censorship across social media, including presenting platforms with lists of accounts it would like to see deplatformed – after his viewers rebelled.

Certainly, PewDiePie’s viewers do not represent all of YouTube. But his channel is not political, and these are not fringe elements denouncing the ADL. These are ordinary YouTube users tired of having content creators they love booted from the platform for wrongthink, often in a “trial-by-algorithm” that fails to take context into account when demonetizing and banning users.

The algorithm does occasionally punish users for watching the “wrong” video, as when one watches a live-streamed presidential debate and is subsequently deluged with mainstream TV network content. Meanwhile, independent content creators have been complaining about YouTube cutting off their recommendation traffic, especially in the months following June’s “Vox Adpocalypse,” which saw “borderline” content creators punished even though they had behaved themselves and not actually broken any rules.

YouTube CEO Susan Wojcicki even boasted about strangling these creators’ reach last month, announcing that the latest round of censorship piloted in the US would be rolled out in the UK and other English-speaking countries. She also touted YouTube’s “commitment to openness” and free speech on the very same day a number of creators were being deplatformed.

Searching controversial terms is much more likely to bring up mainstream sources anyway, thanks to YouTube breaking its own algorithm in order to keep users’ opinions in line, as a whistleblowing former employee revealed earlier this year, releasing nearly 1,000 pages of internal documents detailing an Orwellian censorship regime.

Mozilla’s call for complaints is a solution in search of a problem. The company is scheduled to meet with YouTube in two weeks, according to its post. If it really wants to make its users happy, it will tell YouTube to stop censoring them.

https://archive.is/N8zNl