YouTube regrets: Anecdotal claims of damaged users

By Flora Carmichael
BBC World Service, 15 October 2019

Image caption: Mozilla's not-for-profit advocacy arm campaigns for more responsible use of recommendation algorithms

"My 10-year-old sweet daughter innocently searched for 'tap dance videos'," one parent wrote.

"Now she is in this spiral of... videos that give her horrible unsafe body-harming and body-image-damaging advice."

This is one of hundreds of accounts outlining damage said to have been caused by YouTube's recommendations algorithm.

It's a phenomenon some refer to as "falling down the YouTube rabbit hole", in which users are directed to controversial and potentially dangerous content they might never have stumbled on otherwise.

The accounts have been gathered by Mozilla, the organisation best known for its Firefox web browser, which competes against Google's Chrome. The BBC was unable to corroborate the posts, as the foundation said they had been collected anonymously.

It's impossible to know if all the details are true. But Mozilla says it has shared a representative sample of the messages it received. And some read like horror stories.

"She is now restricting her eating and drinking," the parent continued.

"I heard her downstairs saying, 'Work to eat. Work to drink.'

"I don't know how I can undo the damage that's been done to her impressionable mind."

Image caption: The examples Mozilla collected describe a broad range of harmful content recommended by YouTube's algorithm

White supremacists

Mozilla asked the public to share their "YouTube regrets" - videos recommended to users of the video clip platform, which led them down bizarre or dangerous paths.

"The hundreds of responses we received were frightening: users routinely report being recommended racism, conspiracies, and violence after watching innocuous content," said Ashley Boyd, Mozilla's vice-president of advocacy.

"After watching a YouTube video about Vikings, one user was recommended content about white supremacy.

"Another user who watched confidence-building videos by a drag queen was then inundated by clips of homophobic rants."

Image caption: Some contributors described how family members had been taken in by videos about UFOs, the Illuminati and government conspiracies

YouTube is the second most visited website in the world. Its recommendation engine drives 70% of total viewing time on the site, by tailoring suggestions to keep viewers watching.

The BBC contacted YouTube for comment about Mozilla's report.

"While we welcome more research on this front, we have not seen the videos, screenshots or data in question and can't properly review Mozilla's claims," said Susan Cadrecha, a YouTube spokeswoman.

"Generally, we've designed our systems to help ensure that content from more authoritative sources is surfaced prominently in recommendations.

"We've also introduced over 30 changes to recommendations since the beginning of the year, resulting in a 50% drop in watch time of borderline content and harmful misinformation coming from recommendations in the US.

"This update has also begun rolling out in the UK and we expect similar results."

YouTube has begun tackling videos that contain misinformation and conspiracy theories by showing "information panels" containing trustworthy information.

Even so, claims that its recommendations have a tendency to lead users astray persist.

"We urge YouTube and all platforms to act with integrity, to listen to stories and experiences of users," said Lauren Seager-Smith, chief executive of children's protection charity Kidscape, which is not involved in Mozilla's campaign.

"[It needs] to reflect on when content may have caused harm - however inadvertently - and to prioritise system change that improves protection of children and those most at risk."

Fear and hate

Mozilla said it received more than 2,000 responses in five languages to its call.

It has published 28 of the anecdotes.

"My ex-wife, who has mental health problems, started watching conspiracy videos three years ago and believed every single one," recalled one contributor.

"YouTube just kept feeding her paranoia, fear and anxiety, one video after another."

Image caption: Contributors from the LGBT community voiced concerns about being recommended homophobic content

Members of the LGBT community also raised concerns.

"In coming out to myself and close friends as transgender, my biggest regret was turning to YouTube to hear the stories of other trans and queer people," one person wrote.

"Simply typing in the word 'transgender' brought up countless videos that were essentially describing my struggle as a mental illness and as something that shouldn't exist. YouTube reminded me why I hid in the closet for so many years."

The LGBT Foundation - a Manchester-based charity - called for YouTube and other social media companies to take more responsibility for the content promoted by their algorithms.

"Hateful content online is on the rise, and is something of increasing concern," the foundation's Emma Meehan told the BBC.

"Social media giants have a responsibility for what is shared on their platforms and the real-world impact this may have, and need to work to take a more dedicated approach to combating hate online."

Research challenges

YouTube's recommendations system poses difficulties for researchers outside the company as the business does not share its own recommendations data.

Since each user is given different suggestions, it is hard to determine why some choices are made and how many others have had the same content promoted to them.

"By sharing these stories, we hope to increase pressure on YouTube to empower independent researchers and address its recommendation problem," Mozilla's Ashley Boyd said.

"While users should be able to view and publish the content they like, YouTube's algorithm shouldn't actively be pushing harmful content into the mainstream."