Algorithmic Videos Are Making YouTube Unsuitable For Young Children, And Google's 'Revenue Architecture' Is To Blame

from the so-how-do-we-fix-it? dept

There's an interesting article on Medium by James Bridle that's generating plenty of discussion at the moment. It has the title "Something is wrong on the internet", which is certainly true. Specifically, what the article is concerned about is the following:

Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.

I recommend reading the article so that you can decide whether it is a perspicacious analysis of what's wrong with the Internet today, or merely another of the hyperbolic "the Internet is corrupting innocent children" screeds that come along from time to time. As an alternative -- or in addition -- you might want to read this somewhat more measured piece from the New York Times, which raises many similar points:

the [YouTube Kids] app contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms. In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes.

The piece on Medium explores a particular class of YouTube Kids videos that share certain characteristics. They have bizarre, keyword-strewn titles like "Bad Baby with Tantrum and Crying for Lollipops Little Babies Learn Colors Finger Family Song 2" or "Angry Baby vs Spiderman vs Frozen Elsa BABY DROWNING w/ Maleficent Car Pink Spidergirl Superhero IRL". They have massive numbers of views: 110 million for "Bad Baby" and 75 million for "Angry Baby". In total, there seem to be thousands of them with similar, strange titles, and similar, disturbing content, which collectively are racking up billions of views.

As Bridle rightly notes, the sheer scale and downright oddness of the videos suggests that some are being generated, at least in part, by automated algorithms that churn out increasingly deranged variations on themes that are already popular on YouTube Kids. The aim is to garner as many views as possible, and to get children to watch yet more of the many similar videos. More views means more revenue from advertising: alongside the video, before it, or even in it -- some feature blatant product placement. Young children are the perfect audience for this kind of material: they are inexperienced, and therefore less likely to dismiss episodes as poor quality; they are curious, and so will probably watch closely to see what happens, no matter how absurd and vacuous the storyline; and they probably don't use ad blockers. As Bridle says in his Medium post:

right now, right here, YouTube and Google are complicit in that system [of psychological abuse]. The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale.

That may be overstating it, but it is certainly true that YouTube's "revenue architecture", based on how many views videos achieve, tends to produce a race to the bottom in terms of quality, and a shift to automated production of endless variations on popular themes -- both with the aim of maximizing the audience.

YouTube has just announced that it will try to restrict access by young children to this type of video, a move that it rather improbably claims has nothing to do with the recent articles. But given the potential harm that inappropriate material could produce when viewed by young children, there's a strong argument that Google should apply other criteria in order to de-emphasize such offerings. A possible approach would be to allow adults to rate the material their children see, using a mechanism separate from the current "like" and "dislike". Google could then use adverse parental ratings to scale back payments it makes to channels, while good ratings from adults would cause income to be boosted. Parents would need to sign up before rating material, but that's unlikely to be a significant barrier to participation for those who care about what their children watch.
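The payment-scaling idea proposed above could work roughly as follows. This is a purely hypothetical sketch: the function name, the verified-parent rating counts, the minimum-ratings threshold, and the 50%-150% scaling range are all invented for illustration, not anything Google has described.

```python
def scaled_payment(base_payment, parent_up, parent_down, min_ratings=100):
    """Scale a video's ad revenue by net parental sentiment.

    base_payment: revenue the video would earn from views alone.
    parent_up / parent_down: counts of favorable / adverse ratings
    from verified parent accounts (a separate mechanism from the
    ordinary "like"/"dislike" buttons).

    Until a video accumulates min_ratings verified ratings, it is
    paid at the base rate; after that, payment is scaled between
    50% and 150% according to the share of favorable ratings, so
    adverse parental ratings cut income and good ones boost it.
    """
    total = parent_up + parent_down
    if total < min_ratings:
        return base_payment
    favorable_share = parent_up / total   # 0.0 .. 1.0
    multiplier = 0.5 + favorable_share    # 0.5 .. 1.5
    return base_payment * multiplier

# A video rated adversely by almost all verified parents earns
# close to half the base rate:
print(scaled_payment(1000.0, parent_up=10, parent_down=990))  # 510.0
```

The minimum-ratings threshold matters: without it, a handful of early (or bogus) votes could swing a video's income wildly, which connects to the gaming concern discussed below.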

Although there is always a risk of such systems being gamed, the sheer scale of the audience involved -- millions of views for a video -- makes that much harder than for material with smaller reach, where bogus votes skew results more easily. In any case, Google would need to develop systems that can detect attempts to use large-scale bots to boost ratings. The fact that the company has become quite adept at spotting and blocking spam at scale on Gmail suggests it could create such a system if there were enough pressure from parents to do so.

If Google adopted such a reward system, Darwinian dynamics would likely lead to better-quality content for children, where "better" is defined by the broad consensus of what adults want their children to see. Another way Google could encourage such content would be to allow parents to further boost what they regard as valuable content with one-off donations or regular subscriptions. Techdirt readers can doubtless come up with other ways of providing incentives for YouTube channels to move away from the automated and often disturbing material many are increasingly filled with.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+


Filed Under: algorithms, gaming, kids, youtube

Companies: google, youtube