YouTube has been in the news lately following reports about cartoons with obscene material being targeted at kids. For those not in the loop, more can be seen here.

Even if you ignore these cartoons, though, YouTube (and by extension Google) has failed to protect young people from inappropriate content. To prove this, I recently opened YouTube in an incognito browser and was presented with this.

Even though it claims to be 18+, I was able to open it without any age check. Now, obviously YouTube is assuming that parents are acting responsibly and monitoring their children's internet habits, but is this ethical behaviour? Clicking on this link opened up a video that was absolutely unsuitable for children, as can be seen below:

More importantly, though, look at the recommended videos YouTube is suggesting in the right-hand column.

Why is YouTube's algorithm suggesting a mixture of sexualised material and “children's” cartoons?

This only gets worse when you look at what YouTube's search box is capable of. A recent image on Reddit revealed a range of disturbing “suggested” content, including “nakid boy kids 12”. Now, these suggestions might be based on the naive curiosity of other kids, or on something more sinister. The lack of information and response from YouTube, however, does not inspire much confidence.

When I tried a similar search, I was presented with equally problematic suggestions.

Based on the spelling, I wonder whether children have been searching for this material, and if so, what is YouTube showing them in return? Furthermore, why is “how to do sex when you are 11” even a search term?

A cursory search revealed more disturbing material. Notice that when searching for “nacid baby”, YouTube suggests also looking for “nacid baby boys” or girls.

But this problematic material isn’t confined to YouTube. In fact, the issue of inappropriate material has bled into Google Search itself. When I was trying to teach myself how to code, I typed a seemingly innocent word into Google Images, only to be presented with child pornography.

This screenshot shows numerous pictures of children in various states of undress, drawn from what appear to be two or three Eastern European websites. I’ve blocked out the search term and the offending pictures, but the fact that this material is so blatantly accessible is horrifying. As if the problem couldn’t get any worse, clicking on an offending image actually leads to Google suggesting more related content under the “view more” link. Clicking on “view more” brings up an even greater range of horrible images of children, from young babies to teenagers, in various states of dress and presented in sexualised and pornographic ways.

So as it stands:

YouTube is failing to protect children from exposure to inappropriate content

YouTube is actively promoting inappropriate material to children

Google is making child pornography accessible and promoting access to wider sources of this material via its “view more” link.

While I appreciate that many people will point out the need for parents to actively monitor children's internet access, parents' inaction is not sufficient to excuse Google's actions. Interestingly, child pornography laws in Australia state:

Telecommunications based child exploitation offences cover the range of activities a person can engage in when using the internet, email, mobile phones and other applications to deal with child pornography and child abuse material. These activities include viewing, copying, downloading, sending, exchanging material and making available for viewing, copying or downloading.

Which raises the question: is Google in breach of the law and complicit in making child pornography available?