YouTube cracks down on ‘hateful’ content

YouTube has issued new guidelines designed to protect children and stop users making money from inappropriate videos featuring well-known TV characters.

The Google-owned platform has attracted criticism for a perceived lack of action on channels that feature children’s characters in videos that could cause kids distress.

In the future, advertising will no longer be carried on videos that YouTube says feature “inappropriate use of family entertainment characters.”

The company said it is taking a “tougher stance” on content that depicts family entertainment characters engaged in violent, sexual, vile or otherwise inappropriate behaviour, “even if done for comedic or satirical purposes.”

It is also clamping down on what it calls “hateful” content that promotes discrimination against an individual or group of people on the basis of factors such as race, ethnicity, nationality, religion, disability, sexual orientation and gender identity.

Other content targeted in the ad ban includes videos that are “gratuitously incendiary, inflammatory or demeaning. For example, video content that uses gratuitously disrespectful language that shames or insults an individual or group.”

Ariel Bardin, VP of product management at YouTube, issued the new guidelines to video creators in a blog post.

“We recognise there is still more work to do. We know we have to improve our communications to you, our creators. We also need to meet our commitment to our advertisers by ensuring their ads only appear against the content they think is suitable for their brands,” said Bardin.