Once Again, Rather Than Deleting Terrorist Propaganda, YouTube Deletes Evidence Of War Crimes

from the take-a-step-back-and-rethink dept

It really was just last week that we were discussing the problems with telling platforms like YouTube to remove videos concerning "violent extremism," because it's often tough to tell the difference between videos that many people think are okay and ones that those same people think are not. But in that post, we also linked back to a story from 2013 in which -- after getting pressure from then-Senator Joe Lieberman -- YouTube started removing "terrorist" videos, and in the process deleted a channel of people documenting atrocities in Syria.

It appears that history is now repeating itself, because YouTube is getting some grief after (you guessed it) its effort to keep extremist content off its platform resulted in deleting a channel that was documenting evidence of war crimes in Syria.

YouTube is facing criticism after a new artificial intelligence program monitoring "extremist" content began flagging and removing masses of videos and blocking channels that document war crimes in the Middle East. Middle East Eye, the monitoring organisation Airwars and the open-source investigations site Bellingcat are among a number of sites that have had videos removed for breaching YouTube's Community Guidelines.

This comes just days after YouTube announced it was expanding its program to remove "terror content" from its platform -- including better "accuracy." Oops.

Again, there are no easy answers here. You can certainly understand why no platform wants to host actual terrorist propaganda. And platforms should have the right to host or decline to host whatever content they want. The real issue is that we have more and more people -- including politicians -- demanding that these platforms must regulate, filter and moderate the content on their platforms to remove "bad" speech. But in the more than four years I've been asking this question since we last wrote about the shutdown of the channel documenting atrocities, no one has explained to me how these platforms can distinguish videos celebrating atrocities from those documenting atrocities. And this gets even more complicated when you realize: sometimes those are the same videos. And sometimes, when terrorists or others post evidence of what they're doing, people are better able to stop that activity.

There is plenty of "bad" content out there, but the kneejerk reaction that we need to censor it and take it down ignores how frequently that is likely to backfire -- as it clearly did in this case.


Filed Under: extremism, platforms

Companies: youtube