Karan Pradhan

On the evening of 12 January, Delhi’s 20-year-old YouTube ‘prankster’ Sumit Kumar Singh was detained by police after YouTube footage of him running up to ‘unsuspecting’ women and kissing them went viral and was picked up by television news channels. His cameraman and collaborator Satyajeet Kadian was also picked up by the police.

Sumit and Satyajeet were released the next day after claiming that the ‘victims’ in the videos were their friends and had agreed to participate in the ‘prank’. At the time of writing, the police are said to be verifying these claims.

All’s well that ends well, right?

Well, not quite.

When the Delhi Police turned up at his home, they discovered an award from YouTube for crossing 100,000 subscribers to his channel ‘The Crazy Sumit’. This prompted Ravindra Yadav, joint commissioner of police (crime), to note that, “YouTube pays them around Rs 700 for every 1,000 hits their video receives… The trend is to shoot obscene videos to get more hits and thereby more money… YouTube has been organising seminars to encourage such activities by the pranksters.”

So is it YouTube’s fault that these repulsive — in terms of content, certainly, but especially so in terms of the precedent they set — videos made their way onto the internet? Or is it Sumit and Satyajeet’s fault that, in their greed for money, they chose the most ‘indecent’ route to make bank? Or is it Yadav’s fault for misplacing the blame, because “This is how the world works. Get used to it!”?

There are two ways to view this gridlock of sorts:

A case of shooting the messenger?

In 2004, British 14-year-old Stefan Pakeerah was stabbed to death with a knife and claw hammer by his 17-year-old friend Warren LeBlanc. Rockstar Games’ Manhunt — a pretty average game that gives you points for how gruesomely you can snuff out your targets — was blamed for influencing the murderer and his victim, and was subsequently pulled off shelves. That the tragedy occurred as a result of a drug-related robbery was seemingly lost on those targeting the game.

In 1999, Eric Harris and Dylan Klebold — the perpetrators of the Columbine Massacre — were reportedly influenced by the music of Marilyn Manson to go out and kill their classmates.

In the 1990s, a series of murders was blamed on the 1994 film Natural Born Killers, with the perpetrators claiming the film had influenced them to go out and commit those crimes.

And in 1960, the book Lady Chatterley’s Lover was put on trial for obscenity in Britain on the grounds that it was so filthy, it would inspire an entire generation to take to immoral and indecent behaviour.

Which brings us nicely to the present day.

In a quote to News18, a YouTube representative said, “YouTube’s Community Guidelines prohibit content featuring things like harassment, hate speech, shocking or disturbing content, illegal acts, and graphic violence, and we give our users tools to flag content so that we can review and remove anything that violates our policies… We also comply with valid legal requests from authorities wherever possible, consistent with our longstanding policy.”

At first look, this sounds perfectly reasonable. The same users who populate the website with their content are empowered to flag inappropriate content, which can then be pulled down — democracy in motion, some might say. Simultaneously, by highlighting the fairly open-ended concepts of ‘harassment’, ‘hate speech’, ‘illegal acts’, ‘graphic violence’ and ‘shocking or disturbing content’, the guidelines give an offended user ample grounds on which to flag an offensive video.

YouTube, like other video-sharing websites, is merely the vessel. What it contains is decided by you, the user. Blaming YouTube is like blaming television for that offensive (in one of the ways listed above) show you watched. Or like blaming a pressure cooker for some particularly terrible-tasting daal.

But let’s stay with the TV show metaphor. At elaborate awards shows, complete with a bevy of stars, costume changes, dance sequences and whatever it is that passes for comedy these days, TV channels often shower appreciation on their most popular programmes — no matter how crass, regressive or frankly idiotic their content.

Similarly, YouTube also felicitates its most popular users, regardless of whether they put up videos of cats, guitar masterclasses or the Sumit-Satyajeet brand of pranks. That those videos became so popular isn’t YouTube’s fault. It’s yours.

The desperate need for a human touch

There are tonnes of literature and cinema around depicting dystopias in different forms. One recurring theme, however, is the idea of humans being reduced to mere statistics. And this idea may actually be closer to home than we’d care to admit.

Today, you are the ‘like’ you stick against a social media post, the ‘share’ you provide to a particularly poignant article, the ‘stars’ you assign an Uber driver, the ‘heart’ you bestow upon a tweet and the ‘upvote’ with which you give your approval to a video.

And along with your fellow statistics, you provide data to the YouTubes, Googles, Amazons and Facebooks of the world to help them decide what’s best for you.

As tech2 editor Nash David points out, “(m)odern businesses assume that data and scalability through technology is a given. But what (is an obvious miss) is not getting the cultural and social context… Being sensitive to cultural realities doesn’t hold true just to prevent people from getting offended. It also helps (in) ensuring overall satisfaction. What (is needed is) a system that tracks cultural sensitivities”.

In short, being driven purely by the metric of numbers (or hits) is not acceptable for an entity like YouTube, which, alongside the likes of Twitter and Facebook, is far more powerful than we can fathom. A single video, if viral enough, can become the mainstay of television and online news, drive police investigations, incite riots and uprisings, and could even lead to global paranoia.

The power to decide what content is dangerous and what isn’t cannot be left in the hands of the same users who create the content. It is an abdication of responsibility on the part of YouTube and its ilk to wait until legal requests are filed before filtering out content of this nature. It’s already too late by then.

This loose control over content contributes to another scary possibility. Today, it’s these ‘pranks’ carried out by people on their so-called friends. Given the growing access to mobile phones with excellent video cameras, and in a bid to attract more subscribers (considering the incentives on offer) amidst a sea of users turning increasingly desensitised (it’s an inevitable human condition), what’s to stop users turning to ‘sharking’ or chikan videos (you’ll have to Google those yourself) to up that views count?

What’s needed here is moderation at the human level by real-life moderators hired or contracted by YouTube. The job cannot be left to algorithms.

So what’s the solution?

Unlike American presidential elections, there’s usually a third way.

And in this case, it’s the acceptance of responsibility.

As much as YouTube is responsible for the content that goes up on its website, you, the user, are equally culpable for what you post, no matter what the inducement — views, subscribers or simply the quest to be seen as cool.

Digital and social media were up in arms when Adam Saleh, the Yemeni-American YouTube star, was kicked off a Delta flight in London for speaking to his mother in Arabic. The incident itself reflected very poorly on Delta, of that there can be no doubt. But a look at Saleh’s YouTube page shows plenty of videos depicting pranks in which he and his friends try to freak people out in the streets and on airplanes.

Compare this with ‘The Crazy Sumit’.

Both sets of pranksters seek to exploit what are essentially very sensitive subjects in today’s world. In a post-9/11 world, airline security and the unfortunate demonisation of Muslim flyers are real concerns (the latter very regrettably so, as it leads to uninformed behaviour and bigotry among flyers and non-flyers alike). Similarly, the safety of women — which has always been an issue in India — is for once being recognised as a major concern. Making light of these concerns hampers our collective effort to combat them.

So, here are a few pointers. Not just to YouTube or to Sumit, but to all of us. Responsibility is required when preying on people’s sensitivities. Responsibility is also required when deciding whether or not these videos should be shared on your website.

And finally, responsibility is required when you decide to like or dislike a video.