Periscope’s abusive commenters will be tried by a jury of other users

Periscope has come up with a novel solution to the plague of trolls that threatens to ruin all social interaction on the web: trial by a jury of other users.

The live video service owned by Twitter has introduced “flash juries” to moderate comments. If a post is flagged as abusive – or as spam – it will be referred to other users to decide whether it has to come down.

The sentence? First a 60-second suspension, then losing the ability to comment on the broadcast entirely for a second offence.
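The flagging-and-verdict flow described above can be sketched in a few lines. This is purely an illustrative model of the reported behaviour, not Periscope's actual code: the function names, the jury size, and the simple-majority rule are all assumptions.

```python
# Hypothetical sketch of Periscope's "flash jury" moderation flow.
# Names, jury size, and the majority threshold are assumptions,
# not taken from Periscope's implementation.

def flash_jury_verdict(votes):
    """Return True if a majority of jurors mark the comment abusive."""
    abusive = sum(1 for v in votes if v == "abusive")
    return abusive > len(votes) / 2

def penalty(offence_count):
    """First offence: 60-second suspension; repeat: loss of commenting."""
    if offence_count <= 1:
        return "suspended_60s"
    return "banned_from_broadcast"

# A flagged comment is referred to a handful of other viewers:
votes = ["abusive", "abusive", "fine", "abusive", "fine"]
if flash_jury_verdict(votes):
    print(penalty(1))  # first offence
    print(penalty(2))  # second offence
```

The escalation matters: a brief timeout gives a first-time offender a way back, while the broadcast-wide ban removes the audience that persistent trolls are playing to.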



It’s a fresh approach to a problem that has caused social media companies plenty of trouble. Twitter has long suffered from a reputation for harbouring trolls, while Facebook finds itself shot at from both sides: it doesn’t censor enough for some tastes and censors too much for others.

Periscope has outsourced the decision-making to the people who know the service, its users and its tone best: the users.

Its troll problem was as acute as any. Live comments pop up on the broadcast in real time, meaning there’s nowhere to hide from inappropriate content. A couple of people committed to causing trouble could ruin a whole broadcast without much effort. The new plan is a way for real users to shut them out seamlessly.

Risk of mob rule

“Ultimately, the goal is to make it as hard as possible to get gratification from being abusive,” Periscope engineer Aaron Wasserman told Fast Company.

Democracy, of course, is dangerous: what the flash jury thinks is acceptable might not match a reasonable social standard. Mob rule has made many internet spaces difficult for women and people of colour to comfortably inhabit, and that’s always a risk when the overseers give up control.

But until social networks develop robots powerful enough to understand political, social and cultural context at the same time as sarcasm and surrealism – which could be quite a long time – it’s important for them to be seen to be doing something.

And this is something, which is better than nothing. Trolls beware: a court date awaits.