We look at the invisible workers scrubbing the net, deciding what you do and do not see across the web.

Many social media users assume that content moderation is automated, that when an inappropriate image or video is uploaded to the net a computer removes it. But in reality, there are reportedly more than 150,000 content moderators working today.

The job involves sifting through images, videos and text, and assessing whether or not the content contravenes the platform's policies. And the work can take its toll.

In December last year, two Microsoft employees sued the company, saying that years of content moderating left them with post-traumatic stress disorder (PTSD).

It can be unpleasant work, but it is necessary, and many social media companies based in the West outsource it to places such as the Philippines or India.


But the question is: do they do that responsibly? Or do they just take advantage of the cheap labour with little consideration for the labourer?

"When most people think about the production of the social media that they are involved with and consume themselves, I think the last thing they think about is what they don't see," explains Sarah Roberts, assistant professor at the University of California.

"When I do talk to people about this topic, usually two things happen. The first thing is that they say, 'Oh, I never thought about that.' And then the second thing they say is, 'Don't computers do that?'"

Automated content moderation does exist but it's in its early stages of development. The technology is still incapable of making complex judgments on an image or a video. Humans are needed for the task of sanitising your news feed and shielding you from the horrors posted online every day.

"Obviously, huge tech companies are very, very secretive about this. They don't want you to know that there are other people who possibly could be looking at your photos, that there are people involved in the viewing of, you know, nasty material," says filmmaker Ciaran Cassidy, whose film on the subject, The Moderators, debuted last month.

Aside from the editorial part, there's a security aspect to it, according to Suman Howlader, CEO of the India-based IT company Foiwe Info Global Solutions: "So as a content moderator you would have access to different kinds of sensitive data or users' personal data which should not be exposed to the outside world."

Content moderators have to sign non-disclosure agreements that are "broad and they're written by good lawyers in large firms", says attorney Ben Wells, who is representing one of the two men suing Microsoft.

Henry Soto and Greg Blauert, the plaintiffs in the case against Microsoft, say that they were not warned about the effect that moderating online content could have on their welfare. Their lawsuit argues that the job required them "to witness horrible brutality, murder, indescribable sexual assaults, videos of humans dying", material "designed to entertain the most twisted and sick-minded people in the world".

The company disputes the lawsuit, saying that the work is difficult but critically important to a safer and more trusted internet, and that the health and safety of the employees who do this work is its top priority.

The financial incentives behind outsourcing lead all kinds of businesses, not just tech companies, to take advantage of cheap labour forces overseas. But it is also the nature of the work, and the fact that content moderation is shrouded in secrecy, that fuels suspicions that this is a case of rich companies farming out their psychological trauma to the developing world.

"These internet service providers and tech companies make billions of dollars. But like a lot of products, there's a toxic by-product and that toxic by-product needs to be managed and the employees that work with that toxic by-product need to be protected," concludes Wells.

Contributors:

Sarah Roberts, assistant professor, University of California

Ben Wells, attorney

Ciaran Cassidy, director, The Moderators

Suman Howlader, CEO, Foiwe Info Global Solutions

Source: Al Jazeera