Two former Microsoft employees say they developed post-traumatic stress disorder after being forced to watch videos of child abuse, murder and sexual assault, according to a lawsuit filed by Henry Soto and Greg Blauert on December 30.

The two formerly worked for the company's "online safety team," where they were tasked with reporting offensive digital content. The lawsuit charges Microsoft with "negligent infliction of emotional distress," claiming that when Soto and Blauert complained about the horrific videos they had to watch, they were simply told to take more smoke breaks.

"It's horrendous. It's bad enough just to see a child get sexually molested. Then there are murders. Unspeakable things are done to these children," said Ben Wells, one of the attorneys who filed the lawsuit.

The lawsuit claims Soto was "involuntarily transferred" to the safety team in 2008, the same year legislation passed requiring tech companies to report criminal images. Soto says he was never warned about the potential impact such work could have on his psyche. At the time of his transfer, Soto claims he had "god-like" status as a monitor and that he "could literally view any customer's communications at any time."

Blauert became a full-time Microsoft employee in 2012 and was required to “review thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators.” When Blauert informed his superiors he was experiencing trauma as a result of the job, he claims they told him “limiting exposure to depictions, taking walks and smoking breaks, and redirection [of] his thoughts by playing video games would be sufficient to manage his symptoms."

The lawsuit also claims that safety team workers "were not told that the more they became invested in saving people, the less able they would become to recognize and act on their own symptoms of PTSD." Although the company did have a compassion fatigue counselor on staff, the person "lacked sufficient knowledge and training regarding vicarious trauma or PTSD and lacked the authority to take employees off content or rotate them entirely out of the department."

Microsoft released a statement disagreeing with the portrayal of the company and insisting that it understands "its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work."

This isn't the first time such workers have made headlines. In 2014 Wired ran a story by Adrian Chen about how companies like "Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us." Chen's piece detailed how much of that work is now done in the Philippines where tech companies can hire workers at a fraction of U.S. wages.

Chen interviewed a number of people, using pseudonyms to conceal their identities. "Maria" in Manila, who moderates photos and videos for a major U.S. tech company, told him, "I get really affected by bestiality with children. I have to stop. I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee."

Chen also spoke with "Denise," a psychologist who consults for content moderation firms. “It’s like PTSD,” she told Chen, “There is a memory trace in their mind…how would you feel watching pornography for eight hours a day, every day? How long can you take that?”

If the Microsoft suit is successful, it could have an impact that transcends the case itself. Attorney Ben Wells says he hopes it will inspire other workers to tell their stories and force tech companies to do more to protect their employees.