Editor’s note: This story contains descriptions of explicit and violent acts and images.

TAMPA — A harrowing video was broadcast live on Facebook on March 15, 2019. A man armed with a semiautomatic rifle stalked worshippers at a New Zealand mosque, streaming a first-person view of carnage. The Christchurch terrorist attack claimed 51 lives.

Clifford Jeudy watched the 17-minute video of the murders from his cubicle in Carrollwood almost as soon as it landed online. He watched it over and over as the video went viral.

That was his job as a content moderator at Cognizant Technology Solutions, a contractor hired by Facebook to police the social media platform.

Jeudy said the job of reviewing graphic, disturbing and violent content on Facebook left him with post-traumatic stress disorder. Co-worker Debrynna Garrett said the same happened to her on the job.

The two filed a class-action lawsuit against Facebook and Cognizant on Wednesday, alleging the companies made content moderators work under dangerous conditions that caused debilitating physical and psychological harm and did little to help them cope with the traumas they suffered as a result. Jeudy also has filed a discrimination charge against Cognizant with the Equal Employment Opportunity Commission.

The lawsuit says the two companies ignored the very safety standards they helped create. It also alleges that Facebook’s outsourcing relationship with Cognizant is a way for the social media giant to avoid accountability for the mental health issues that result from moderating graphic content on the platform.

Filed in Hillsborough County circuit court, the lawsuit accuses Facebook of negligence, Cognizant of deliberate concealment of known danger, and both companies of unfair or deceptive trade practices under Florida law. It seeks unspecified compensation and damages, and, by filing a class-action lawsuit, the plaintiffs hope to represent all of their current and former co-workers in Florida against the companies.

The lawsuit further argues that content moderators should be considered “first responders” and provided special protections such as workers’ compensation and health coverage for post-traumatic stress disorder under state law because they are often the first to witness emergency situations and flag them to law enforcement.

It calls on the companies to improve mental health support programs in the workplace, establish a medical monitoring fund to provide long-term mental health treatment to current and former content moderators and compensate them for lost wages and medical expenses.

The lawsuit comes as Cognizant prepares to shut down its Tampa campus and lay off more than 500 employees, after reporting by the Tampa Bay Times and The Verge revealed a grueling work environment that pushed workers to quickly churn through disturbing Facebook posts but provided few mental health resources for dealing with the aftermath.

“Workers in Tampa will basically be left with no recourse and will be suffering mental impairments for a long time,” said the workers’ attorney, Jay Lechner. “The company will leave town without having to be responsible for it.”

Facebook and Cognizant declined to comment on the allegations.

• • •

Jeudy, 47, said it took him months to figure out what was wrong.

He said he has worked as a content moderator for Cognizant since December 2017, making $16 an hour. He considered himself somewhat hardened to the job, regularly reviewing incidents of rape, murder and child abuse that poured onto his screen.

There were symptoms that something was wrong, he told the Times, but he brushed them off at first. He had an anxiety attack at work last spring, but chalked it up to nerves over layoffs. He frequently suffered migraines and threw up, but he figured it might be an ulcer coming back.

Then his primary care doctor sent him to a psychiatrist. Jeudy said she diagnosed him with post-traumatic stress disorder and anxiety disorder. “I was shocked,” he said. “I might get upset about things I saw, but PTSD wasn’t on the radar for me.”

He took a leave of absence in July 2019 and started taking anti-anxiety medication. The next month, Jeudy suffered a stroke and partial seizures, according to the lawsuit. He was later diagnosed with epilepsy. The lawsuit alleges those ailments “were directly caused by the extreme working conditions to which he was exposed.”

He kept thinking of his former co-worker, a veteran who died of a heart attack while at work in March 2018. Jeudy had been in the building at the time and heard the man’s gasps as he collapsed. No one at the facility had been trained in emergency first aid, he said, and he and his co-workers searched in vain for a defibrillator. The memory haunted him, and he often couldn’t stop thinking about death.

Jeudy said his job at Cognizant often focused on Facebook’s live-streaming service, Facebook Live, where he witnessed crises unfold in real time, like a young girl slitting her wrists, extending her arms and allowing the blood to spill live on the Internet.

Then he hurriedly filled out forms to send to Facebook’s alert team, so they could contact law enforcement. “If you make the wrong decision, if you’re too late, somebody could die,” he said. He never heard anything back about the cases he submitted.

He sometimes talked to the on-site counselors about what he saw, he said, but they were mostly unavailable during his overnight shift.

He was wary of using too much of his allotted “wellness time.” Managers kept a strict eye on numbers and pushed employees to review up to 300 pieces of content a day, he said. He feared being fired if he spent too much time away from his desk and didn’t meet his goals.

“They incentivize you not to go to counseling,” he said. “Your ‘wellness time’ counts against you because you’re not working … and if you meet a certain threshold, you will get extra money. So they pay you to not go and get the help you need.”

• • •

Jeudy said his health is in a fragile state and, with Cognizant set to close the office later this month, he isn’t sure what he’ll do for work. He can’t bear to open the hospital bills that have been arriving in his mailbox.

“My short-term memory is shot,” he said. “That makes me worried about finding a new job. How am I going to earn more money?”

Despite the health setbacks he’s experienced, he still believes content moderation is a necessary job. Someone has to shut down the next New Zealand massacre video before millions of Facebook users see it. Someone has to try to find the hidden black-market Facebook groups that trade in child pornography.

The lawsuit alleges that Facebook already knows best practices to mitigate psychological symptoms from viewing trauma-inducing images — because it helped draft workplace safety standards as a member of the Technology Coalition.

But Jeudy said he and his co-workers never saw those measures implemented at Cognizant: “I don’t think anyone really cares about our little band of content moderators.”

Times senior news researcher Caryn Baird contributed to this report.