A new lawsuit alleges Facebook fails to protect moderators who suffer from post-traumatic stress disorder after viewing violent and disturbing content people attempt to post on the social network.
The lawsuit, filed on Sept. 12 in state superior court in San Mateo County, California, says Facebook content moderators working under contract must view thousands of “videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder” every day, according to a press release.
Facebook and other internet providers have established industry standards for training, counseling and supporting content moderators, but Facebook isn’t following the workplace safety guidelines that it helped create, the lawsuit said.
The lawsuit, which is seeking class action status, was filed on behalf of Selena Scola, who worked at Facebook for nine months under a contract through staffing company Pro Unlimited. Scola was diagnosed with PTSD after experiencing symptoms such as fatigue, insomnia and social anxiety, according to the release.
Scola is the lone plaintiff at the moment, but if class action status is granted, the lawsuit could affect thousands of moderators. The suit alleges negligence and failure to maintain a safe workplace on the part of Facebook and Pro Unlimited.
“Our client is asking Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD,” said Steve Williams, one of Scola’s lawyers from the Joseph Saveri Law Firm, in the release. “Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized.”
Facebook and Pro Unlimited didn’t immediately respond to requests for comment.