Facebook has reached a $52 million settlement with former content moderators who alleged they suffered psychological trauma and symptoms of post-traumatic stress disorder from repeatedly reviewing violent images on the world’s largest social network.

Filed in 2018, the lawsuit alleges that Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. Former moderators reviewed content including murders, suicides and beheadings that were livestreamed on Facebook, according to the lawsuit.

“The harm that can be suffered from this work is real and severe. This settlement will provide meaningful relief, and I am so proud to have been part of it,” said Steve Williams, a lawyer with the Joseph Saveri Law Firm in San Francisco, which is representing the plaintiffs.

Selena Scola, who worked as a contract content moderator for Facebook from June 2017 to March 2018, was the lawsuit's first plaintiff, alleging she developed PTSD on the job. Scola was employed by PRO Unlimited, a Florida staffing firm that worked with Facebook to police content. Other former content moderators who contracted with Facebook later joined the lawsuit.

More than 10,000 current and former content moderators who worked for Facebook's partners in California, Arizona, Texas and Florida will be eligible for a share of the settlement, which still requires court approval, according to a press release from the plaintiffs' lawyers. Each moderator will receive $1,000, and some could receive more. Moderators diagnosed with certain conditions because of their work will receive money that could go toward treatment; depending on how much remains in the settlement fund, they may be eligible for awards of up to $50,000.

As part of the settlement, Facebook will require staffing firms to provide coaching sessions with licensed mental health counselors along with other mental-health support.

The preliminary settlement was filed last week in San Mateo Superior Court, according to The Verge. Last year, the news outlet reported that some content moderators made as little as $28,800 per year, and that one moderator, who worked at a Florida site operated by Cognizant, died after having a heart attack at his desk.

The settlement comes as Facebook relies more on artificial intelligence to help detect content such as coronavirus misinformation and hate speech. 

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future,” a Facebook spokesperson said in a statement. 
