SAN FRANCISCO: Facebook has agreed to pay $52 million to third-party content moderators who developed post-traumatic stress disorder (PTSD) and other mental health issues from viewing scores of disturbing images of rape, murder, and suicide while working to keep such content off the platform.
According to a report in The Verge, in a preliminary settlement in San Mateo Superior Court, the social networking giant agreed to pay damages to 11,250 US-based moderators and provide more counseling to them.
Facebook hired several firms like Accenture, Cognizant, Genpact, and ProUnlimited to help it moderate and remove harmful content in the aftermath of the 2016 US presidential election and Cambridge Analytica data scandal.
Under the settlement, each moderator will receive a minimum of $1,000 and will be eligible for additional compensation if diagnosed with post-traumatic stress disorder or related conditions.
Anyone who is diagnosed with a mental health condition is eligible for an additional $1,500, and people who are diagnosed with PTSD and depression may be eligible for up to $6,000.
“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” Steve Williams, a lawyer for the plaintiffs, said in a statement late Tuesday.
In 2018, Selena Scola, a former content moderator at Facebook, sued the company, alleging that it failed to properly protect moderators who suffer mental trauma after reviewing distressing images on the platform.
Scola accused Facebook of “ignoring its duty” to protect moderators who were “bombarded” with thousands of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.
Last year, several moderators told The Verge that they had been diagnosed with PTSD after working for Facebook.
Facebook had more than 30,000 employees working on safety and security — about half of whom were content moderators.
Cognizant later announced that it would leave the content moderation business and shut down its sites earlier this year. The company also developed “the next generation of wellness practices” for affected content moderators.
“In addition to payment for treatment, moderators with a qualifying diagnosis will be eligible to submit evidence of other injuries they suffered during their time at Facebook and could receive up to $50,000 in damages,” said the report.
(With inputs from Agencies)