In Settlement, Facebook To Pay $52 Million To Content Moderators With PTSD


Facebook will pay $52 million to thousands of current and former contract workers who viewed and removed graphic and disturbing posts on the social media platform for a living, and consequently suffered from post-traumatic stress disorder, according to a settlement agreement announced on Tuesday between the tech giant and lawyers for the moderators.

Under the terms of the deal, more than 10,000 content moderators who worked for Facebook from sites in four states will each be eligible for $1,000 in cash. In addition, those diagnosed with psychological conditions related to their work as Facebook moderators can have medical treatment covered, as well as additional damages of up to $50,000 per person.

"We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago," Steve Williams, a lawyer for the content moderators, said in a statement. "The harm that can be suffered from this work is real and severe."

Facebook admitted no wrongdoing as part of the settlement. The company has agreed to provide mental health counseling to its moderators.

"We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We're committed to providing them additional support through this settlement and in the future," a Facebook spokesperson told NPR via email.

The payout follows a class-action lawsuit filed in September 2018 describing the circumstances of Facebook content moderators like Selena Scola, who was the lead plaintiff in the suit.

She worked as a public content contractor for about a year at Facebook's offices in Menlo Park and Mountain View, Calif., where she was employed by the Florida-based contractor Pro Unlimited Inc.

During that time, she had to sift through a barrage of posts published by some of Facebook's billions of users. The suit said the content included "broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder."

Moderators like Scola were tasked with maintaining a "sanitized platform" on Facebook in order to "maximize its already vast profits, and cultivate its public image," the suit alleged.

Facebook relies on users to report objectionable content, then depends on thousands of contract workers to determine whether that content, sometimes horrific, violates the platform's terms of use.

Scola's lawyers said in the suit that Scola was exposed on the job to thousands of graphically violent images and videos, and now her "PTSD symptoms may be triggered when she touches a computer mouse," enters a cold building or hears loud noises.

It is an experience shared among thousands of contract moderators, the suit claimed.

Some of the content moderators were earning $28,800 a year, the technology news site The Verge found last year.

The settlement allows for moderators who worked for a Facebook contractor between 2015 and now in California, Arizona, Texas or Florida to be compensated.

Facebook has agreed to make one-on-one mental health counseling for moderators available throughout their employment.

As part of the settlement, managers for the Facebook contractors will now assess a candidate's emotional resiliency before hiring someone for the job. There will also be new guidelines for how a moderator can stop seeing a specific kind of content, if requested.

Williams, the lawyer for the moderators, described the settlement as a first-of-its-kind payout, potentially paving the way for similar suits filed on behalf of moderators at other popular online sites on which violent material often spreads quickly.

"This groundbreaking litigation fixed a major workplace problem involving developing technology and its impact on real workers who suffered in order to make Facebook safer for its users," Williams said. [Copyright 2020 NPR]
