Facebook videos gave me PTSD: moderator sues
She claims the company didn't train her to deal with the 'extreme and graphic violence' she had to watch
A former Facebook moderator is suing the company for failing to protect her from the trauma she suffered while combing through thousands of disturbing images and videos.
Selena Scola, who worked at Facebook’s Menlo Park headquarters between June 2017 and March 2018, developed post-traumatic stress disorder (PTSD) after witnessing “acts of extreme and graphic violence”.
The class action lawsuit claims that Facebook and its contractor, Pro Unlimited, created dangerous working conditions for thousands of contractors by failing to provide adequate training and counselling, in defiance of industry guidelines that Facebook helped draw up.
Facebook employs at least 7,500 human moderators across 50 languages, often through contractors, who manually search through millions of posts every week to check if they should be removed. The content can include images of sexual abuse and murder, terrorist videos, illegal pornography and even live broadcasts of people committing suicide.
“It is well documented that repeated exposure to such images can have a profoundly negative effect on the viewer,” said Korey Nelson, a lawyer acting for Scola. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatised by what they witnessed on the job.”
The suit alleges that Facebook ignored guidelines for dealing with traumatic content drawn up in 2015 by the Technology Coalition, an industry body that Facebook partly funds. The guidelines recommend extensive psychological screening for new employees and mandatory counselling during the job, as well as technical measures to reduce the impact of the content.
For instance, some companies pixelate or blur images under review, altering the colours to reduce their impact. Some play video without audio, or provide a decompression period.
Facebook, the suit claims, “ignored” these guidelines, leaving Scola with PTSD that can be triggered when she touches a computer mouse, watches violence on TV or hears loud noises. She now wants to force Facebook and Pro Unlimited to set up an ongoing fund to cover former moderators’ medical bills.
A spokesperson for Facebook said the company was reviewing the claim and that on-site counselling was available at Scola’s office while she worked there.
He said: “We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”
A blog post in July by Ellen Silver, vice-president of operations, said all moderators are screened for mental resilience before hiring, given a minimum of 80 hours of training and “have access to mental health resources” as well as ongoing coaching.
But evidence from former moderators suggests these provisions are limited or relatively new. Sarah Katz, who worked as a moderator in 2016 and is now a science fiction author, said they were “news to [her]”.
Katz said she was given minimal training, no screening and no psychological support. She also said she worked under a strict quota of one post per minute, something Silver’s blog post categorically denied.
Another moderation contractor told The Guardian last year they were “underpaid and undervalued”, saying training and support were “absolutely not sufficient”.
Speaking to a German newspaper, a moderator who worked in Berlin described her job as a “production line” in which workers were repeatedly exposed to traumatic content without time to reflect on or process it. She quit after three months, when she began to feel herself becoming desensitised in everyday life.
– © The Daily Telegraph