The Battle Against PTSD

Jul 7, 2020 | Social Media Moderators

Over the last few years, there has been a significant rise in reported cases of content moderators being diagnosed with PTSD.


As reported by The Verge, this rise has been attributed to daily exposure to violent and disturbing content as part of the job. Exposure to gruesome material such as animal cruelty, beheadings, bestiality, drug abuse, extreme violence, gambling, pornography, sexual assault, and terrorism has taken a toll on moderators' mental health.
There are approximately 15,000 content moderators in the US, hired by third-party contracting companies to monitor and review "flagged" or "reported" content. The moderators determine whether the content is inappropriate or harmful for the general user. Flagged content is categorised into different queues, such as copyright, hate and harassment, adult content, and violent extremism.
What makes this job even more stressful is that, apart from reviewing such content on a frequent basis, moderators must also hit regular targets to keep their jobs. Inadequate support from superiors and insufficient entitlement to support services only add to the anxiety of an already stressful role.

Content moderation companies have also been persuading potential employees with misleading information about the role. Shawn Speagle, a former Facebook content moderator, said in an interview with The Verge that his job description was misleading compared with what he was actually assigned. After the initial screening, and before employment even begins, companies require employees to sign a lengthy non-disclosure agreement acknowledging that "this job might cause post-traumatic stress disorder (PTSD)", according to Casey Newton, a reporter at The Verge. This contract not only prohibits employees from disclosing any work-related stress to their superiors but could also lead to harassment, or even termination of their employment contract, if the company deems them unfit for the job (The Verge).

Clueless and scared, content moderators continue their jobs with neither adequate benefits nor medical allowances. Full-time moderators working for Facebook and Google do receive additional benefits compared with contract-based moderators. However, as former Google content moderator Daisy Soderberg put it in an interview with The Verge, "all these benefits don't mean anything if a person has to see such content every day. It leaves a scar behind which nothing can heal."

With increasing distress among content moderators, Facebook has finally agreed to pay $52 million to existing and former content moderators as compensation for the mental suffering they have endured during their tenure. Facebook is also working on an algorithm to automatically remove reported content without the involvement of content moderators, reducing the post-traumatic stress involved in this job.

Irrespective of the actions taken by multinational corporations, there are still more than 10,000 vulnerable content moderators working tirelessly across the globe. No amount of compensation can make up for the traumatic experiences and memories these moderators will carry for the rest of their lives.


If you are a content moderator suffering from mental or psychological trauma caused by your job, Coleman Legal urges you to speak to an experienced solicitor. Confidentiality will be given the utmost priority. Please visit this page for more information: Click Here

Clodagh Magennis

Head of Client Services

(01) 531 3800
[email protected]

"At Coleman Legal, excellence in customer care is paramount. We aim to meet both prospective and existing clients' needs in a professional, engaging, and friendly manner, with a clear objective to give quality legal advice and reach a positive outcome."

View Profile