Facebook operates in many parts of the world, including locations across Europe, such as Greece and Germany, and the United States, such as Texas. It uses service providers, including CPL Resources and Accenture, to provide content moderation services. Content moderators employed by these service providers are suing both the providers and Facebook for damages and personal injuries, claiming they developed PTSD, depression, and anxiety after viewing distressing content.
Interviews with employees revealed that moderators review content depicting animal cruelty, beheadings, bestiality, drug abuse, extreme violence, gambling, pornography, sexual assault, and terrorism. One employee, Chris Gray, who started working as a content moderator in July 2017, said he reviewed a video a man had uploaded of himself shooting an elderly stranger on the street and, shortly afterwards, a live-streamed video of a Thai man murdering his daughter. He claims this caused him to suffer from anxiety and PTSD. Another employee, Sean Burke, said: “My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed.” He added that he later saw content portraying child sexual abuse and bestiality, which caused psychological trauma and PTSD.
The moderators sought damages for personal injuries caused by the distressing content they viewed during their employment with CPL Resources. Under section 17 of the Personal Injuries Assessment Board Act 2003, where a plaintiff’s injury consists of psychological damage that would be difficult for the board to assess, the board may authorise the claim to be pursued through the courts. Diane Treanor, a Dublin-based solicitor with Coleman Legal Partners who is representing the moderators, said: “The Personal Injuries Assessment Board has commenced authorising the issuing of High Court proceedings against Facebook.”
The work schedule is tight and limits social interaction between staff. Although moderators work in teams in open-plan offices, the virtual nature of the work breeds feelings of isolation among employees, which amplifies the negative impact of the content. Moderators receive random tickets in the form of texts, images, or videos that they must act on. The process repeats constantly, generating new tickets for employees while management monitors their decisions. One employee said: “There are grades of decision making, and if you get it wrong by just a little bit it still counts as a mistake, and that counts against your quality score, and you might be fired. You’re not just looking at it objectively.”
Facebook provided a statement saying, “We recognize this review work can be difficult, and we work closely with our partners to ensure that the people who do this work are supported. We require everyone who reviews content for Facebook go through an in-depth, multi-week training program on our Community Standards and have access to extensive psychological support to ensure their wellbeing. This includes 24/7, on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.” However, several employees claim the mental health resources and training provided were inadequate.
Facebook and its service providers began distributing a document to employees acknowledging that the work they do could cause PTSD. The document instructed employees to “disclose any negative mental health fluctuations to management”, indicating that the companies are aware the workplace may be unsafe for some employees. The document also states, “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may negatively influence my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD). I will take full advantage of the WeCare program and seek additional mental health services if needed.
I will tell my supervisor or my HR People Adviser if I believe that the work is negatively affecting my mental health. I understand how important it is to monitor my own mental health, particularly since my psychological symptoms are primarily only apparent to me.” It continues, “If I believe I may need any type of healthcare services beyond those provided by Accenture, or if I am advised by a counsellor to do so, I will seek them.” “Strict adherence to all the requirements in this document is mandatory, and failure to meet the requirements would amount to serious misconduct and, for Accenture employees, may warrant disciplinary action up to and including termination,” the document reads.
Despite providing the statement above and distributing the document to employees, Facebook and its service providers declined to comment on the timeline of the document’s delivery.