
Cognizant, a professional services firm, has announced it will cease moderating objectionable content for social media platforms, Facebook in particular. The decision followed an investigation by the technology news site The Verge into the working conditions and mental health of employees at the firm's Arizona moderation centre.
The firm, which employs moderators in locations including India, Europe, and Latin America, expects to cut approximately 6,000 jobs as a result. Cognizant told the BBC: “We have determined that certain content work in our digital operations practice is not in line with our strategic vision for the company and we intend to exit this work over time. This work is largely focused on determining whether certain content violates client standards – and can involve objectionable materials. “Our other content-related work will continue. In the meantime, we will honor our existing obligations to the small number of clients affected and will transition, over time, as those commitments begin to wind down. In some cases, that may happen over 2020, but some contracts may take longer.”
The firm's decision was welcomed by Arun Chandra of Facebook, who said: “Their content reviewers have been invaluable in keeping our platforms safe – and we’ll work with our partners during this transition to ensure there’s no impact on our ability to review content and keep people safe.”
Cognizant's sites in Tampa, Florida, and Phoenix, Arizona, will be affected by the decision, though they will remain operational until March of next year. Staff numbers at a service centre in Texas, run by a different partner, are also set to increase.
Facebook's content is moderated around the clock, seven days a week, by some 15,000 moderators across 20 sites. Moderators must analyse content depicting child sexual abuse, beheadings, torture, rape, and murder.
The BBC's investigation team spoke to a former content moderator who said the traumatic nature of the job had left her “disgusted by humanity” and unable to trust anyone. Separately, the human rights group Avaaz highlighted hate speech on the platform in India aimed at Bengali Muslims in the state of Assam. One post went viral, reaching 100,000 shares and approximately 5.4 million views; in September, Facebook removed 96 of the 213 violating posts and comments that Avaaz had reported.
Facebook responded: “We have clear rules against hate speech, which we define as attacks against people on the basis of things like caste, nationality, ethnicity, and religion, and which reflect the input we received from experts in India. “We take this extremely seriously and remove content that violates these policies as soon as we become aware of it. To do this, we have invested in dedicated content reviewers, who have local language expertise and an understanding of India’s long-standing historical and social tensions.”
