Content Moderator in Germany Talks Back and Gets Sent on Leave: A Threat to Democratic Principles

By Todd Sekuler

On June 14th, Cengiz was one of two invited speakers at a parliamentary expert discussion held with the Bundestag Committee on Digital Affairs about the working conditions of content moderators – a relatively small group of workers who spend their days and nights evaluating online content to ensure that social media platforms remain free of gruesome and traumatizing images and texts. Late last week, shortly after his appearance at the Bundestag, Cengiz was abruptly suspended from his job.

Cengiz is one of some 5,000 content moderators employed in Germany by major social media platforms, or by the outsourcing companies those platforms hire, to remove content that violates platform policies or national laws. In Germany, not only is such content itself regulated by criminal law, but so is its timely removal from large social media platforms. Day after day, often working through the night, these moderators witness and read about horrific acts and assertions, ranging from beheadings to child abuse to hate-infused calls to violence. As Cengiz reported to the Bundestag, such experiences take an enormous toll on moderators' mental and physical health, often leading to severe trauma and mental health disorders such as post-traumatic stress disorder.

Beyond this toll on their health, Cengiz made known to German parliamentarians the various forms of exploitation content moderators face in their workplace. Many endure intense pressure from colleagues and supervisors, insufficient psychological support, and appallingly low wages. Despite the emotionally taxing nature of their job, they often receive minimal compensation, barely surpassing the minimum wage. This exploitative environment exacerbates the challenges they already face, Cengiz explained, making it imperative that their working conditions be improved.

As I have been learning over the course of my research, tech giants frequently outsource the responsibility of content moderation to third-party service providers like Telus International, where Cengiz is employed. Members of supportive NGOs, including Superrr Lab and Foxglove, and a representative of the German trade union ver.di joined Cengiz and other content moderators at that Bundestag discussion two weeks ago. Through conversations between events and through external research, I have learned how outsourcing this work allows social media companies to distance themselves from the inherent risks and burdens content moderators face, thereby evading accountability for the health and stability of those tasked with the emotionally laborious work of cleaning their platforms.

Beyond the further precarity and violence this suspension has inflicted on Cengiz, media outlets and the aforementioned representatives have also framed the move as a threat to democratic principles, such as freedom of speech and the protection of whistleblowers. This is why ver.di, Superrr Lab, and Foxglove are rallying against the unjust actions taken against Cengiz, rightly naming them a case of union-busting. Such suppression of dissent, together with the apparent obstruction of planned works council elections at Telus International – elections that would have created a new obligation for the company to respect the needs and rights of its employees – is a clear sign of an attempt to stifle democratic rights and responsibilities within the workplace.

Members of the CrimScapes project hereby express our support for Cengiz and strongly encourage readers of our blog to sign the petition initiated by ver.di, Superrr Lab, and Foxglove, which locates this suspension within a broader culture of fear and secrecy and frames it as a violation of Article 9 of the German Basic Law – which enshrines the right to join together to improve working conditions. The suspension and subsequent attempts to silence Cengiz not only underscore the systemic challenges content moderators face; they also highlight an alarming trend of punishing individuals who expose the unseen casualties of the digital world, effectively transforming whistleblowing into a punishable offense.