Among all the images that X. cannot forget, among the multitude of atrocities the screen hurled at his brain, there is one that has stuck with him with particular ferocity. La Vanguardia must warn that, even in writing, the description is very harsh. It is the video of a drunk man who murders his son, a month-old baby, in front of the camera. He plunges a knife into the baby's chest, opens it, rips out the heart. He bites into the heart. "The cries of the baby. Blood, a lot of blood…" X. takes a breath. These are inconceivable images that, once seen, cannot be erased.

X. was 19 years old when, newly arrived from Brazil, he started working in 2018 at CCC Barcelona Digital Services, in the Glòries tower. The company, acquired in 2020 by the Canadian group Telus International, is subcontracted by Meta to moderate content on Facebook and Instagram. It employs 2,030 people in Barcelona; at least 20% are on sick leave, largely due to traumas similar to X.'s. They are the guardians of the social networks.

To X. it seemed like a step up from the waiter and call-center jobs he had held until then. "They didn't ask for training or experience. They paid almost two thousand euros, the offices were impressive, and it was a foot in the door at Facebook," he remembers. They explained that he would have to study the policies of Mark Zuckerberg's company and, based on those policies, make quick decisions about whether a photo, video or text should be removed. He was not informed, he claims, of the volume or the harshness of the content to which he would be exposed.

The dream soon became a nightmare. After two months, because of his good results, he was assigned to the "high priority group," the team in charge of the most extreme content. "Brutal murders, dismemberments, torture, rape, live suicides, children being abused. The worst of humanity," he sums up. A few weeks were enough for him to see that he didn't want to be there. He asked to go back to soft content; his request was denied. The nightmares, the insomnia, the violent dreams began. He needed the money and he held on. Until the first panic attack came. "It was triggered by a live suicide. Facebook has a strict policy on when to notify the police; it can't just be because someone says they're going to kill themselves, there has to be a gun, a rope, an open window in sight. It often happens that the person pulls out a hidden gun and kills himself without you being able to do anything about it. That day it was like that, but worse, because the man used a very large weapon. It was all blood and brains. I fell to the floor, I couldn't breathe. I asked to see the company psychologist, who saw me an hour later. I was the only one who spoke. When I finished, she told me that I was doing a very important job for society and had to be strong. And that the time was up and I had to go back to my seat."

Five years later, he remains under psychiatric treatment, takes five different medications, and a good day is one with no more than two panic attacks. "In my mind now there is only death. Death, blood, pain, fear. Fear all the time. Afraid to go out, afraid to stay at home, afraid to remember, afraid to sleep."

As he struggles to heal, he is immersed in another battle, a judicial one. In 2020, already out of the company, he filed a complaint with the Labor Inspectorate. (In 2022, the Generalitat fined Telus 40,985 euros for failing to assess the psychosocial risks to a worker; the company appealed.) At the same time, X. has several labor lawsuits open to have his illness recognized as a work-related accident and to be granted total permanent disability.

Telus, and Meta as the party ultimately responsible, faces another legal battle today: a worker plans to file a criminal complaint for damages. The lawsuit accuses the two companies of crimes against workers' rights, of causing injury through gross recklessness, and of crimes against moral integrity.

"We have decided to go down the criminal route because the non-compliance is flagrant. The company is fully aware of what is happening, that it is generating masses of sick young people, and it neither takes responsibility nor does anything to remedy it," says the plaintiff's lawyer, Francesc Feliu, of the Feliu Fins law firm.

There are no precedents in Spanish justice. In Ireland, where Meta has its European headquarters, 35 moderators have filed lawsuits before the High Court. Some are CCC workers in Barcelona, confirms lawyer Diane Treanor, of Coleman Legal.

In the US, where class actions are possible, Facebook compensated more than 11,000 moderators with 52 million dollars in 2020. In Kenya, another 184 have sued Meta and two subcontracted companies.

Accused of fostering misinformation, violent attacks and even genocide with its algorithms, designed to prioritize engagement (and financial gain) over safety, Meta has invested billions and hired 40,000 people, including 15,000 moderators employed through third parties. "We have more than 20 centers in several countries, where these teams can review content in more than 50 languages," it says on its website.

According to Meta, in the second quarter of 2023 it removed 7.2 million pieces of content for child sexual abuse and 6.4 million for suicide and self-harm from Facebook, 6.2 million for violence from Instagram, and 17.5 million for hate speech from Facebook. The list goes on.

The case of X. is not isolated. La Vanguardia has interviewed nine more CCC/Telus moderators who report suffering from mental disorders. Post-traumatic stress disorder, anxiety, insomnia, depression or suicide attempts appear in their medical reports. Some remain at the company, on leave; others have left. Some are considering legal action; others are reluctant to take it.

Telus has carried out a temporary layoff scheme (ERTE, in March) and a collective layoff (ERE, in August), which it has justified by the drop in demand from its client amid the tech industry downturn. It does not link them to the mental health epidemic in its workforce. However, in a letter to workers in May announcing that it would stop paying disability benefits, it admitted that the absenteeism rate was 19.8%. According to union sources, there have been peaks of 25%, more than double the average of the contact center sector, itself already high. In the letter, the company insinuates that many sick leaves are faked, since they shot up "exorbitantly" "after" the ERTE.

Telus has hired Q-ready, an "absenteeism management" company from the Quirón group. "We have engaged an external provider to support our team members who are on medical leave and determine what additional support they need. It is also working with us to develop a plan to reintegrate these individuals with their health and safety as a priority," said Roger Clancy, Telus's vice president of operations, by e-mail.

The current workforce in Barcelona is 2,030 people, about 400 fewer than at the beginning of the year, after the ERE and a policy of incentivized voluntary departures. Salaries range between 24,000 and 30,000 euros gross. Benefits include meal vouchers, a transport allowance and private medical insurance.

There are, without a doubt, those who are happy with the job. P., a Mexican, confesses to being "perhaps less sensitive, perhaps because of the reality of my country, where there are murders every day." He also points out that he has not been at Telus since the beginning, only since mid-2022. Over time, and thanks to the reviewers' work, "the worst content has been moving to other platforms, such as OnlyFans or Telegram." He reviews Instagram.

Not everyone is exposed to the same volume of graphic content. It varies according to the "queues" to which moderators are assigned and, above all, the market. In Latin America there is a lot of "terrorism": videos of brutal torture and executions, uploaded by criminal groups to frighten the population. It is no coincidence that most of the traumatized moderators interviewed come from these markets.

"The ones who come to me are the ones who can't take it," explains a psychiatrist who has treated six moderators. "In the first months they are happy with the salary and the work seems easy to them. I had a French patient who was delighted, until he was given the content from French-speaking African countries."

"I don't think someone exposed to such barbarities can last more than a year in that job; it's easy for them to develop nightmares of guilt or responsibility," says Benedikt Amann, director of the psychiatric research unit at the Centre Fòrum of Hospital del Mar. "Although there are people who take this job out of an ethical drive, to filter out evil and improve the world, the company should explain beforehand exactly what it will involve, and should warn that anyone with a history of vulnerability runs a high and real risk of post-traumatic stress."

There are differences between those who joined in the company's early days and the most recent cohorts. While the former report that they were not reliably informed of the nature of the work, those hired in the last two years knew what they were getting into but underestimated the effect it would have on them. This is the case of a 30-year-old Brazilian woman. As a child she suffered depression linked to bullying. The company never asked about her history when it hired her, although she admits that, all these years later, she didn't think she was at risk either. They put her in the "live suicide queue." It coincided with her separation from her partner. She fell into depression. She made two suicide attempts.

This is a key element in the Irish lawsuits, Treanor points out: "The company does not screen, does not check whether a person has a history of self-harm or suicide attempts before exposing them to this type of content, when any psychologist will tell you that it is a trigger."

The lawyer and activist Cori Crider, who previously defended detainees at Guantánamo, admits that when she came across the first stories of sick moderators a few years ago, she found it hard to accept that someone could develop post-traumatic stress simply from viewing violent images on a computer. "It is a job that can have serious neurological effects. Like a pesticide factory: not everyone develops cancer, but it greatly increases the odds, and we all understand that protective measures are needed," argues Crider, co-founder of Foxglove, a London-based organization for fair technology.

Companies know this, Crider points out. Accenture, which also moderates content for Facebook, has its employees sign an informed-consent form that says: "I understand that the content I will be required to review will be disturbing. It may have an impact on my mental health and could even lead to post-traumatic stress."

"The work of the moderators is necessary and important, and unfortunately I don't think we can ever hand it over entirely to artificial intelligence. What we're demanding is that tech companies make it a safer job, and it's not enough to tell employees to protect themselves. It is the companies' responsibility. That means screening workers appropriately, training them, giving them psychological treatment and the breaks they need. And also a salary in line with the risks," says Crider.

The lawyer points out, for example, that in the United Kingdom the police officers who track child sexual abuse online "have an army of psychiatrists and strict limits on the amount of content they can see, precisely because its effect is well known."

On its website, Meta states that its moderators "learn how to access resilience and well-being resources, and how to contact a professional if they need additional help." The Telus executive uses similar terms: "We have designed a resilience and wellness program that includes a comprehensive benefits program for access to physical and mental health services," says Roger Clancy.

The moderators interviewed consider the psychological care on offer insufficient. "We have 40 minutes a week of what they call wellness. It means you can talk to someone. But if you tell them you're unwell, they tell you to look for another job," explains one moderator. "The company is used to people who can't take it going on sick leave and then quitting. Without complaining."

Cori Crider applauds the moderators who are suing and winning compensation. But she is convinced that change will not come from there. "I understand that people want justice. But our priority is to change this work, and that happens through committed and organized unions and through more regulation. Governments need to understand the problem and regulate, and maybe even stop tech companies from becoming so big and powerful. Fines and compensation won't make Facebook change its policy. The same day the Federal Trade Commission fined it 5 billion dollars, its shares went up on the stock market."