© Reuters. FILE PHOTO: The Facebook app logo is seen in this illustration taken August 22, 2022. REUTERS/Dado Ruvic/Illustration/File Photo
By Joan Faus and Catarina Demony
BARCELONA/MADRID (Reuters) – A Spanish court has ruled that the mental health of a former Facebook moderator was harmed by his work reviewing graphic content such as beheadings, in a case that could have implications for how social media companies work with moderators.
The Barcelona court, upholding a decision by Spain's social security agency, said the psychiatric treatment the outsourced moderator required stemmed from his work, entitling him to additional compensation for his sick leave.
The moderator worked between 2018 and 2020 for CCC Barcelona Digital Services, part of Telus International, one of the outsourcing providers used by Facebook owner Meta.
Meta did not immediately respond to a request for comment.
Telus said it was disappointed by the ruling and would appeal.
It is the first time a Spanish court has recognized that a content moderator's sick leave was caused by his work, said Francesc Feliu, the worker's lawyer, who also represents 20 other former and current CCC content moderators in similar cases.
The former worker had to watch content that included “self-mutilations, beheadings of civilians killed by terrorist groups, torture inflicted on people, suicides,” the court stated.
CCC filed a lawsuit in 2022 seeking to overturn the social security agency's decision that the moderator's mental health condition was a result of his work.
In his Jan. 12 ruling, seen by Reuters, Judge Jesús Fuertes rejected CCC's claim.
“The worker has been suffering a situation of great emotional and psychological impact in his job,” he wrote, adding that the leave granted in 2019 was “exclusively and undoubtedly” caused by his work.
Martha Dark, director of the London-based tech justice advocacy group Foxglove, said the court was “100% right to recognize that the work of keeping Facebook safe causes mental health illnesses.”
“Meta needs to compensate this brave former moderator for the harm he has suffered, but that is only half the battle,” she said. “They must also be required to provide real, ongoing mental health care and safe workplaces to the tens of thousands of workers doing this work around the world.”
Dark urged governments to introduce regulations to ensure social media platforms are safe for both users and workers.
In 2020, Facebook reached a settlement with U.S. content moderators who suffered from mental health issues. Last year, a moderator in Germany was placed on paid leave pending an internal investigation after he called for improved working conditions.