Hundreds of social media moderators in Germany — who remove harmful content from platforms such as Facebook and TikTok — are calling on lawmakers to improve their working conditions. The moderators, who work long, intense shifts screening some of the most disturbing material on the internet, say their employers push them to work at unsustainable speeds and that the job has left them mentally and emotionally drained.
Cengiz Haksoz, who has worked as a content moderator at outsourcer TELUS International, is due to appear before the Bundestag’s Digital Council on Wednesday, where he is expected to tell lawmakers that his work screening harmful material left him “mentally and emotionally drained.” “I was sitting in front of my screen and looking at images that were so disturbing I didn’t know what to do,” said Haksoz, who has suffered from burnout and now does freelance work. “I felt like I couldn’t go outside and have a normal life anymore.”
The group that Haksoz leads, which includes hundreds of workers who moderate content for Meta through outsourcing firms, is pushing for legally binding rules that would force Facebook and other prominent social media companies to publicly disclose how they treat their staff. The group also wants the companies to recognize their staff’s right to unionize and bargain collectively, and to let them form “works councils” — bodies that are legally protected in many EU countries for firms above a certain size. It would also like Facebook and other companies to recognize moderators’ right to confidential psychological counseling.
A separate initiative, backed by several of Germany’s biggest trade unions, aims to pass laws making it harder for companies to outsource jobs that demand the highest scrutiny and skill, such as deleting posts containing violence or other grotesque content. Gerd Billen, Germany’s state secretary of justice, says the initiatives show that Europeans are confronting concerns about how some of the world’s largest tech firms treat their workers. This contrasts with the U.S., where several high-profile lawsuits have been filed against Google and Facebook but comprehensive legislation has yet to be passed.
Some companies have responded to these pressures by imposing stricter rules about what can be posted on their sites, quickly deleting inaccurate or purposefully misleading content, and hiring psychologists to design worker “resiliency” programs. Advocates argue that treating moderators with care and respect not only protects user trust during an unsettling period but also builds long-term loyalty — support that can help companies weather even turbulent times.