The Disturbing Reality of Social Media Moderation
Chapter 1: The Hidden World of Social Media
In today’s digital age, many of us start our mornings by scrolling through social media platforms like Facebook, Twitter, or TikTok. We crave updates about global events and the lives of our friends, driven by a fear of missing out (FOMO). However, the disturbing nature of the content we consume often goes unnoticed.
As we engage with these platforms, we rarely consider the extensive effort that goes into maintaining the friendly façades they present. Facebook, for instance, has become a hub for misinformation, and sifting through that flood of content is a monumental task these companies face every day.
Still, platforms that have grown into billion-dollar enterprises carry a responsibility to manage their content effectively.
Inside Facebook's African Operations
It’s not uncommon to hear about American corporations like Facebook outsourcing tasks to developing nations in search of cheaper labor. What’s particularly alarming is how these tech giants outsource content moderation with little accountability. The burden falls on young workers around the world, many of whom are ill-equipped to cope with relentless exposure to graphic, mentally damaging content.
What do these content moderators endure? As technology advances, artificial intelligence is increasingly used to flag harmful content, but there are nuances machines have yet to grasp, which is why human moderators remain essential. Unfortunately, the job can take a severe toll on mental health; some moderators have reported thoughts of self-harm or suicide.
A report from the Washington Post highlights a similar situation in the Philippines, where companies like Accenture and Cognizant have hired thousands of young workers to serve as content moderators. Their experiences mirror those reported by their African counterparts, revealing the grim reality of low wages and high emotional costs.
Day after day, content moderators confront horrific images of violence and death, yet many earn far less than their U.S. counterparts. Labor standards in developing nations often differ, and U.S. companies tend to distance themselves from the ethical implications of outsourcing such distressing work.
Will Mark Zuckerberg ever experience the reality of content moderation? Could he withstand the trauma of witnessing acts of violence, such as murder and rape, firsthand? For a moderator like Idris, each piece of content brings a daunting 50-second countdown in which to decide whether gruesome material is removed or retained, a requirement set by his employer.
A recent exposé by Time reveals the inhumane conditions faced by former employees at Sama in Kenya. While Meta, Facebook's parent company, disputes the claims made in the article, they acknowledge operating over 20 sites globally, including locations in Germany, Spain, and Ireland.
Chapter 2: The Psychological Toll of Moderation
The first video, "Inside the traumatic life of a Facebook moderator," delves into the harrowing experiences of content moderators and the psychological effects of their work.
The second video, "The Horrors of Being a Facebook Moderator | Informer," sheds light on the grim realities of this profession, emphasizing the emotional burden carried by those involved.