Employed in Manila to look at “self-harm” videos, a young man had thrice asked his bosses to transfer him out of the unit whose task was to delete from social media platforms user-posted images of beheadings, mass killings and the most perverse sexual acts, such as bestiality and child pornography.
A walking time bomb like everyone on his team, the man hanged himself in front of his laptop at home, in full online view of his fellow workers who were at the office. His was a “cry for help” that went unheeded; his was a plea to be freed from watching too many livestreamed suicide attempts, many successful and all an assault on the psyche and the soul, that fell on deaf ears.
That young man was among the hundreds if not thousands of psychologically and physically scarred Filipinos who may suffer, sooner rather than later if not now, from nervous breakdowns and depression. They are “The Cleaners,” whose plight and stories were first exposed by filmmakers Moritz Riesewieck and Hans Block in their BBC documentary “The Internet’s Dirtiest Secrets.”
On the surface, the deskbound job of “The Cleaners” — which entails deleting images that do not meet the social media platforms’ guidelines on nudity, violence and racism — seems easy enough, a piece of cake.
Paying well above the country’s minimum wage, the work can be a way out of poverty for many, who need only a high school diploma to apply for the job. That is, if one can stay sane enough to live through the nightmares while resisting the urge to “clean” himself or herself out of this world.
Quizzed by the United States Senate in 2017 on terrorists, criminals and sex fiends using their social media platforms to radicalize or victimize people, or to peddle content that scrapes the bottom of human depravity, Facebook, Twitter and Google told lawmakers they had been deploying a global workforce of thousands as content moderators or “trust and safety officers.”
Indeed, the social media giants have done that and, in many instances, through outsourced labor. Thousands of these moderators are in the Philippines, workers who, according to Riesewieck, “we know almost nothing about because they work in secret” and who “sign non-disclosure agreements, which prohibit them from talking and sharing what they see on their screens and what this work does to them.”
“They are monitored by private security firms in order to ensure that they don’t talk to journalists. They are threatened by fines in case they speak. All of this sounds like a weird crime story, but it’s true. These people exist, and they are called content moderators,” he warned in a TED Talk.
All around the world, content moderators, like the handful in Manila who were brave enough to be interviewed for the documentary, are speaking out, but only after leaving their jobs, and for their own good, too. Perhaps, by breaking their silence, they think they can purge themselves of the gore fest indelibly etched in their consciousness — asleep or awake.
Their tales share a common arc: first being shocked to the core by what they were forced to view, then being slowly desensitized, and finally descending into the depths of despair, like the private investigator in the 1999 movie “8MM,” who tried to determine whether a “snuff” film, a videotaped killing, he saw was real or not.
“When it was time for me to sleep, all I saw in my dream were penises — long, short, white, dark, fat, a kid’s and also those of old men,” said one female “Cleaner,” who recounted that her first day on the job served as a rude sexual awakening. Another one said she quit upon seeing a girl, “probably just six,” being forced to go down on a pedophile.
Still another interviewee, who had seen too many ISIS and narco-related killings, ruminated on the differences between being beheaded with a short, dull “bread knife” and a long, sharp one, with the former resulting in over a minute of agony for the victim and jagged, uneven edges around the neck.
“A long knife cuts fast and cuts cleanly,” the male “Cleaner” said in Filipino, his voice even, as if he were describing a scene inside a deli. If the images conjured by mere words make you cringe, how much more the actual videos and photos slated for deletion?
In all this, the questions are: What is the state doing? Why not pierce the veil of secrecy imposed by those who employ these “Cleaners”? How about ensuring proper training for them, not just a week-long orientation?
Content moderators need all the support they can get, such as counselling and psychiatric care. They must also be provided with opportunities for alternative jobs that may not pay as well but pay enough to survive.
It’s time for those who employ these “Cleaners” to clean up their act, because the job, distasteful and stomach-churning as it may be, has to be done. Mark Zuckerberg has said, “We (at Facebook) stand for connecting every person in a global community.”
Sadly, there’s a price to pay for that connectivity, that easy access to all the information and data on the Internet and social media. Connectivity means that bad actors can reach us from well beyond the confines of the Dark Net.
Hackers peppering YouTube with pornographic images they embedded on the children’s videogame Roblox this week? Send in the cleaners.