The «cleaners» of global social media platforms live in harsh societies, in conditions of comparative squalor to those after whom they are cleaning up.
At the 2018 Rotterdam Film Festival, Hans Block and Moritz Riesewieck presented their new documentary The Cleaners. It begins with a dry and overused list of statistics: three billion people are connected worldwide through social media websites. Every minute, 500 hours’ worth of video footage is uploaded to YouTube alone. In that time, some 450,000 tweets are posted on Twitter and 2.5 million Facebook posts are published. The Facebook community is estimated to include at least 2 billion members – nearly a quarter of the world’s population. Its influence on opinion formation is larger than that of any nation state.
Recruited from the streets
The Cleaners travels from the high-tech, pristine work quarters of the company’s engineers and their day-to-day conditions to derelict homes in Manila, where the so-called content moderators – Facebook’s «cleaners» – live. Nearly all of them are recruited from the street, without any educational background in politics, sociology or psychology, let alone knowledge of art theory and expression. They are hired merely to «delete». Officially, Facebook is not allowed to employ content control staff in the Philippines. However, local outsourcing intermediary companies render it possible. They are the bodies that deliver Facebook’s paycheques.
The filmmakers are given intimate access into the lives of these «cleaners», following them to their homes, to their leisure activities, church, discos and game halls. Some of them run a big risk by speaking out – which they did, even though they all signed nondisclosure agreements. Other insider statements appear as anonymous emails or as snippets of online communications.
«These moderators defend the «principles» they have ultimately been instructed to uphold.
Facebook can be said to apply radical Islamic censorship patterns.»
These moderators defend the «principles» they have ultimately been instructed to uphold. The filmmakers make great efforts to capture their mindsets without interference. Essentially, it seems, the cleaners believe themselves responsible for making the Facebook platform (and thereby society at large) healthier by preventing suspicious content from appearing online. Some hold the view that the world is simply insane and that their role in it is to fight evil. However, there are moments when personal opinions intrude and situations are misread.
One fragment shows a picture in which a soldier is mistaken for an ISIS member. The picture is subsequently removed from the site. In one instance, a «cleaner» postulates that state representatives should not be offended on these platforms. Another compares himself to Rodrigo Duterte, the controversial president of the Philippines.
These moderators are asked to «identify terrorism». They are ordered to delete all communication from a dubiously established list of 39 terrorist organisations. However, the order does not end there; they must systematically memorise these communications for surveillance purposes. Team leaders review the moderators’ censorship decisions in a meagre 3 per cent of cases. Every agent processes around 25,000 images a day.
By these mechanical rules, there is no doubt that Nick Ut’s Pulitzer Prize-winning photograph of Vietnamese children, including a naked nine-year-old girl, fleeing a napalm attack would immediately be removed. Similarly, a picture of stranded, naked immigrants would never be allowed. Any allusion to sexual intercourse, or even nudity, is forbidden. In this way Facebook applies radical Islamic censorship patterns. As Nicole Wong, the former policy maker for Google and Twitter, confirms: «we delete ‘what we don’t want in our community’.»
One victim of such censorship politics is the Los Angeles-based artist Illma Gore, who uploaded a painting of a naked body with a small penis and Donald Trump’s head. We are living in a society where this picture was shared 15 million times. Trump himself referred to the image in a public debate soon after, declaring there was no problem with the size of his penis. Just a few days later, Illma Gore’s Facebook account was shut down, together with all of her other social media accounts. She was silenced without the possibility of defending her art, and lost all her contact details, including those essential for her work, threatening her livelihood.
«We delete ‘what we don’t want in our community’.» Nicole Wong, the former policy maker for Google and Twitter
Other victims have included NGO activists, like those of Airwars, which needs to store evidence of war attacks for later analysis and accountability claims. Without their work, aggressors would have free rein and even more civilians would be killed. They try to save what they can before their online materials are deleted or classified by Facebook as ISIS propaganda. Their YouTube account was blocked, whilst other accounts were suspended.
In two public hearings in Washington, D.C. – one before the Senate Judiciary Subcommittee on Crime and Terrorism on October 31, the other before the Intelligence Committee on November 1 – Google’s representative simply confirmed that thousands of people work on content control in the company (goodbye privacy!). Facebook, however, was scrutinised for not doing enough to suppress radical political propaganda, while giving in to local political powers by deleting «annoying» content.
Helping censorship in Turkey
Indeed, Facebook has said that it never wanted to act in this role, but instead to provide an essential, global marketing tool for consumers and suppliers. This strategy led Facebook to accept certain pressures, for example from the Turkish government: all content critical of its activities had to be deleted in order to avoid exclusion from the Turkish market. A geolocalised shutdown made content critical of the government inaccessible to Turkish IP addresses. On the other hand, Facebook allowed the publication of hate propaganda in Myanmar and Bangladesh against the Rohingya population, declared the most persecuted minority worldwide by the United Nations. In societies where the Internet is only used to access Facebook, where people don’t even own an e-mail account, online propaganda carries extreme weight for its effectiveness in manipulating and distorting reality. The propaganda is accepted as reliable «information», which in extreme cases can easily lead to genocide. Tristan Harris, the former Google design ethicist, recalls that local victims had no chance to report or oppose hate videos. Again, Facebook’s interest is to deliver messages to the highest number of consumers and potential buyers. Nothing gets more shares than outrage.
Threats to democracy
Facebook delivers messages to interest groups, soothing their mindsets. Antonio Garcia Martinez, a former Facebook product manager, states that Facebook plays an active role in eroding our capacity to communicate. Every user’s views are confirmed through Facebook’s preselected information criteria. The original rule was that everyone was entitled to his or her opinion; now this understanding has expanded to include his or her own reality. «If we lose our values, accepted rules and behaviours, if we abandon a basic ground requiring truth, we can’t establish a democracy anymore». Losing the capacity to communicate means losing the most basic element of building a healthy society, he said.
The Cleaners asks a key question: under what circumstances can the censorship of information be exceptionally legitimate, and when does it head towards the collapse of communication, democracy and finally, civilisation? And, in our outrage at what has or has not been deleted, have we neglected to focus on the process through which this takes place – and more importantly, the people behind it who have much quieter voices in our current media?