ABSTRACT
An estimated 100,000 people worldwide work as content moderators, responding to the millions of photographs and videos uploaded online every minute. Primarily employed by outsourcing companies in the Philippines, these labourers scrub social media of sexual content. This article unpacks what it calls the ‘digital life of coloniality’ as it is produced through content moderation along two lines of interrogation. First, the article suggests that traditional understandings of the coming together of sexuality, subjectivity, and regulation under colonialism are rendered more complex by content moderation, which positions the formerly colonized as regulators of their former colonizers’ sexualities. Second, asking questions of witnessing, ethics, and accountability, the article interrogates the lines of disavowal and displacement which structure the offshoring of violent, obscene, and mundane sexual content. Contributing to the field of porn studies, this article suggests that the ambivalent and multiple directions of sexual subject production within digital coloniality be addressed anew.
Acknowledgements
The author would like to thank two anonymous readers and the reading group at the Department of Gender Studies, London School of Economics who provided invaluable feedback on an earlier version of the article.
Disclosure statement
No potential conflict of interest was reported by the author.
ORCID
Jacob Breslow http://orcid.org/0000-0001-8643-9804
Notes
1 According to Monika Bickert, Facebook's head of global product policy, Facebook users flagged more than one million items of content for review every day in 2016 (Buni and Chemaly 2016).
2 PhotoDNA is a technology developed by Microsoft that uses a technique called ‘robust hashing’ to create a digital fingerprint for an image, which can then be matched to other copies of that same image, even if the image has been altered. It was developed to track the spread of child pornography across the internet.
3 Content moderation is more of a fragmented process than I am able to account for here. It includes workers located within in-house departments at various technology companies, outsourcing firms, call centres, and micro-labour sites, as well as ‘untold numbers of algorithmic and automated products’ (Buni and Chemaly 2016).
4 While content moderation is thus a form of outsourced labour, it is not exactly the same job as working in a call centre, the more traditional understanding of this type of work. Despite their differences, however, both of these industries can be understood together through an analysis of their shared political economy, and their emergence in certain geopolitical locations, from specific colonial relations.
5 For other analyses of the American colonial administration of the Philippines, see, among others, Go (2008), Rafael (2000), and Westling (2011).
6 Eventually, Buni and Chemaly (2016) write, ‘the “stew pot guy” began uploading more explicit content that clearly violated Pinterest's terms and [the] team removed his account’.
7 For an analysis of the circulation of the footage of Agha-Soltan's death, one which questions why this footage of death, rather than, say, the footage of Oscar Grant's murder, functioned as a global catalyst for action and empathy, see Malkowski (2017).
8 Reviewers and readers of this article have similarly asked me to define and differentiate between pornography, sexual imagery, sexual expression, and sexual content. While I agree with Bickert here that careful analysis and contextualization are necessary for such an act, I hesitate to make such a distinction within this piece precisely because of the ways in which, as within this quote from Bickert, the lines between art and pornography are blurred by social media companies. Rather than attempt to resolve slippages in my own writing between these different (yet overlapping) formulations, I am seeking to emphasize their slipperiness. For it is precisely because they are produced as slippery signifiers (mundane sexual expression for one becomes pornography for another) that they are able to take on so much work in terms of affect, accountability, and politicization.
9 While one could thus argue that content moderators’ reviewing and re-viewing of images of sexual abuse, particularly child sexual abuse, might engender re-traumatization – particularly given that one of the central arguments made against child pornography is that the child's knowledge of the image's circulation and viewing by others is in and of itself a form of trauma (Oswell 2006; Smolen 2013) – this is not my intent here.