
Moderating the ‘worst of humanity’: sexuality, witnessing, and the digital life of coloniality

Pages 225-240 | Received 04 Aug 2017, Accepted 30 Apr 2018, Published online: 06 Jun 2018
ABSTRACT

An estimated 100,000 people worldwide work as content moderators, responding to the millions of photographs and videos uploaded online every minute. Primarily employed by outsourcing companies in the Philippines, these labourers scrub social media of sexual content. This article unpacks what it calls the ‘digital life of coloniality’ as it is produced through content moderation, along two lines of interrogation. First, the article suggests that the traditional understanding of the coming together of sexuality, subjectivity, and regulation under colonialism is rendered more complex by content moderation, which positions the formerly colonized as regulators of their former colonizers’ sexualities. Second, asking questions of witnessing, ethics, and accountability, the article interrogates the lines of disavowal and displacement that structure the offshoring of violent, obscene, and mundane sexual content. Contributing to the field of porn studies, this article suggests that the ambivalent and multiple directions of sexual subject production within digital coloniality be addressed anew.

Acknowledgements

The author would like to thank two anonymous readers and the reading group at the Department of Gender Studies, London School of Economics who provided invaluable feedback on an earlier version of the article.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes

1 According to Monika Bickert, Facebook's head of global product policy, Facebook users flagged more than one million items of content for review every day in 2016 (Buni and Chemaly 2016).

2 PhotoDNA is a technology developed by Microsoft that uses a technique called ‘robust hashing’ to create a digital fingerprint for an image, which can then be matched against other copies of that same image, even if the image has been altered. It was developed to track the spread of child pornography across the internet.

3 Content moderation is more of a fragmented process than I am able to account for here. It includes workers located within in-house departments at various technology companies, outsourcing firms, call centres, and micro-labour sites, as well as ‘untold numbers of algorithmic and automated products’ (Buni and Chemaly 2016).

4 While content moderation is thus a form of outsourced labour, it is not exactly the same job as working in a call centre, the more traditional understanding of this type of work. Despite their differences, however, both of these industries can be understood together through an analysis of their shared political economy, and their emergence in certain geopolitical locations, from specific colonial relations.

5 For other analyses of the American colonial administration of the Philippines, see, among others, Go (2008), Rafael (2000), and Westling (2011).

6 Eventually, Buni and Chemaly (2016) write, ‘the “stew pot guy” began uploading more explicit content that clearly violated Pinterest's terms and [the] team removed his account’.

7 For an analysis of the circulation of the footage of Agha-Soltan's death, one which questions why this footage of death, rather than, say, the footage of Oscar Grant's murder, functioned as a global catalyst for action and empathy, see Malkowski (2017).

8 Reviewers and readers of this article have similarly asked me to define and differentiate between pornography, sexual imagery, sexual expression, and sexual content. While I agree with Bickert here that careful analysis and contextualization are necessary for such an act, I hesitate to make such a distinction within this piece precisely because of the ways in which, as in this quote from Bickert, the lines between art and pornography are blurred by social media companies. Rather than attempt to resolve slippages in my own writing between these different (yet overlapping) formulations, I am seeking to emphasize their slipperiness. For it is precisely because they are produced as slippery signifiers (mundane sexual expression for one becomes pornography for another) that they are able to take on so much work in terms of affect, accountability, and politicization.

9 While one could thus argue that content moderators’ reviewing and re-viewing of images of sexual abuse, particularly child sexual abuse, might engender re-traumatization – particularly given that one of the central arguments made against child pornography is that the child's knowledge of the image's circulation and viewing by others is in and of itself a form of trauma (Oswell 2006; Smolen 2013) – this is not my intent here.
