ABSTRACT
This article analyzes Google SafeSearch's use of machine vision to automate censorship. I first provide a history of SafeSearch, followed by an overview of Google's machine vision system and the datasets it is built atop. Next I examine its political implications from four perspectives. First, I demonstrate how English-language semantic biases about sexuality are embedded in WordNet's ontology. Second, I outline how ImageNet embeds the biases of both image makers and labellers and examine the possibility of automatically outing ‘closeted’ LGBTQIA+ people. Third, I look at instances where explicitness is blurry, focusing on Google's treatment of the term ‘bisexuality’ as indicative of pornography. Lastly, I analyze the post-2012, always-on SafeSearch model, which requires explicit keywords to trigger pornographic results and thereby reifies mainstream heteroporn's dominance online. In closing, I suggest some tactics of resistance.
Disclosure statement
No potential conflict of interest was reported by the author.
Notes
1 This article comprises work contained in the author’s book The Digital Closet: How the Internet Became Straight (Monea 2022).
2 While it is outside the purview of this article, it is worth noting that research indicates that in the USA adolescents who use online pornography are more likely to be African American and to come from less educated households with lower socio-economic status (Brown and L’Engle 2009). There are thus always class and racial tensions that cut through these sex panics (see Lancaster 2011).
3 It is worth noting that the term bisexual is not used to describe MMF threesomes or larger group sex scenes in mainstream porn, and only begins to appear in LGBTQIA+ porn when men penetrate one another in these scenes.
4 Research indicates that repeated viewing of certain sexual behaviour does normalize that behaviour and increase viewers’ positive evaluation of that behaviour over time (Byrne and Osland 2000; Goodson, McCormick and Evans 2000).