Continuum
Journal of Media & Cultural Studies
Volume 36, 2022 - Issue 3
Special Issue: Media and Fakery, Guest Editors: Celia Lam, Wyatt Moss-Wellington, and Filippo Gilardi

Deepfakes and documentary practice in an age of misinformation

Pages 393-410 | Published online: 12 Nov 2021
 

ABSTRACT

The emergence of deepfakes is the latest form of media manipulation to prompt anxieties over the wider implications of misinformation. This article explores how these technologies might extend the repertoire of modalities available to documentary makers. While these ‘synthetic media’ disrupt the documentary genre, they are also a continuation of long-standing trends within software culture and clearly augment practices already deeply embedded within documentary. The discussion draws upon Wardle and Derakhshan’s ‘misinformation’ and ‘disinformation’ framework to highlight the increasing complexity of documentary’s forms and the challenges they pose to audiences. The limited experiments to date in productively integrating synthetic media into documentary suggest, in particular, possibilities for developing more openly reflexive content. The proliferation of synthetic media prompts a wider need among documentary practitioners for critical data practices, software literacy and ethical practices embedded within a broader understanding of automated, networked and entangled media systems. It also challenges documentary designers to strategise the nature of their content and to engage more directly with their audiences on questions of evidence, trust, authenticity and the nature of documentary media in an era of misinformation.

Notes on contributors

Dr Craig Hight is an Associate Professor in Creative Industries at the University of Newcastle, Australia. His research has drawn on documentary theory, software studies, critical data studies and a variety of approaches within the field of audience research. His most recent work explores the nature of documentary culture and practice within digital media platforms.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1. Gregory is a programme director at New York-based human rights video advocacy group WITNESS, a group which has become a major stakeholder in initiatives against deepfakes, as discussed below.

2. A ‘post-photographic’ era was declared by some commentators (Lister 2013) as Photoshop’s easy manipulation of digital images appeared to herald a distinctive break from the degree of trust associated with documentary photographs.

3. It is notable that there are already rudimentary deepfake apps, some focused on generating pornographic content, others focused on playful face swapping. Deep Nostalgia (see https://www.myheritage.com/deep-nostalgia) and other commercially available applications are a training ground for amateurs in these techniques.

4. Bruns’ (2019) useful critique of the terminology of ‘filter bubbles’ (Pariser 2011) and ‘echo chambers’ suggests how much work is still needed to contextualize anxieties around the political implications of fake media more carefully within broader media and technology use. The precise ways in which these kinds of developments are playing out across national populations are uncertain, but there has been intense commentary around the assumed wider polarization occurring within American political culture (Zimdars and McLeod 2020).

5. Among their fascinating list of scenarios in this ‘new information war’ are those where faked video content inflames existing social and political tensions, including religious and ethnic tensions within and between national populations, or provides a flashpoint for military intervention in geopolitical hot spots such as the Middle East and similar potential war zones.

6. The term ‘documentary’ needs clarification, as it now encompasses a continuum of audio-visual material beyond professionally produced content primarily intended for theatrical release or television broadcast. Documentary practices now include videos funded by interest groups or emerging from ‘amateur’ or low-budget contexts, and online/digital collaborative practices centred within community politics (Zimmermann and De Michiel 2017). I use the term documentary here, then, to capture a broad range of audio-visual content (usually, but not always, ‘long-form’) that makes claims about reality through various rhetorical devices.

7. Which is not the same as adhering to an abstract notion of ‘balance’ between opposing views, a notion which tends to constrain news reporting.

8. For example, asking participants to repeat actions to allow the camera to capture movement from another angle.

9. Which means not being able to caption and justify each and every image that is included within montages.

10. This was essential at a time when film equipment was physically cumbersome and before the advent of synchronous sound recording.

11. The subjects of documentaries also invariably articulate surprise at how their interviews were edited and deployed, not always recognizing they are subject to the broader agenda of the filmmakers.

12. Interactive and immersive forms of documentary design provide quite distinctive experiences for audiences. They foster forms of engagement ranging from click-through web-based content, through more elaborate forms of role-play closer to gaming experiences, to the more immersive experiences of XR (the umbrella term for augmented reality, virtual reality and everything in between) (Hight 2008, 2014, 2017; Dovey and Rose 2012, 2013; Nash, Hight, and Summerhayes 2014; Aston, Gaudenzi, and Rose 2017).

13. This latter example is already standard practice within blue-chip nature documentaries, but deepfakes could extend this beyond voice-over into seamless integration of multilingual versions of David Attenborough’s on-screen persona, for example.

15. As potential remedies, Chesney and Citron outline a series of government regulations, technological solutions, legal remedies, and commercial initiatives. Among these is the suggestion that political figures engage in such detailed self-surveillance that false content can be easily disproven, alongside initiatives such as rigorous new legislation to install close governance of social media platforms and to punish those that are insufficiently vigilant against such content. The impulse to capture anything and everything, without predefined boundaries or selection criteria, as a protective practice is something Andrejevic has identified as a core feature of automated culture, and something he terms ‘framelessness’ (Andrejevic, 113–132).

16. See https://guardianproject.info/apps/obscuracam/ (ObscuraCam) and https://guardianproject.info/apps/camerav/ (CameraV). For an outline of ProofMode, see Dia Kayyali, ‘Prove human rights abuses to the world’, https://blog.witness.org/2017/04/proofmode-helping-prove-human-rights-abuses-world/

17. At the time of writing, Google, Facebook, Microsoft, TikTok, Redbubble and Twitter have all agreed to develop a code of conduct targeting misinformation. Gillespie (2020), however, makes the point that platforms have approached such measures purely through an extension of their existing content moderation apparatus. This deploys algorithmic mechanisms paired with a labour force engaged in case-by-case moderation, a cumbersome approach that is difficult to operate at the large scales required. Unfortunately, the same blunt instruments seeking to cull objectionable or offensive material also lead to the loss of vital evidence of atrocities (Economist, September 22, 2020), or of material which could help determine the authenticity (or not) of other material. These distribution platforms are ill-designed to perform the kind of social and political role being asked of them, and each platform has its own internal policies which are not debated, discussed or clearly articulated to its users or the broader public.

18. See https://www.getbadnews.com/#intro and specifically for misinformation around COVID-19 https://www.goviralgame.com/en

19. https://www.facebook.com/help/188118808357379

20. For example, Snopes: https://www.snopes.com/; the Spot the Deepfake resource: https://www.spotdeepfakes.org/en-US; and the Global Disinformation Index: https://disinformationindex.org/

21. See, for example, (Hill 2005; Hardie 2008; Harindranath 2009; Hill 2007; Austin 2013; Rutten and Verstappen 2015; Hill et al. 2019).

22. The fascinating explosion of conspiracy theories as an online phenomenon, and how the Trump presidency in the United States strategically used and fostered conspiracy theory discourses, is beyond the scope of this article. A useful entry point is The Atlantic’s ‘Welcome to Shadowland’ overview and wider history of American conspiracy theories, available at https://www.theatlantic.com/shadowland/

23. The intensification of bot-driven misinformation, evidenced most prominently during the 2016 U.S. election, has focused the efforts of media researchers on how the spread of false and misleading information has been enhanced and accelerated by influencers, super-spreaders and ordinary online users unwittingly enabling the mischievous work of a host of otherwise small-scale political interest groups (Graham et al. 2020).

24. Impact productions position documentary media as one component of a multiplatform strategy for fostering direct political engagement around an issue.

