
Life beyond big data: governing with little analytics

Pages 341-366 | Published online: 04 Sep 2015

Abstract

The twenty-first-century rise of big data marks a significant break with statistical notions of what is of interest or concern. The vast expansion of digital data has been closely intertwined with the development of advanced analytical algorithms with which to make sense of the data. The advent of techniques of knowledge discovery affords some capacity for the analytics to derive the object or subject of interest from clusters and patterns in large volumes of data, otherwise imperceptible to human reading. Thus, the scale of the big in big data is of less significance to contemporary forms of knowing and governing than what we will call the little analytics. Following Henri Bergson's analysis of forms of perception which ‘cut out’ a series of figures detached from the whole, we propose that analytical algorithms are instruments of perception without which the extensity of big data would not be comprehensible. The technologies of analytics focus human attention and decision on particular persons and things of interest, whilst annulling or discarding much of the material context from which they are extracted. Following the algorithmic processes of ingestion, partitioning and memory, we illuminate how the use of analytics engines has transformed the nature of analysis and knowledge and, thus, the nature of the governing of economic, social and political life.

Acknowledgements

An early version of this paper was presented at the Calculative Devices in the Digital Age conference, Durham University, November 2013. Our thanks go to all participants at that event for the stimulating discussion that took place. The three anonymous reviewers for Economy & Society and Managing Editor Fran Tonkiss have been extraordinarily helpful and generous with their comments and insights; thank you. The research field-work from which the empirical elements of the paper are drawn, conducted during 2013, involved observations of data analytics industry and governmental events, and interviews with software engineers, analytics consultants and data scientists. We are grateful to everybody who has so generously given time to the project.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. In response to the UK government's announcement of the second phase of funding for ‘Big Data centres’, Chief Executive of the ESRC, Professor Paul Boyle, welcomed the ‘sheer volume of data that is now being created’, a ‘significant resource … that can shape our knowledge of society and help us prepare and evaluate better government policies in the future’ (ESRC, 2014).

2. It is not our purpose here to map a linear history of practices of data collection and analysis. Rather, we juxtapose two moments when a specific set of claims are made regarding the scale and scope of social data and its effects on the governing of societies.

3. Bergson's reflections on perception in science are present throughout his body of work. Of particular significance here is his insistence on the shared categories of thought and sensing across science and prosaic perception, so that ‘ordinary knowledge is forced, like scientific knowledge, to take things in a time broken up into particles, pulverized so to speak, where an instant which does not endure follows another without duration’ (1965, p. 120).

4. Indeed, by 1930, Bergson himself appreciated the growing capacity of ‘modern mathematics’ and physics to capture something of perpetual and indivisible change, to ‘follow the growth of magnitudes’ and to ‘seize movement’ from within (1965, p. 211).

5. The earliest use of the concept of ingestion for analysis of data in multiple formats can be found in papers from IBM's research on smart surveillance and web architecture (Chiao-Fe, 2005; Gruhl et al., 2004). The use of a vocabulary of ingestion coincides with an expansion of analysable samples of digital data, such that it is said that n=all, or the sample is equal to everything.

6. The concept of index is used here in the sense proposed by Deleuze and Guattari to denote the capacity to designate the state of things, territorially locatable in time and space (1987, p. 124). Understood thus, for example, extraction algorithms are required in order territorially to index unstructured objects, as in the use of biometric templates derived from Facebook. It is the extracted template that makes the object searchable in time and space.

7. The case is derived from field-work conducted in London in 2013. For further examples and detailed descriptions of text mining and sentiment analysis, see Bello et al. (2013); Zhao et al. (2013); and Anjaria and Guddeti (2014).

8. Hayles defines the concept of ‘technogenesis’ as the ‘idea that humans and technics have coevolved together’, such that our very capacity for thought and action is bound up with ‘epigenetic changes catalysed by exposure to and engagement with digital media’ (2012, pp. 10–12). The idea is present also in Walter Benjamin's famous essay on art in the age of mechanical reproduction, where he notes that ‘the mode of human sense perception changes with humanity's entire mode of existence’ (1999, p. 216).

9. Retrieved from http://www.theguardian.com/profile/laura-poitras. See also Harding (2014, pp. 110, 204).

10. Though the focus of this essay is not on the interface between data architectures and software, the flattening of differences at this interface is significant. See Galloway (2012); Berry (2011).

11. Despite substantial interest in the automated analysis of large data sets for security purposes in the wake of Edward Snowden's disclosures, the use of algorithmic techniques to analyse Passenger Name Record (PNR) and SWIFT financial data has been known and documented for some time (Amoore, 2013; de Goede, 2012).

12. Insights drawn from observations at TIBCO® Spotfire® event, London, 13 June 2013.

13. Insights drawn from observations at TIBCO® Spotfire® event, London, 13 June 2013, and SAS Analytics ‘How to’ workshops, 19 June 2013.

Additional information

Funding

This work was supported by the RCUK Global Uncertainties Fellowship Securing against Future Events: Pre-emption, Protocols and Publics [Grant number ES/K000276/1].

Notes on contributors

Louise Amoore

Louise Amoore is Professor of Political Geography in the Department of Geography, Durham University. She is RCUK Global Uncertainties Fellow (2012–2015) and author of The politics of possibility: Risk and security beyond probability (Duke University Press, 2013).

Volha Piotukh

Volha Piotukh is currently Postdoctoral Research Associate at Durham University, working on the ‘Securing Against Future Events’ project. She is the author of Biopolitics, governmentality and humanitarianism: ‘Caring’ for the population in Afghanistan and Belarus (Routledge, 2015).
