Transparent to whom? No algorithmic accountability without a critical audience

Jakko Kemper & Daan Kolkman

Pages 2081-2096 | Received 04 Oct 2017, Accepted 11 May 2018, Published online: 18 Jun 2018
ABSTRACT

Big data and data science are transforming organizational decision-making. We increasingly defer decisions to algorithms because machines have earned a reputation for outperforming us. As algorithms become embedded within organizations, they become more influential and increasingly opaque. Those who create algorithms may make arbitrary decisions at every stage of the ‘data value chain’, yet these subjectivities are obscured from view. Algorithms come to reflect the biases of their creators, can reinforce established ways of thinking, and may favour some political orientations over others. This is a cause for concern and calls for more transparency in the development, implementation, and use of algorithms in public- and private-sector organizations. We argue that one elementary – yet key – question remains largely undiscussed. If transparency is a primary concern, then to whom should algorithms be transparent? We consider algorithms as socio-technical assemblages and conclude that without a critical audience, algorithms cannot be held accountable.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes on contributors

Jakko Kemper is a PhD candidate at the Amsterdam School for Cultural Analysis (University of Amsterdam). His research focuses on digital culture, the aesthetics of imperfection, and media theory [email: [email protected]].

Daan Kolkman is a research fellow in decision-making at the Jheronimus Academy of Data Science. His research revolves around the sociology of quantification, intelligent decision support systems (e.g. machine learning, artificial intelligence), and organizational decision-making. Daan received his PhD in sociology from the University of Surrey (England) for his work on computational models in government. He was supervised by Nigel Gilbert, Tina Balke, and Paolo Campo at the Centre for Research in Social Simulation. Daan develops data-intensive products and services for – and in collaboration with – retailers and SMEs [email: [email protected]].

Notes

1 While transparency marks the focus of this paper, the other FACT principles are of course implicated in the valuation of transparency. A brief outline: Fairness – how to avoid unfair conclusions, even if they are true? Accuracy – how to answer questions with a guaranteed level of accuracy? Confidentiality – how to answer questions without revealing secrets?

2 In discussing the characteristics of the socio-technical assemblages surrounding algorithms, which can impede critical inspection altogether or prevent the development of a critical audience, we build on previous work on applications of algorithms in practice. We also draw on fieldwork that has been presented extensively elsewhere (Author, forthcoming). We studied 8 cases of algorithmic model use in government and the analytics industry over a period of 2.5 years. Data were collected in the form of interviews, participant observations, and (digital) document analysis.

3 Pensim2 has been reviewed on several occasions with varying degrees of formality. Examples include the review by the US congressional office mentioned in the text and a review by the UK Institute for Fiscal Studies (Emmerson, Reed, & Shephard, 2004).