Case notes

Proving algorithmic discrimination in government decision-making

Jack Maxwell & Joe Tomlinson
Pages 352-360 | Received 05 May 2020, Accepted 08 Jul 2020, Published online: 21 Oct 2020

ABSTRACT

Public bodies in the United Kingdom are increasingly using algorithms and big data to make decisions. While there is growing awareness of the risks of algorithmic discrimination, it can be very difficult to establish that a specific algorithm is in fact discriminatory. This raises the question of how courts should allocate the burden of testing and proving whether a government algorithm is discriminatory, as between the government and the person affected. In R (Bridges) v South Wales Police [2020] EWCA Civ 1058, the England and Wales Court of Appeal found that public bodies must take positive steps to identify and address risks of algorithmic discrimination. This note explores the decision in Bridges and its implications for algorithmic decision-making in government. It suggests that Bridges, alongside recent decisions in Canada and the Netherlands, forms part of a broader trend: the courts are placing the burden of testing and reviewing potentially discriminatory algorithms on government, rather than the general public.

Notes

1 Joe Tomlinson, ‘Quick and Uneasy Justice: An Administrative Justice Analysis of the EU Settlement Scheme’ (2019) Public Law Project <publiclawproject.org.uk/wp-content/uploads/2019/07/Joe-Tomlinson-Quick-and-Uneasy-Justice-Full-Report-2019.pdf> accessed 4 September 2020.

2 Hannah Couchman, ‘Policing By Machine’ (January 2019) Liberty <www.libertyhumanrights.org.uk/wp-content/uploads/2020/02/LIB-11-Predictive-Policing-Report-WEB.pdf> accessed 4 September 2020; Marion Oswald and others, ‘Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and “Experimental” Proportionality’ (2018) 27 Information & Communications Technology Law 223; Silkie Carlo, Jennifer Krueckeberg, and Griff Ferris, ‘Face Off: The Lawless Growth of Facial Recognition in UK Policing’ (May 2018) Big Brother Watch <bigbrotherwatch.org.uk/wp-content/uploads/2018/05/Face-Off-final-digital-1.pdf> accessed 4 September 2020.

3 Joanna Redden, Lina Dencik, and Harry Warne, ‘Datafied Child Welfare Services: Unpacking Politics, Economics and Power’ (2020) 41 Policy Studies 507; Sarah Marsh, ‘One in Three Councils Using Algorithms to Make Welfare Decisions’ The Guardian (15 October 2019) <www.theguardian.com/society/2019/oct/15/councils-using-algorithms-make-welfare-decisions-benefits> accessed 4 September 2020.

4 See generally Michael Rovatsos, Brent Mittelstadt, and Ansgar Koene, ‘Landscape Summary: Bias in Algorithmic Decision-Making’ (19 July 2019) Centre for Data Ethics and Innovation <assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/819055/Landscape_Summary_-_Bias_in_Algorithmic_Decision-Making.pdf> accessed 4 September 2020. The literature tends to use the terms ‘algorithmic discrimination’ and ‘algorithmic bias’ interchangeably. This note generally opts for the former, as it reflects the language of the Equality Act 2010 and the public sector equality duty (s 149(1)).

5 See Solon Barocas and Andrew D Selbst, ‘Big Data’s Disparate Impact’ (2016) 104 California Law Review 671, 677–93.

6 See, eg, Michael Shiner and others, ‘The Colour of Injustice: “Race”, Drugs and Law Enforcement in England and Wales’ (2018) Stopwatch and Release <eprints.lse.ac.uk/100751/1/TheColourOfInjustice.pdf> accessed 4 September 2020.

7 See Jenna Burrell, ‘How the Machine “thinks”: Understanding Opacity in Machine Learning Algorithms’ (2016) 3 Big Data & Society 1; Lilian Edwards and Michael Veale, ‘Slave to the Algorithm? Why a “right to an explanation” is Probably not the Remedy You Are Looking For’ (2017) 16 Duke Law & Technology Review 18.

8 [2020] EWCA Civ 1058 (England and Wales Court of Appeal (EWCA)) (Bridges (EWCA)).

9 R (Bridges) v Chief Constable of South Wales [2019] EWHC 2341 (Admin) (England and Wales High Court (EWHC)) [153]–[158] (Haddon-Cave LJ and Swift J) (Bridges (EWHC)).

10 ibid [153].

11 ibid [156]–[157].

12 Bridges (EWCA) (n 8) [163]–[202] (Sir Terence Etherton MR, Sharp P, and Singh LJ).

13 The Court also found that the police’s use of AFR technology was unlawful on two other bases: it was not in accordance with law for the purposes of art 8(2) of the European Convention on Human Rights, and consequently the police’s data protection impact assessment for the technology did not comply with the Data Protection Act 2018. See ibid [54]–[130], [145]–[154].

14 Bridges (EWCA) (n 8) [182].

15 ibid.

16 Karon Monaghan QC, Equality Law (2nd edn, Oxford University Press 2013) [16.06], quoted in ibid [178].

17 Bridges (EWCA) (n 8) [185].

18 ibid.

19 Bridges (EWHC) (n 9) [154].

20 ibid.

21 Bridges (EWCA) (n 8) [186]–[190]. The police found a higher rate of false positive alerts for women than for men (82 per cent compared to 66 per cent), but concluded that this was due to two women on the police’s watchlist with ‘generic features that may match much more frequently.’ In relation to race, the police found that 98.5 per cent of false positive alerts were for ‘white north European’, and concluded that there was no racial bias.
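To illustrate the kind of per-group analysis the police undertook here, the following minimal Python sketch computes false positive rates by demographic group. The records and group labels are invented for illustration; the judgment reports only the aggregate rates quoted above, not the underlying data or the police's methodology.

    from collections import Counter

    # Hypothetical alert log: (demographic group, whether the match alert
    # later proved to be a false positive). Invented for illustration only.
    alerts = [
        ('female', True), ('female', True), ('female', False),
        ('male', True), ('male', False), ('male', False),
    ]

    totals = Counter(group for group, _ in alerts)
    false_positives = Counter(group for group, is_fp in alerts if is_fp)

    for group in sorted(totals):
        rate = false_positives[group] / totals[group]
        print(f'{group}: {rate:.0%} of alerts were false positives')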

22 ibid [191].

23 ibid.

24 ibid [193], [197]–[198].

25 ibid [196].

26 ibid [199].

27 ibid.

28 Elias v Secretary of State for Defence [2006] EWCA Civ 1293 (EWCA) [274] (Arden LJ), quoted in ibid [180].

29 Secretary of State for Education and Science v Tameside Metropolitan Borough Council [1977] AC 1014 (House of Lords (HL)) 1065 (Lord Diplock) (Tameside). See, eg, R (on the application of Hurley and Moore) v Secretary of State for Business Innovation & Skills [2012] EWHC 201 (Admin) (EWHC) [89]–[90] (Elias LJ).

30 R (on the application of Khatun) v London Borough of Newham [2004] EWCA Civ 55 (EWCA) [35] (Laws LJ, Wilson J and Auld LJ agreeing); R (on the application of Plantagenet Alliance) v Secretary of State for Justice [2014] EWHC 1662 (Admin) (EWHC) [100] (Hallett LJ, Ouseley and Haddon-Cave JJ).

31 Bridges (EWCA) (n 8) [201].

32 Committee on Standards in Public Life, Artificial Intelligence and Public Standards: A Review by the Committee on Standards in Public Life (February 2020) 49 <assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/868284/Web_Version_AI_and_Public_Standards.PDF> accessed 4 September 2020.

33 Ewert v Canada [2018] SCC 30 (Supreme Court of Canada).

34 ibid [3]–[18] (Wagner J).

35 ibid [13]–[14] (Wagner J).

36 ibid [28]–[67] (Wagner J).

37 ibid [47] (Wagner J).

38 ibid [49] (Wagner J).

39 ibid [50] (Wagner J).

40 ibid [51]–[66] (Wagner J).

41 NJCM v De Staat der Nederlanden (SyRI) C-09-550982-HA ZA 18-388 (Hague District Court, 5 February 2020) (English translation) (NJCM).

42 ibid [6.59], [6.60]–[6.65].

43 ibid [6.80]–[6.106].

44 Philip Alston, ‘Brief by the United Nations Special Rapporteur on extreme poverty and human rights as Amicus Curiae in the case of NJCM c.s./De Staat der Nederlanden (SyRI) before the District Court of The Hague (case number: C/09/550982/HA ZA 18/388)’ (United Nations Special Rapporteur on extreme poverty and human rights, 26 September 2019) <www.ohchr.org/Documents/Issues/Poverty/Amicusfinalversionsigned.pdf> accessed 25 March 2020.

45 NJCM (n 41) [6.91]–[6.95].

46 ibid [6.93]–[6.95].

47 Bridges (EWCA) (n 8) [193], [197], [199]. AFR systems are notoriously problematic in their treatment of women and people of colour. See, eg, Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018) 81 Proceedings of Machine Learning Research 1, 1–15; Patrick Grother, Mei Ngan, and Kayee Hanaoka, ‘Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects’ (2019) National Institute of Standards and Technology <nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf> accessed 7 September 2020.

48 See Joe Tomlinson, Katy Sheridan, and Adam Harkens, ‘Judicial Review Evidence in the Era of the Digital State’ (2020) Public Law (forthcoming).

49 See, eg, Tameside (n 29). See, in relation to algorithmic decision-making, Christopher Knight, ‘Automated Decision-making and Judicial Review’ (2020) Judicial Review (forthcoming).

50 See, eg, R v Secretary of State for the Home Department, Ex p Doody [1994] 1 AC 531, 565 (Lord Mustill) (HL); R (on the application of Lumba) v Secretary of State for the Home Department [2011] UKSC 12 (United Kingdom Supreme Court (UKSC)) [20]–[39] (Lord Phillips); R (on the application of Hoareau) v Secretary of State for Foreign and Commonwealth Affairs [2018] EWHC 1508 (Admin) (EWHC) [8]–[24] (Singh LJ). See, in relation to algorithmic decision-making, R (on the application of Ames) v Lord Chancellor [2018] EWHC 2250 (Admin) (EWHC).

51 See, in relation to the burden of proof under the Human Rights Act 1998, Aguilar Quila v Secretary of State for Home Department [2011] UKSC 45 (UKSC) [44] (Lord Wilson) [61] (Lady Hale); R (on the application of Kiarie and Byndloss) v Secretary of State for the Home Department [2017] UKSC 42 (UKSC) [78] (Lord Wilson, Lady Hale, Lord Hodge and Lord Toulson agreeing).

Additional information

Notes on contributors

Jack Maxwell

Jack Maxwell is Research Fellow in Public Law and Technology at the Public Law Project.

Joe Tomlinson

Joe Tomlinson is Senior Lecturer in Public Law at the University of York and Research Director at the Public Law Project.

