List of References
- ‘A Policy Framework for Responsible Limits on Facial Recognition Use Case: Law Enforcement Investigations’ <https://www3.weforum.org/docs/WEF_A_Policy_Framework_for_Responsible_Limits_on_Facial_Recognition_2021.pdf>
- Abdurrahim SH, Samad SA and Huddin AB, ‘Review on the Effects of Age, Gender, and Race Demographics on Automatic Face Recognition’ (2018) 34 The Visual Computer 1617
- ‘A-Levels and GCSEs: How Did the Exam Algorithm Work?’ BBC News (17 August 2020) <https://www.bbc.com/news/explainers-53807730> accessed 9 June 2023
- Amnesty International, ‘The Overrepresentation Problem: First Nations Kids Are 26 Times More Likely to Be Incarcerated than Their Classmates’ (Amnesty International Australia, 8 September 2022) <https://www.amnesty.org.au/overrepresentation-explainer-first-nations-kids-are-26-times-more-likely-to-be-incarcerated/> accessed 23 November 2022
- ARTICLE 19, ‘Emotion Recognition Technology Report’ (2021) <https://www.article19.org/emotion-recognition-technology-report/> accessed 22 November 2022
- Australian Law Reform Commission, ‘Over-Representation’ (ALRC, 2018) <https://www.alrc.gov.au/publication/pathways-to-justice-inquiry-into-the-incarceration-rate-of-aboriginal-and-torres-strait-islander-peoples-alrc-report-133/3-incidence/over-representation/> accessed 23 November 2022
- ‘Austria’s Employment Agency Rolls out Discriminatory Algorithm, Sees No Problem’ (AlgorithmWatch) <https://algorithmwatch.org/en/austrias-employment-agency-ams-rolls-out-discriminatory-algorithm/> accessed 12 June 2023
- Bacchini F and Lorusso L, ‘Race, Again: How Face Recognition Technology Reinforces Racial Discrimination’ (2019) 17 Journal of Information, Communication and Ethics in Society 321
- Balko R, ‘Opinion. There’s Overwhelming Evidence That the Criminal-Justice System Is Racist. Here’s the Proof.’ (Washington Post, 2020) <https://www.washingtonpost.com/news/opinions/wp/2018/09/18/theres-overwhelming-evidence-that-the-criminal-justice-system-is-racist-heres-the-proof/>
- Barrett L, ‘Ban Facial Recognition Technologies For Children - And For Everyone Else’ (2020) 26 Boston University Journal of Science and Technology Law 223
- Bradford B and others, ‘Live Facial Recognition: Trust and Legitimacy as Predictors of Public Support for Police Use of New Technology’ [2020] The British Journal of Criminology azaa032
- Brantingham PJ, Valasik M and Mohler GO, ‘Does Predictive Policing Lead to Biased Arrests? Results From a Randomized Controlled Trial’ (2018) 5 Statistics and Public Policy 1
- Buolamwini J and Gebru T, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’, Proceedings of Machine Learning Research (2018) <http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf> accessed 17 June 2020
- Charter of Fundamental Rights of the European Union [2012] OJ C326/391
- Christakis T and Lodie A, ‘The French Supreme Administrative Court Finds the Use of Facial Recognition by Law Enforcement Agencies to Support Criminal Investigations “Strictly Necessary” and Proportional’ (10 July 2022) <https://papers.ssrn.com/abstract=4321643> accessed 12 June 2023
- Conti J-R and others, ‘Mitigating Gender Bias in Face Recognition Using the von Mises-Fisher Mixture Model’, Proceedings of the 39th International Conference on Machine Learning (PMLR 2022) <https://proceedings.mlr.press/v162/conti22a.html> accessed 28 November 2022
- Byrne-Haber S, ‘Disability and AI Bias’ (Medium, 11 July 2019) <https://sheribyrnehaber.medium.com/disability-and-ai-bias-cced271bd533> accessed 10 November 2022
- Criado N and Such JM, ‘Digital Discrimination’ in Karen Yeung and Martin Lodge (eds), Algorithmic Regulation (Oxford University Press 2019)
- Dastin J, ‘Amazon Scraps Secret AI Recruiting Tool That Showed Bias against Women’ in Kirsten Martin (ed), Ethics of Data and Analytics: Concepts and Cases (Auerbach Publications 2022)
- Dodd V, ‘Met Police to Use Facial Recognition Software at Notting Hill Carnival’ The Guardian (5 August 2017) <https://www.theguardian.com/uk-news/2017/aug/05/met-police-facial-recognition-software-notting-hill-carnival> accessed 15 November 2022
- Dushi D, ‘The Use of Facial Recognition Technology in EU Law Enforcement: Fundamental Rights Implications’ [2020] Policy Briefs 2020 – South East Europe <https://repository.gchumanrights.org/server/api/core/bitstreams/51d86ab3-1cb5-45f6-b141-64c06dcef5d8/content>
- Edwards F, Lee H and Esposito M, ‘Risk of Being Killed by Police Use of Force in the United States by Age, Race–Ethnicity, and Sex’ (2019) 116 Proceedings of the National Academy of Sciences 16793
- FRA, Bias in Algorithms: Artificial Intelligence and Discrimination (Publications Office 2022) <https://data.europa.eu/doi/10.2811/25847> accessed 9 June 2023
- European Commission, ‘White Paper on Artificial Intelligence’ <https://ec.europa.eu/info/sites/default/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf>
- European Commission against Racism and Intolerance (ECRI), ‘ECRI General Policy Recommendation N° 11 On Combating Racism and Racial Discrimination in Policing’ <https://rm.coe.int/ecri-general-policy-recommendation-no-11-on-combating-racism-and-racia/16808b5adf>
- European Commission, Directorate-General for Justice and Consumers, and European Network of Legal Experts in Gender Equality and Non-Discrimination, ‘Algorithmic Discrimination in Europe: Challenges and Opportunities for Gender Equality and Non-Discrimination Law’ (Publications Office 2021)
- European Data Protection Board, ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ <https://edpb.europa.eu/system/files/2022-05/edpb-guidelines_202205_frtlawenforcement_en_1.pdf>
- ‘Facial Recognition Creates Risks for Trans Individuals, Others’ (GovTech, 16 July 2021) <https://www.govtech.com/products/facial-recognition-creates-risks-for-trans-individuals-others> accessed 10 November 2022
- ‘Facial Recognition Software Has a Gender Problem’ (CU Boulder Today, 8 October 2019) <https://www.colorado.edu/today/2019/10/08/facial-recognition-software-has-gender-problem> accessed 10 November 2022
- Fair Trials, ‘Disparities and Discrimination in the European Union’s Criminal Legal Systems’ (Fair Trials, 2021) <https://www.fairtrials.org/articles/publications/disparities-and-discrimination-in-the-european-unions-criminal-legal-systems/> accessed 23 November 2022
- ‘Fair Trials’ Tool Shows How “Predictive Policing” Discriminates and Unjustly Criminalises People’ (Fair Trials, 30 January 2023) <https://www.fairtrials.org/articles/news/fair-trials-tool-shows-how-predictive-policing-discriminates-and-unjustly-criminalises-people/> accessed 13 June 2023
- Ferguson AG, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (New York University Press 2020)
- Fletcher N, ‘Lincs Police to Trial New CCTV Tech That Can Tell If You’re in a Mood’ (LincolnshireLive, 17 August 2020) <https://www.lincolnshirelive.co.uk/news/local-news/new-lincolnshire-police-cctv-technology-4431274> accessed 21 November 2022
- FRA, ‘Under Watchful Eyes: Biometrics, EU IT Systems and Fundamental Rights’ <https://fra.europa.eu/sites/default/files/fra_uploads/fra-2018-biometrics-fundamental-rights-eu_en.pdf>
- FRA, ‘Data Quality and Artificial Intelligence – Mitigating Bias and Error to Protect Fundamental Rights’ (2019)
- FRA, ‘Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement’ <https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper-1_en.pdf>
- Fussey P and Murray D, ‘Independent Report on the London Metropolitan Police Service’s Trial of Live Facial Recognition Technology’ (University of Essex Human Rights Centre 2019) <http://repository.essex.ac.uk/24946/1/London-Met-Police-Trial-of-Facial-Recognition-Tech-Report-2.pdf>
- Garcia ACB, Garcia MGP and Rigobon R, ‘Algorithmic Discrimination in the Credit Domain: What Do We Know about It?’ [2023] AI & SOCIETY
- Garvie C, ‘The Untold Number of People Implicated in Crimes They Didn’t Commit Because of Face Recognition | News & Commentary’ (American Civil Liberties Union, 24 June 2020) <https://www.aclu.org/news/privacy-technology/the-untold-number-of-people-implicated-in-crimes-they-didnt-commit-because-of-face-recognition> accessed 28 November 2022
- Garvie C, Bedoya A and Frankle J, ‘Unregulated Police Face Recognition in America’ (The Perpetual Line-Up, Georgetown Law Center on Privacy & Technology, 2016) <https://www.perpetuallineup.org/> accessed 10 November 2022
- Grother P, Ngan M and Hanaoka K, ‘Face Recognition Vendor Test Part 3: Demographic Effects’ (National Institute of Standards and Technology 2019) NIST IR 8280 <https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf> accessed 28 October 2022
- Hourihan KL, Benjamin AS and Liu X, ‘A Cross-Race Effect in Metamemory: Predictions of Face Recognition Are More Accurate for Members of Our Own Race’ (2012) 1 Journal of Applied Research in Memory and Cognition 158
- Interview with [R1EN], ‘AI Practitioner and Academic from Australia’ (2 November 2021)
- Interview with [R2EN], ‘European NGO Activist’ (2 November 2021)
- Interview with [R4EN], ‘Academic from Australia’ (11 November 2021)
- Interview with [R6EN], ‘Academic from Germany’ (15 November 2021)
- Interview with [R8EN], ‘European NGO Activist’ (22 November 2021)
- James L, ‘The Stability of Implicit Racial Bias in Police Officers’ (2018) 21 Police Quarterly 30
- Knox D, Lowe W and Mummolo J, ‘Administrative Records Mask Racially Biased Policing’ (2020) 114 American Political Science Review 619
- Lee NT, Resnick P and Barton G, ‘Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms’ (Brookings, 22 May 2019) <https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/> accessed 30 November 2022
- Leslie D, ‘Understanding Bias in Facial Recognition Technologies’ (2020) <https://doi.org/10.5281/zenodo.4050457> accessed 11 February 2022
- Ho DE, Black E, Agrawala M and Li F-F, ‘How Regulators Can Get Facial Recognition Technology Right’ (Brookings, 17 November 2020) <https://www.brookings.edu/techstream/how-regulators-can-get-facial-recognition-technology-right/> accessed 28 October 2022
- Meissner CA and Brigham JC, ‘Thirty Years of Investigating the Own-Race Bias in Memory for Faces: A Meta-Analytic Review’ (2001) 7 Psychology, Public Policy, and Law 3
- ‘Method Predicts Bias in Face Recognition Models Using Unlabeled Data’ (Amazon Science, 8 November 2022) <https://www.amazon.science/blog/method-predicts-bias-in-face-recognition-models-using-unlabeled-data> accessed 28 November 2022
- Mitchell TM, ‘The Need for Biases in Learning Generalizations’ (Rutgers University 1980)
- Mobilio G, ‘Your Face Is Not New to Me – Regulating the Surveillance Power of Facial Recognition Technologies’ (2023) 12 Internet Policy Review
- Murphy K and others, ‘Police Bias, Social Identity, and Minority Groups: A Social Psychological Understanding of Cooperation with Police’ (2018) 35 Justice Quarterly 1105
- Nkonde M, ‘Automated Anti-Blackness: Facial Recognition in Brooklyn, New York’ (2019) 20 7
- O’Leary L, ‘How Facial Recognition Tech Made Its Way to the Battlefield in Ukraine’ (Slate.com, 2022) <https://slate.com/technology/2022/04/facial-recognition-ukraine-clearview-ai.html> accessed 20 May 2022
- O’Neil C, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (1st edition, Crown 2016)
- Perkowitz S, ‘The Bias in the Machine: Facial Recognition Technology and Racial Disparities’ [2021] MIT Case Studies in Social and Ethical Responsibilities of Computing
- Pierson E and others, ‘A Large-Scale Analysis of Racial Disparities in Police Stops across the United States’ (2020) 4 Nature Human Behaviour 736
- Raposo VL, ‘The Use of Facial Recognition Technology by Law Enforcement in Europe: A Non-Orwellian Draft Proposal’ [2022] European Journal on Criminal Policy and Research
- Smith M, Mann M and Urbas G, Biometrics, Crime and Security (Taylor & Francis 2018) <https://researchprofiles.canberra.edu.au/en/publications/biometrics-crime-and-security> accessed 17 July 2020
- Streicher B, ‘Tackling Racial Profiling: Reflections on Recent Case Law of the European Court of Human Rights’ (Strasbourg Observers, 16 December 2022) <https://strasbourgobservers.com/2022/12/16/tackling-racial-profiling-reflections-on-recent-case-law-of-the-european-court-of-human-rights/> accessed 19 September 2023
- ‘TELEFI Project “Summary Report”’ <https://www.telefi-project.eu/telefi-project/results>
- The American Law Institute, ‘Principles of the Law, Policing’ <https://www.policingprinciples.org/> accessed 14 June 2023
- Tolan S and others, ‘Why Machine Learning May Lead to Unfairness: Evidence from Risk Assessment for Juvenile Justice in Catalonia’, Proceedings of the Seventeenth International Conference on Artificial Intelligence and Law (ACM 2019) <https://dl.acm.org/doi/10.1145/3322640.3326705> accessed 20 January 2022
- Van Brakel R, ‘How to Watch the Watchers? Democratic Oversight of Algorithmic Police Surveillance in Belgium’ (2021) 19 Surveillance & Society 228
- Wachter S, ‘The Theory of Artificial Immutability: Protecting Algorithmic Groups Under Anti-Discrimination Law’ (2022) 97 Tulane Law Review 149
- Wachter S, Mittelstadt B and Russell C, ‘Bias Preservation in Machine Learning: The Legality of Fairness Metrics Under EU Non-Discrimination Law’ (2021) 123 West Virginia Law Review 735
- Wachter S, Mittelstadt B and Russell C, ‘Why Fairness Cannot Be Automated: Bridging the Gap between EU Non-Discrimination Law and AI’ (2021) 41 Computer Law & Security Review 105567
- Wang Y and Kosinski M, ‘Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images’ (2018) 114 Journal of Personality and Social Psychology 246
- Xenidis R and Senden L, ‘EU Non-Discrimination Law in the Era of Artificial Intelligence: Mapping the Challenges of Algorithmic Discrimination’, General Principles of EU Law and the EU Digital Order (Wolters Kluwer 2020)
- Zalnieriute M, ‘Burning Bridges: The Automated Facial Recognition Technology and Public Space Surveillance in the Modern State’ (2021) 22 Columbia Science and Technology Law Review 284
- DH and Others v the Czech Republic [2007] ECtHR [GC] 57325/00
- R (on the application of Bridges) v Chief Constable of South Wales [2019] EWHC 2341 (Admin); [2020] 1 WLR 672; [2019] 9 WLUK 9 (DC)
- Council Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin. OJ L 180, 19.7.2000, p. 22–26
- Council Directive 2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation. OJ L 303, 2.12.2000, p. 16–22
- Directive 2014/24/EU of the European Parliament and of the Council of 26 February 2014 on public procurement and repealing Directive 2004/18/EC. OJ L 94, 28.3.2014, p. 65–242
- Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA. OJ L 119, 4.5.2016, p. 89–131
- Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts. COM/2021/206 final
- Regulation (EU) 2017/2226 of the European Parliament and of the Council of 30 November 2017 establishing an Entry/Exit System (EES) to register entry and exit data and refusal of entry data of third-country nationals crossing the external borders of the Member States and determining the conditions for access to the EES for law enforcement purposes, and amending the Convention implementing the Schengen Agreement and Regulations (EC) No 767/2008 and (EU) No 1077/2011. OJ L 327, 9.12.2017