
Fundamental rights control when implementing predictive policing – a European perspective

ABSTRACT

The paper approaches preventive justice and big data from a predictive policing point of view. Predictive policing is a controversial technology because of the many risks it poses to fundamental rights. Through use cases from the EU, the paper focuses on how the implementation of predictive policing technologies has been limited in the EU, but also on how difficult the limitations might be due to the high hopes set upon technology. Although predictive policing technologies are not yet used by the police in Finland, the paper discusses how the use of police powers is limited in Finland and how fundamental rights control in the democratic process would work if the police were willing to adopt such technology.

I. Introduction: from preventive justice to algorithmic prediction

Preventive justice is an ambiguous concept. It can be connected not only to criminal justice but also to other fields of law, such as environmental regulation or taxation. As Zedner and Ashworth point out, preventive justice is not its own field of law like criminal justice: ‘Unlike criminal justice or civil justice, of course, preventive justice does not refer to an established system or even an acknowledged, coherent domain of legal enterprise’.Footnote1 Preventive justice in the criminal law context is connected to the concept of the preventive turn in criminal law. Briefly, the preventive turn in criminal law refers to using criminal law as a means to prevent crimes and as a form of controlling future events.Footnote2 The development, which started in the twentieth century, emphasises preventing crime before it happens by using proactive methods instead of merely reacting to crime.Footnote3 For instance, this can be done with preventive criminalisation or preventive sentencing, discussed elsewhere in this publication.

The approach to preventive justice in this paper is a bit different. It does not discuss using criminal law as a means to prevent future acts, but rather the increasing use of modern digital technologies as a tool to predict crime. The datafication of society has given authorities new opportunities to collect data and use it to maintain surveillance over large masses of people in a way that would be impossible to do manually. Additionally, the development of ‘artificial intelligence’ (AI) technologies, such as machine learning, has enabled the authorities to obtain analyses of large data sets whose creation might go beyond human understanding. Thus, it can be stated that modern digital technology, and the possibilities seen in it, feeds the ethos of preventing crime before it happens.Footnote4

There are as many ways to use modern digital technologies for crime prevention purposes as there are imaginative minds. In this paper, the focus has been narrowed down to the predictive technologies used by a particular public authority, the police. In addition to policing, such technologies can be used by the prosecution (predictive prosecution)Footnote5 and by the courts (predictive sentencing)Footnote6 to predict future criminality.Footnote7 Predictive technologies used by the police are generally referred to as predictive policing.Footnote8 One definition of predictive policing is ‘the application of analytical techniques – particularly quantitative techniques – to identify likely targets for police intervention and prevent crime or solve past crimes by making statistical predictions’.Footnote9 This paper is limited to proactive policing, which differs from the reactive use of predictive policing technologies: it is one thing for the police to act before a crime has been committed and another for the police to try to solve a crime that has already happened. Also, a distinction can be made between place-oriented and person-oriented predictive policing. The former refers to predicting the time and place a crime is most likely to occur, and the latter to predicting which people will commit a crime in the future.Footnote10

It needs to be emphasised that predictive policing is not criminal intelligence targeted at a certain person. It refers to digital tools used in police officers’ daily work to maintain surveillance over ordinary citizens, usually aiming to prevent violent crime but also petty crimes such as pickpocketing.Footnote11 What distinguishes predictive policing from the police’s previous proactive methods is that it is based on big data and powerful algorithms such as machine learning.

In Europe, many countries’ police forces are adopting predictive policing.Footnote12 In addition to the member states, the EU itself is doing the same. These big data technologies are seen as the panacea for preventing especially terrorism and serious crime.Footnote13 However, a range of supervisory authorities and courts will later control the implementation. The latest case from a member state occurred in February 2023 in Germany, when the Constitutional Court declared predictive policing unconstitutional.Footnote14

In Finland, intelligence-led policing is one of the cornerstones of police work.Footnote15 However, the National Police Board, the central administrative authority of the Finnish Police, has not announced that big data or ‘AI’ technologies such as predictive policing are used in Finland.Footnote16 As Finland is a member of the European Union, the role of the union is essential when discussing big data policing in Finland. For instance, the EU is aiming to regulate the use of AI on a union level, including predictive policing.Footnote17

Predictive policing is a controversial technology because of the many risks it poses to fundamental rights. Since these technologies are used in the exercise of public power, and furthermore within the state’s monopoly on the legitimate use of physical force,Footnote18 their implementation should be put under especially detailed scrutiny, specifically to set strict limits on the use of power by the state, which goes hand in hand with controlling the risks of breaching fundamental rights. Through use cases, the discussion in this paper is on how the implementation of predictive policing technologies has been limited in the EU, but also on how difficult the limitations might be due to the high hopes set upon technology. Although AI technologies are not yet used by the police in Finland, the paper paints a picture of how the democratic process would work in Finland and how fundamental rights control would be carried out.

The paper is structured as follows. In Section II, there is a short overview of the technological side of predictive policing and the fundamental rights risks related to it. It needs to be emphasised that since the paper does not focus on particular predictive policing systems, these are all generalisations related more broadly to the risks of big data and machine learning technologies. The particular functioning of each predictive policing system and the risks it poses to fundamental rights have to be assessed case by case. However, this overview will hopefully provide the reader with an understanding of why implementing these technologies in a democratic society that respects fundamental rights is especially controversial and why they should be scrutinised so strictly.

In Section III, the paper introduces two cases from the EU in which the union has adopted legislation that enables the use of predictive policing-type technologies. Both cases address fundamental rights worries relating to personal data collection and analysis for crime prevention in the specific case. What is of interest here is that these two cases have been the target of ex-post fundamental rights control. Before concluding, Section IV discusses how fundamental rights control would function in Finland if the police were willing to adopt such a technology.

II. Generalisations on predictive policing technologies and their fundamental rights risks

A. Generalising predictive policing technology

It is clear that prediction is not a new method in policing. However, instead of being made on the basis of the training, experience and intuition of police officers, in predictive policing the predictions are made by computer software.Footnote19 When computer software replaces the human in this way, legal scholars have been interested in how the new technology actually functions. Many contributions on predictive policing contain an analysis of the technological side of predictive policing. This is essential because, when making a legal assessment of the risks to fundamental rights, it is necessary to know what the phenomenon under review actually is. However, as stated before, the description of the technology is a theoretical generalisation, because each system has its own properties regarding the data it uses, the algorithms it contains, and so on. The two generalisations discussed here are that predictive policing is based on big data and that the data are processed by machine learning algorithms or ‘AI’.

The first theoretical premise is that predictive policing is based on big data.Footnote20 Big data refers to masses of data, including both personal and other data, in amounts so huge that a human could not process them, much less draw conclusions from them. As Chan and Bennett Moses define it, big data can be characterised through the three Vs: ‘Volume (the amount of data), Velocity (the speed at which data is being added and processed) and Variety (the fact that data may come from multiple sources using different formats and structures)’.Footnote21 Usually the data include historical crime data but also data from private companies or other public registries.

The data analysed by predictive policing software are only half of the story. The second theoretical premise is linked to the algorithms that process big data. Simply put, an algorithm is a set of rules or instructions that tells a computer how to function. Not all algorithms are the same. Some of them are rule-based, basically giving the computer an instruction of the form ‘when X, then Y’. However, in predictive policing, the algorithms do not function like this. These algorithms use machine learning, which is a sub-category of artificial intelligence.Footnote22 In machine learning, the algorithm learns from its own experience and evolves over time without human programming. The algorithm is first trained with a set of training data from which it learns to recognise patterns and to create rules based on them.Footnote23 After training, it is moved to its true working environment, in which the role of the training data is reduced and the role of the data provided in this environment increases.Footnote24 It develops continuously, recognises new patterns, and creates new rules that replace the old ones.Footnote25
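The distinction drawn above between a fixed ‘when X, then Y’ rule and a rule derived from training data can be illustrated with a minimal sketch. The data, the threshold logic and the decision task below are entirely hypothetical and vastly simpler than any real predictive policing system:

```python
# Illustrative sketch only: contrasts a fixed rule-based check with a
# minimal "learned" rule. All data and thresholds are hypothetical.

def rule_based_flag(incidents_last_month: int) -> bool:
    # 'When X, then Y': a human-written, fixed instruction.
    return incidents_last_month > 10

def learn_threshold(training_data: list[tuple[int, bool]]) -> float:
    # A toy stand-in for machine learning: the decision rule is derived
    # from labelled examples instead of being hard-coded. Real systems
    # use far more complex models (e.g. decision trees, neural networks).
    flagged = [x for x, label in training_data if label]
    unflagged = [x for x, label in training_data if not label]
    return (min(flagged) + max(unflagged)) / 2

training = [(2, False), (4, False), (12, True), (15, True)]
threshold = learn_threshold(training)  # rule derived from data, not written by hand

def learned_flag(incidents_last_month: int) -> bool:
    return incidents_last_month > threshold
```

The key point for the legal analysis is that in the second case no human wrote the decision rule itself; feeding in different training data silently produces a different rule, which is exactly the property that makes such systems hard to review.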

It is essential to understand the role that the input data play in machine learning. In the case of predictive policing, the use of crime data in the algorithm means that the police officers who log incidents in the police systems are also producing information for the algorithm.Footnote26 This problem, characterised as a self-fulfilling prophecy, is one of the problems related to predictive policing technologies discussed next.
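The feedback loop described above can be illustrated with a toy simulation: two districts with identical underlying crime rates, where patrols are allocated according to recorded incidents and patrolling in turn generates new records. All figures are invented for illustration:

```python
# Toy simulation of the "self-fulfilling prophecy": district_a starts
# with more recorded incidents, so it attracts more patrols, which in
# turn produce more records there, even though the underlying (true)
# crime rates of the two districts are identical. All numbers are
# hypothetical.

recorded = {"district_a": 10, "district_b": 5}  # initial police records
TRUE_RATE = 20  # identical underlying crime in both districts

for _ in range(5):
    # Patrols are allocated proportionally to recorded incidents ...
    total = sum(recorded.values())
    for district in recorded:
        patrol_share = recorded[district] / total
        # ... and more patrolling means more incidents get recorded,
        # regardless of the equal true crime rates.
        recorded[district] += int(TRUE_RATE * patrol_share)

# recorded["district_a"] now far exceeds recorded["district_b"],
# although the underlying crime never differed between the districts.
```

The simulation shows why deleting overtly discriminatory variables does not cure the problem: the bias lives in how the input data were generated, not in any single field.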

B. Generalising the fundamental rights risks of predictive policing

Legal scholars and civil rights organisations have recognised several risks for individuals who are targeted by predictive policing software. Many of the problems are linked to each other. Here it is only possible to provide an overview of these problems, which are complex and each worth a contribution of their own.

The problems already referred to concern both the data used in predictive policing and the nature of machine learning. Firstly, as captured by the phrase ‘garbage in, garbage out’, the quality of a prediction is only as good as the data used as input.Footnote27 The way data are collected or inserted into the algorithm can have a major impact on the prediction. Also, although some data used in a predictive policing algorithm may be empirically accurate, their use might be unethical. For instance, using the empirical fact that certain parts of the population are more involved in crime would lead to the algorithm classifying people belonging to these groups as more likely to be potential offenders.Footnote28 This again relates to the problems of self-fulfilling prophecy and accuracy, but it is also a question of discrimination.

It is essential to remember that predictive policing only provides statistical predictions created by complicated algorithms. It does not state what will happen in the future. There is a danger that a predictive policing system merely repeats society’s hidden structural biases, a problem that cannot be solved merely by deleting discriminatory elements from the data.Footnote29 In addition to the data, the functioning of the algorithm is not neutral, because the programmer influences the system variables and what data the system uses.Footnote30 Hence, neither the data nor the algorithms are neutral, and they never can be. They will always reflect either society’s discriminatory imbalances or the programmer’s worldview.

Another problem linked to machine learning is the opacity of the functioning of the algorithm.Footnote31 This can be approached from three points of view. Firstly, if the predictive policing software is developed by a private company, it is likely that the software and its functions will be trade secrets.Footnote32 Secondly, the functioning of a machine learning algorithm is a sort of black box.Footnote33 Due to its self-learning property, there is a risk that even the developer of the system will be unable to explain how the software ended up with a certain result. Lastly, even if all the information about the technological properties of the predictive policing software were public, it is unlikely that an ordinary citizen could understand it, considering that explaining the workings of a machine learning system is difficult even for a professional.Footnote34

Opacity is problematic because it reduces the legitimacy of these systems and trust in the authorities. Additionally, understanding the workings of the predictive policing software would be essential for the police officers acting on it.Footnote35 This is important because there is a risk of automation bias in predictive policing as well. Automation bias refers to the risk that humans blindly trust technology and limit their own judgment even when there is reason to suspect that the algorithm is not working correctly.Footnote36 There is also a risk that a police officer would not consider arguments contrary to the prediction but would instead feel the need to act according to the prediction provided by the algorithm. This is so even though the known problems of predictive technologies should urge the police to be highly critical when using these systems.

Lastly, the issues of privacy, data protection and the fear of surveillance are seen as among the major problems regarding predictive policing. Many contributions concerning predictive policing start with a reference to the movie Minority Report (2002) by Steven Spielberg or the book 1984 (1949) by George Orwell. The concerns are warranted, since examples from outside Europe show that predictive policing can be used for mass surveillance and the deprivation of the rights of parts of the population. The examples from the EU presented in the next section are strongly connected to privacy and data protection risks as well as to opposition to mass surveillance in Europe. In the EU, as well as in Finland, the rights to privacy and data protection are protected as fundamental rights. Especially in Europe, after the GDPRFootnote37 and the LEDFootnote38 came into force, data protection is usually the first question raised in the context of predictive policing, since these systems are usually based on the collection and processing of personal data.

III. Examples of limiting predictive policing in the EU

A. Limiting the use of passenger name record data and analytics

Air travel is highly regulated, especially after the terrorist attacks using aircraft in the US in September 2001. Modern digital technologies and the digital form of data enable law enforcement authorities to pre-screen large numbers of passengers in the name of crime prevention. Finding potential terrorists is one goal of the EU’s Passenger Name Record (PNR) directive 2016/681.Footnote39 The directive was implemented in Finland as the Act on the use of passenger name record data for combatting terrorist offences and serious crime (657/2019).

The core idea of the PNR directive is that airline companies transfer personal data provided by passengers to the EU member states’ law enforcement authorities for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime. PNR data includes information such as name, date(s) of intended travel, all forms of payment information, frequent flyer information, all baggage information, seat number and other seat information. PNR data is used for carrying out an assessment of passengers prior to their scheduled arrival in or departure from the Member State in order to identify people who require further examination by the competent authorities. The assessment includes comparing PNR data against relevant databases and also assessing passengers against the risk criteria and profiles that have been developed.Footnote40 If a passenger fits a risk profile or criterion inserted in the system, they are flagged as a positive match. For example, a person travelling without luggage, or buying their ticket at the last minute and paying in cash, may be treated as exhibiting ‘deviant behaviour’.Footnote41 Any positive match resulting from the automated processing of PNR data is individually reviewed by non-automated means to verify whether the competent authority needs to take action under national law.
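The screening logic described above, matching passengers against fixed, human-written risk criteria, can be sketched as follows. The field names and rules are hypothetical illustrations invented for this sketch, not the directive’s actual criteria:

```python
# Hypothetical sketch of screening PNR data against pre-determined
# criteria. Field names and rules are invented for illustration;
# real systems and criteria are far more complex.

PRE_DETERMINED_CRITERIA = [
    # Each rule is fixed and human-written, so it can be inspected,
    # explained and reviewed; nothing here changes without human
    # intervention.
    ("no checked baggage", lambda p: p["baggage_count"] == 0),
    ("last-minute cash ticket", lambda p: p["payment_method"] == "cash"
     and p["booked_days_before"] <= 1),
]

def screen_passenger(passenger: dict) -> list[str]:
    """Return the names of matched criteria. A non-empty result is a
    positive match that must still be reviewed individually by
    non-automated means before any action is taken."""
    return [name for name, rule in PRE_DETERMINED_CRITERIA if rule(passenger)]

matches = screen_passenger({
    "baggage_count": 0,
    "payment_method": "card",
    "booked_days_before": 30,
})  # matches == ["no checked baggage"]: flag for human review
```

Because every criterion in such a list is static and human-readable, the system remains reviewable, which is the property the CJEU later relied on when it read ‘pre-determined criteria’ as precluding self-learning systems.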

The PNR directive was under the scrutiny of the Court of Justice of the European Union (CJEU) in its recent ruling Ligue des droits humains (C-817/19), handed down on 21 June 2022. The case was brought before the Belgian national court by a fundamental rights organisation and sought the annulment of the national law implementing the directive. The national court referred the case to the CJEU, which, according to Art. 267 of the Treaty on the Functioning of the European Union (TFEU), has the competence to give preliminary rulings on the interpretation or validity of provisions of EU law. Several questions were referred for a preliminary ruling; only the parts of the case relevant for the purposes of this paper are discussed here. These are, firstly, what the Court stated about the relationship between the PNR directive and the fundamental rights to data protection and privacy protected by the Charter of Fundamental Rights of the European Union,Footnote42 and secondly, what it stated about processing PNR data against pre-determined criteria.

To start with the conclusion, the CJEU did not declare the directive invalid but limited its application. The CJEU stated that

the PNR Directive entails undeniably serious interferences with the rights guaranteed in Articles 7 and 8 of the Charter, in so far, inter alia, as it seeks to introduce a surveillance regime that is continuous, untargeted and systematic, including the automated assessment of the personal data of everyone using air transport services.Footnote43

It limited the use of PNR data to what is strictly necessary for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime.Footnote44 Firstly, the member states must ensure ‘that the application of the system established by the PNR Directive is effectively limited to combating serious crime and that that system does not extend to offences that amount to ordinary crime’.Footnote45 Secondly, the wording of the directive allows member states to choose whether they apply PNR analysis only to extra-EU flights or also to intra-EU flights.Footnote46 This option was also limited by the court. The application of the system

must be limited to the transfer and processing of the PNR data of flights relating, inter alia, to certain routes or travel patterns or to certain airports in respect of which there are indications that are such as to justify that application.Footnote47

The court also limited the type of technology that can be used when it discussed processing PNR data against pre-determined criteria. Firstly, the starting point of the court was that, despite the fact that the system also produces false positives, ‘automated processing carried out under the said directive have indeed already made it possible to identify air passengers presenting a risk in the context of the fight against terrorist offences and serious crime’.Footnote48 However, according to the court, the directive’s wording ‘pre-determined criteria’ limits the type of algorithms that may produce matches on the basis of PNR data:

As noted by the Advocate General in point 228 of his Opinion, that requirement precludes the use of artificial intelligence technology in self-learning systems (‘machine learning’), capable of modifying without human intervention or review the assessment process and, in particular, the assessment criteria on which the result of the application of that process is based as well as the weighting of those criteria.Footnote49

These quotations from the CJEU are only a very short overview of the case. What is essential here is that after the judgement, PNR analysis cannot be developed into a predictive policing type of tool because of the technological limitations the court set. Also, as can be seen from the judgment, although the PNR directive is seen as being necessary, strict limits were set on its application.

B. Europol’s big data challengeFootnote50

Europol is the European Union Agency for Law Enforcement Cooperation, whose purpose is to counter serious crime and terrorism. Since its establishment in 1995,Footnote51 the objective of the agency has been to improve cooperation between the EU member states, and its tasks have centred on facilitating information exchange and analysing information.Footnote52 Hence, the purpose of Europol has always been strongly connected to working with data.

Although the Convention of 1995 had provisions on computerised systems of information, the technological reality, especially when it comes to opportunities for data collection, was different. In the 2010s, the aim in the EU was that Europol would become ‘a hub for information exchange between the law enforcement authorities of the Member States, a service provider and a platform for law enforcement services’.Footnote53 Now, the foundation of Europol is enacted in Art. 88 of the Treaty on the Functioning of the European Union. According to Art. 88(1), Europol’s task is to support and strengthen action by the Member States’ police authorities and other law enforcement services and their mutual cooperation in preventing and combating serious crime affecting two or more Member States, terrorism and forms of crime which affect a common interest covered by a Union policy.

According to the TFEU 88(2)(a), one of the tasks of Europol is the collection, storage, processing, analysis and exchange of information, in particular that forwarded by the authorities of the Member States or third countries or bodies. More specific regulation on Europol’s powers is enacted in Europol regulation (EU) 2016/794.Footnote54 The regulation contains provisions on Europol’s rights to process personal data.Footnote55

The right to data protection is a fundamental right recognised in Art. 8 of the Charter of Fundamental Rights of the European Union.Footnote56 The European Data Protection Supervisor (EDPS) is the data protection authority for the European Union institutions, bodies and agencies, and thus it also monitors Europol’s data processing activities (Europol Regulation, Art. 43). In 2019, the EDPS opened its ‘own initiative inquiry on the use of Big data analytics by Europol for purposes of strategic and operational analysis’.Footnote57 The problematic practice was that Europol received large datasets from the member states and other operational partners, and also collected data through its open-source intelligence. The EDPS characterised these data as large datasets because, owing to their volume and the nature or format of the data, they could not be processed with ‘regular tools, but require the use of specific tools and/or storage facilities’.Footnote58

Annex II B (1) of the Europol regulation limits the categories of data subjects whose data may be collected and processed by Europol. These categories can only be (a) people suspected of a crime, (b) persons who, based on factual indications or reasonable grounds, may commit a serious crime or terrorist act in the future, (c) witnesses, (d) victims, (e) contacts and associates, and (f) persons who can provide information on the criminal offences under consideration. However, as the EDPS noted, due to the volume of information it is impossible for Europol to ascertain that all the personal data included fall within the limits of the aforementioned categories.Footnote59

In 2020, the conclusion of the inquiry was clear: Europol had overstepped the mandate given to it by the provisions concerning data processing in the Europol regulation. As the EDPS describes in his decision, over the years Europol’s operational practices had evolved towards gathering larger and larger volumes of data.Footnote60 In their current form, there was a high risk that Europol was storing personal data not linked to criminal activity. This could cause damage to these people’s fundamental rights, for instance the freedom of movement.Footnote61

Although according to Art. 43(3)(e) of the Europol regulation the EDPS has the power to order Europol to carry out the rectification, restriction, erasure or destruction of personal data, and according to Art. 43(3)(f) to impose a temporary or definitive ban on processing operations by Europol, the EDPS chose not to do so. Instead, the inquiry led to the EDPS admonishing Europol and requiring it to draw up an action plan to mitigate the data protection issues. Europol responded to the admonishment cooperatively, but the agency also called for a revision of the Europol regulation.Footnote62 In December 2020, the Commission presented a proposal to review and expand Europol’s mandate.Footnote63 In the proposal, the Commission suggested changes to Europol’s mandate that would enable Europol to process large and complex datasets, as a response to the EDPS’s decision.Footnote64 The reform of the Europol regulation raised concerns among civil society organisations. In January 2022, 23 civil society organisations wrote a public letter to the EU legislators expressing their concerns about the impact of the reforms on fundamental rights and urging the legislator to reconsider the changes to the Europol regulation.Footnote65 Despite the loud objections of fundamental rights activists, the amending regulation was accepted on 8 June 2022, and it came into force on 28 June 2022.

In addition to allowing the big data practices of Europol, what is especially problematic from the fundamental rights oversight perspective is that the amendments made to the regulation retroactively legalise previously forbidden practices. In January 2022, the EDPS had ordered Europol to delete the personal data of individuals not linked to criminal activity.Footnote66 The EDPS gave Europol 12 months to comply with the decision regarding datasets received before the decision. However, when the amended Europol regulation entered into force, in practice it overrode the EDPS. In September 2022, the EDPS requested that the CJEU annul two provisions of the amended Europol regulation because

The two provisions have an impact on personal data operations carried out in the past by Europol. In doing so, the provisions seriously undermine legal certainty for individuals’ personal data and threaten the independence of the EDPS – the data protection supervisory authority of EU institutions, bodies, offices and agencies’.Footnote67

As of June 2023, the case is still pending.Footnote68

The Europol big data challenge shows how the ethos of preventing serious crime and terrorism can override fundamental rights in political decision-making. Interestingly, the EU Parliament has taken a different approach to predictive policing in general. As mentioned earlier, the EU is trying to respond to predictive policing technologies in the Proposal for an Artificial Intelligence Act (AIA).Footnote69 In the AIA, the EU aims to regulate AI systems used by law enforcement, or on its behalf, to assess the risk posed by a natural person of offending or reoffending, as well as AI systems that predict the occurrence or reoccurrence of an actual or potential criminal offence based on the profiling of natural persons or on assessing the personality traits and characteristics or past criminal behaviour of natural persons or groups.Footnote70 On 14 June 2023, the European Parliament adopted its negotiating position on the AIA. The Parliament's stand is that predictive policing should be banned in the AIA. However, the fate of predictive policing is not yet decided, because the European Commission, the European Parliament and the Council of the European Union still have to negotiate the final wording of the AIA.Footnote71 The Parliament's stand concerning predictive policing in the AIA is somewhat controversial because, as noted by the European Digital Rights organisation, the new Europol regulation actually legalises Europol’s predictive policing.Footnote72

IV. Limits on the use of predictive policing practices in Finland

Before concluding, the paper looks into Finnish fundamental rights control and how it would function if the Finnish Police were to adopt predictive policing. Limitations on using predictive policing can be derived from data protection, as the cases from the EU level presented in the previous section have shown.Footnote73 However, data protection only controls the use of personal data. Another relevant question is what the law enforcement authorities are entitled to do based on the predictions produced by predictive policing. The section starts by discussing the limitations on using predictive policing technologies as a basis for police activities. These limitations can be derived from the legislation concerning policing in general in Finland. The other half of the section focuses on controlling the adoption of predictive policing at the legislative level.

In Finland, the starting point in limiting the use of public power, such as that of the police, is enacted in Section 2(3) of the Constitution of Finland (731/1999) in the form of the Principle of Legality: the exercise of public power must be based on the law, and the law must be strictly observed in all public activities. Additionally, police officers are government officials and work under the official accountability required of them in Section 118 of the Constitution: an official is responsible for the lawfulness of his or her official actions. In addition, an individual who has suffered an infringement or damage due to an unlawful act or omission by an official has the right to seek the imposition of a punishment on the relevant official and to claim damages for the harm suffered.

The key statute for policing in Finland is the Police Act (872/2011). Section 1 of Chapter 1 lists the duties of the police, which include inter alia securing the rule of law, maintaining public order and security, and preventing, detecting and investigating crimes. However, despite this general listing, all policing activities must be based on a specific provision of law whenever an officer intervenes in an individual’s rights.Footnote74 This means that the law would have to contain a particular provision allowing the police to intervene in individuals’ rights, such as the right to liberty and security enacted in Section 7 of the Constitution, merely on the basis of a software prediction.

Currently, Chapter 2, Section 10 of the Police Act contains a provision on Preventing an offence or disturbance. According to subsection 1 of the Section:

A Police officer has the right to remove a person from a scene if there are reasonable grounds to believe on the basis of the person’s threats or other behaviour, or it is likely on the basis of the person’s previous behaviour, that he or she would commit an offence against life, health, liberty, home or property, or would cause a considerable disturbance or pose an immediate danger to public order or security.Footnote75

The Police Act came into force in 2014, when the use of big data technologies had most likely not yet been considered by the Finnish police. The preparatory documents contain no references to using big data or other algorithmic tools when assessing a person’s behaviour, and the provisions clearly refer to on-site evaluation by police officers.Footnote76 Hence, this provision would not be a sufficient basis for the police to act on a predictive policing prediction. In addition to conferring powers, Chapter 1 of the Police Act also contains principles that restrain the activities of police officers: the obligation to respect fundamental and human rights (Section 2), the Principle of Proportionality (Section 3),Footnote77 the Principle of Minimum Intervention (Section 4),Footnote78 the Principle of Intended Purpose (Section 5),Footnote79 and provisions on postponing actions and refraining from taking actions (Section 9).Footnote80 Thus, even if the Police Act were in the future to confer an explicit power to act on a predictive policing prediction, officers would still need to comply with these principles.

If and when the day comes that the police want to implement predictive policing, the proposal would first have to undergo strict fundamental rights scrutiny. According to Section 22 of the Constitution, public authorities must guarantee the observance of fundamental rights and liberties and human rights.

When assessing the potential implementation of predictive policing in Finland from the fundamental rights control point of view, it is necessary to understand how the constitutional control of securing fundamental rights works in Finland. When the fundamental rights chapter of the Constitution was reformed in the 1990s, the Government proposal explicitly stated:

Fundamental rights influence the legislator in many ways. Not only do they limit Parliament’s powers as legislator, but they can also impose active obligations on the legislator. A fundamental right provision may give general guidance to the legislator or contain an explicit constitutional mandate to implement a particular act.Footnote81

In Finland, fundamental rights control happens in two ways: parliamentary ex-ante supervision by the Constitutional Law Committee and ex-post supervision by the courts. The former obligation is explicitly stated in Section 74 of the Constitution: the Constitutional Law Committee shall issue statements on the constitutionality of legislative proposals and other matters brought for its consideration, as well as on their relation to international human rights treaties. In addition, the supreme overseers of legality, the Parliamentary Ombudsman and the Chancellor of Justice, can assess the legality of technological implementation,Footnote82 as can various supervisory authorities, such as the data protection ombudsman, who supervises the application of legislation in practice and gives statements during the legislative process. Ex-post supervision of fundamental rights is carried out by the courts; Finland does not have a constitutional court. Sections 106 (The primacy of the Constitution)Footnote83 and 107 (Limitation on the application of subordinate legislation)Footnote84 of the Constitution give the courts the mandate to oversee ex post the constitutionality of Acts and lower regulations, such as government decrees. However, ex-post supervision is exceptional and secondary to parliamentary supervision by the Constitutional Law Committee. In addition, these provisions concern only individual cases and do not give the courts the power to assess the overall validity of an act.

Lastly, from the individual’s point of view, Section 21 of the Constitution establishes a right to an effective remedy, a fair trial, and good administration, and is always relevant when public power is exercised.Footnote85 This refers to individuals’ right to bring their case, e.g. one related to predictive policing, before a supervisory authority or a court for scrutiny. Currently, the Ministry of Justice has an initiative to reform the Finnish Data Protection Act and the Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security (1054/2018).Footnote86 The initiative aims to amend the provisions of the two acts so that data subjects can refer their case to a court if the data protection ombudsman has not dealt with the complaint, or informed the data subject of the progress or outcome of the case, within three months.Footnote87

V. Conclusions

At the EU level, the approach to predictive policing has been controversial. The EU legislator has implemented legislation that allows predictive policing, or at least technological solutions very close to it. However, when it comes to the AIA, it is still unclear whether the EU will allow predictive policing in this context. What seems clear is that in the EU, the different institutions (the EDPS, the CJEU, but also the legislator) and fundamental rights organisations are very aware of the fundamental rights risks of predictive policing. Because of the high hopes set upon modern technology in crime prevention, the difficulty lies in finding the right balance between implementing these technologies and protecting fundamental rights.

When it comes to implementing predictive policing, Finland is lagging behind. This can be considered a good thing because of the complex fundamental rights problems outlined in this paper. The EU legislator has implemented predictive policing practices in the context of fighting serious crime and terrorism, which is still different from applying these technologies to prevent petty crime.

Although the EU may allow the use of predictive policing in the future, Finland has constitutional structures that place its use under strict scrutiny. Legislation on the police and the use of public power, as well as fundamental rights control during and after the legislative process, provide many ways to control the fundamental rights risks should Finland decide to implement predictive policing.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Andrew Ashworth and Lucia Zedner, Preventive Justice (OUP 2014) 6.

2 See in general Richard V Ericson, Crime in an Insecure World (Wiley 2007).

3 E.g. David Garland, The Culture of Control (The University of Chicago Press 2001); Zedner and Ashworth (n 1); Rosamunde van Brakel and Paul De Hert, ‘Policing, Surveillance and Law in a Pre-crime Society: Understanding the Consequences of Technology-based Strategies’ (2011) 20(3) Cahiers Politiestudies Jaargang 163.

4 This phenomenon could be seen connected to technological normativity. See Mireille Hildebrandt, ‘Legal and Technological Normativity: More (and Less) than Twin Sisters’ (2008) 12 Techne 169.

5 In Europe e.g. the Harm Assessment Risk Tool (‘HART’) used in the UK.

6 E.g. Compas algorithm used in the United States.

7 Another question is the usage of predictive policing prediction later as evidence if a crime has actualised.

8 Andrew Guthrie Ferguson, ‘Predictive Policing and Reasonable Suspicion’ (2012) 62 Emory LJ 259, 265.

9 Walter L Perry, Brian McInnis, Carter C Price, Susan C Smith, and John S Hollywood, Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations (1st edn, RAND Corporation 2013) 29.

10 These concepts are used in Amnesty International Netherlands, Meeting report: PHRP Expert Meeting on Predictive Policing. 20–21 May 2019.

11 See examples in ‘Automating Injustice’ (Fair Trials, 9 September 2021) <www.fairtrials.org/app/uploads/2021/11/Automating_Injustice.pdf> accessed 29 March 2023.

12 ibid.

13 See European Security Union, ‘the EU Security Union Strategy from 2020 to 2025: Communication from the Commission on the EU Security Union Strategy’ COM (2020) 605 final.

14 Judgment of 16 February 2023 1 BvR 1547/19, 1 BvR 2634/20. See the press release: Legislation in Hesse and ‘Hamburg Regarding Automated Data Analysis for the Prevention of Criminal Acts Is Unconstitutional’ (Bundesverfassungsgericht, 16 February 2023) <www.bundesverfassungsgericht.de/SharedDocs/Pressemitteilungen/EN/2023/bvg23-018.html> accessed 29 March 2023.

15 Mika Sutela, ‘Tiedon, analyysin ja analytiikan hyödyntämisen tarve poliisissa – ilmeinen ja suuri?’ [The Need for the Police to Use Information, Analysis and Analytics – Obvious and Great] (Poliisi, 15 September 2020) <https://poliisi.fi/blogi/-/blogs/tiedon-analyysin-ja-analytiikan-hyodyntamisen-tarve-poliisissa-ilmeinen-ja-suuri-> accessed 29 March 2023; ‘The Expert Opinion of the National Bureau of Investigation on Data Processing in the Police’ (Poliisi, 10 January 2019) <www.eduskunta.fi/FI/vaski/JulkaisuMetatieto/Documents/EDK-2019-AK-236903.pdf> accessed 29 March 2023.

16 Ministry of the Interior, ‘Finland’s Strategy on Preventive Police Work 2019–2023’ (Publications of the Ministry of the Interior, 2019:11) 37: ‘Major investments are being made in the development of automation and artificial intelligence in Finland and other countries and using such applications in different tasks is still in its initial stages’ <https://julkaisut.valtioneuvosto.fi/bitstream/handle/10024/161343/SM_11_19_Strategy%20on%20preventive%20police%20work.pdf?sequence=1&isAllowed=y> accessed 29 March 2023. See more Vesa Syngelmä, Ennustamisteknologioiden hyödyntämismahdollisuudet osana ennakoivaa poliisitoimintaa [The Opportunities for Using Predictive Technologies as a Part of Predictive Policing] (University of Tampere 2021) <https://trepo.tuni.fi/handle/10024/130523> accessed 29 March 2023 (only available in Finnish). In his master’s thesis, Vesa Syngelmä conducted qualitative research, by interviewing police officers, on whether predictive policing is actually used in Finland. The research concluded that predictive technologies are not currently in use in Finland.

17 EU, ‘Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS’, COM/2021/206 final.

18 Max Weber, Hans Gerth, Charles Wright Mills, and Bryan S Turner, ‘Politics as a Vocation’ in HH Gerth and C Wright Mills (eds), From Max Weber: Essays in Sociology (Routledge 2009) 78.

19 Elizabeth E Joh, ‘Feeding the Machine: Policing, Crime Data, & Algorithms’ (2017) 26 Wm Mary Bill Rts J 287.

20 Joh (n 19) 287; Beth Pearsall, ‘Predictive Policing: The Future of Law Enforcement?’ (2010) 266 NIJ Journal 16.

21 Janet Chan and Lyria Bennet Moses, ‘Is Big Data Challenging Criminology?’ (2016) 20(1) Theoretical Criminology 24.

22 E.g. Opinion of the European Economic and Social Committee on ‘Artificial intelligence—The consequences of artificial intelligence on the (digital) single market, production, consumption, employment and society’ (own-initiative opinion) 2017/c 288/01, sections 2.1 and 2.3.

23 ibid.

24 Michal S Gal, ‘Algorithmic Challenges to Autonomous Choice’ (2018) 25(1) Michigan Telecommunications and Technology Law Review 59, 65.

25 Peter Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Cambridge University Press 2012) 3.

26 Joh (n 19) 289.

27 Mittelstadt et al, ‘The Ethics of Algorithms: Mapping the Debate’ (2016) July–December Big Data & Society 1, 5.

28 Lindsey Barrett, ‘Reasonably Suspicious Algorithms: Predictive Policing at the United States Border’ (2017) 41(3) NYU Review of Law & Social Change 327, 340–41.

29 Riikka Koulu, ‘Digitalisaatio ja algoritmit – oikeustiede hukassa?’ [Digitalisation and Algorithms – Is Legal Science Lost?] (2018) 7–8 Lakimies 840–67, 858–59.

30 ibid 859.

31 For a detailed discussion on the opacity, see Barrett (n 28).

32 E.g. Compas in the US.

33 Frank Pasquale, The Black Box Society (Harvard University Press 2015) 3.

34 Kroll et al, ‘Accountable Algorithms’ (2017) 165(3) University of Pennsylvania Law Review 633, 638.

35 Barrett (n 28) 344.

36 ibid 343; Danielle Keats Citron, ‘Technological Due Process’ (2008) 85(6) Washington University Law Review 1249, 1271–72; Linda Skitka et al, ‘Automation Bias and Errors: Are Crews Better Than Individuals?’ (2000) 10(1) The International Journal of Aviation Psychology 85, 86; ML Cummings, ‘The Social and Ethical Impact of Decision Support Interface Design’ in Waldemar Karwowski (ed), International Encyclopedia of Ergonomics and Human Factors (Vol I, 2nd edn, Taylor & Francis 2006) 1249, 1250; Andrew Guthrie Ferguson, ‘Policing Predictive Policing’ (2017) 94 Wash U L Rev 1109, 1178.

37 EU, Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC.

38 EU, Directive 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.

39 EU, Directive 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime.

40 Art 6

41 Government Proposal on the PNR legislation (HE 55/2018 vp) 26–27.

42 Charter of Fundamental Rights of the European Union [2000] OJ C364/1, Arts 7 and 8.

43 ibid para 111.

44 ibid paras 117, 122.

45 ibid para 152.

46 For instance, Finland chose to apply the directive both to intra- and extra-EU flights.

47 ibid para 174.

48 ibid para 123.

49 ibid para 194.

50 The ‘Europol’s big data challenge' originates from the EDPS DECISION of 17 September 2020.

51 Europol was originally established by the Council Act of 26 July 1995 drawing up the Convention based on Art K.3 of the Treaty on European Union, on the establishment of a European Police Office (Europol Convention).

52 Europol Convention [1995] OJ C316/2, Arts 2 and 3.

53 Recital 3 of the Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA. The recital notes that this aim was laid down already in the Stockholm Programme (2010), which is a five-year political, strategic document describing the focus of cooperation in the policy areas of justice and home affairs of the EU Member States for the years 2010–2014.

54 ibid.

55 ibid Ch IV.

56 Art 8: Protection of personal data – data should be processed fairly and for specified purposes and on the basis of consent or some other lawful basis.

57 EDPS, ‘DECISION of 17 September 2020 relating to EDPS own inquiry on Europol’s big data challenge’ section 2.3 (EDPS, 18 September 2020) <https://edps.europa.eu/sites/default/files/publication/20-09-18_edps_decision_on_the_own_initiative_inquiry_on_europols_big_data_challenge_en.pdf> accessed 29 March 2023.

58 ibid section 1.1.

59 ibid sections 4.7–4.9.

60 As the EDPS notes: ‘The nature of the data collected at national level in the context of criminal investigations and criminal intelligence operations is not limited anymore to targeted data collection but also increasingly includes the collection of large datasets. More digital content is generated and thus available for law enforcement in the context of criminal investigations, which, in turn, impacts the methods used to produce criminal intelligence' ibid section 3.9.

61 ibid section 4.10.

62 Europol’s correspondence 16 October 2020 <www.europol.europa.eu/cms/sites/default/files/documents/Public%20version%20of%20EDOC%20%231133469.pdf> accessed 29 March 2023.

63 COM (2020) 796 final Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL amending Regulation (EU) 2016/794, as regards Europol’s cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol’s role on research and innovation.

64 Proposal Section 5. Other elements.

65 EDRI, ‘Letter to Policymakers on Europol’ (EDRI, 26 January 2022) <https://edri.org/wp-content/uploads/2022/02/Letter-to-Policymakers-on-Europol.pdf> accessed 29 March 2023.

66 EDPS, Decision on the retention by Europol of datasets lacking Data Subject Categorisation (Cases 2019-0370 and 2021-0699) <https://edps.europa.eu/data-protection/our-work/publications/investigations/edps-orders-europol-erase-data-concerning_en> accessed 29 March 2023.

67 See EDPS, ‘EDPS Takes Legal Action as New Europol Regulation Puts Rule of Law and EDPS Independence Under Threat’ (EDPS, 22 September 2022) <https://edps.europa.eu/press-publications/press-news/press-releases/2022/edps-takes-legal-action-new-europol-regulation-puts-rule-law-and-edps-independence-under-threat_en> accessed 29 March 2023.

68 Case T-578/22: Action brought on 16 September 2022 — EDPS v Parliament and Council.

69 COM(2021) 206 final.

70 ibid Annex III.

71 European Parliament resolution of 6 October 2021 on artificial intelligence in criminal law and its use by the police and judicial authorities in criminal matters.

72 See EDRI, ‘Secret Negotiations About Europol: The Big Rule of Law Scandal’ (EDRI, 31 January 2022) <https://edri.org/our-work/secret-negotiations-about-europol-the-big-rule-of-law-scandal/> accessed 29 March 2023.

73 In Finland, Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, the ‘Law Enforcement Directive' has been implemented into the Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security (1054/2018). In addition, there are distinct acts for personal data processing by different law enforcement authorities which complement the aforementioned: Act on the Processing of Personal Data by the Police (616/2019), Act on the Processing of Personal Data by the Border Guard (639/2019), and Act on the Processing of Personal Data by the Customs (650/2019).

74 The Government proposal on the Police Act (HE 224/2010 vp) 72; Decision of the Deputy-Ombudsman of the Parliament (EOA 1634/4/01) 18 December 2003 <www.oikeusasiamies.fi/r/fi/ratkaisut/-/eoar/1634/2001> accessed 29 March 2023.

75 In subsection 2, the provision adds that ‘A person may be apprehended if his or her removal is likely to be an inadequate measure and the offence cannot otherwise be prevented or the disturbance or danger otherwise removed’. The period of apprehension may last a maximum of 24 hours.

76 The Government proposal on the Police Act (HE 224/2010 vp).

77 Police action shall be reasonable and proportionate with regard to the importance, danger and urgency of the duty; the objective sought; the behaviour, age, health and other specifics of the person targeted by the action; and other factors influencing the overall assessment of the situation.

78 The police shall not take action that infringes anyone’s rights or causes anyone harm or inconvenience more than is necessary to carry out their duty.

79 The police may exercise their powers only for the purposes provided by law.

80 Subsection 1: The police have the right to refrain from taking an action if completion of the action could lead to an unreasonable conclusion compared with the outcome sought.

81 Government proposal to amend the fundamental rights provisions of the Constitution (HE 309/1993 vp) 26.

82 Previously the assessment has focused on automated decision making in administration, e.g., in taxation and immigration services.

83 According to the Section: If the application of a statutory provision in the case before the court would be manifestly unconstitutional, the court must give preference to the constitutional provision.

84 If a provision of an ordinance or other subordinate legislation is contrary to the Constitution or other law, it may not be applied by a court or other authority.

85 Section 21: Everyone has the right to have their case dealt with appropriately and without undue delay by a legally competent court of law or other authority, as well as to have a decision pertaining to his or her rights or obligations reviewed by a court of law or other independent organ for the administration of justice. Provisions concerning the publicity of proceedings, the right to be heard, the right to receive a reasoned decision and the right of appeal, as well as the other guarantees of a fair trial and good governance shall be laid down by an Act.

86 Which implements the Law Enforcement Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.

87 Oikeusministeriö, ‘Tietosuojalain ja rikosasioiden tietosuojalain muuttaminen’ [Amending the Data Protection Act and the Act on Data Protection in Criminal Matters] (Oikeusministeriö, 14 February 2023) <https://oikeusministerio.fi/hanke?tunnus=OM016:00/2023> accessed 29 March 2023.