
Staying in control of technology: predictive policing, democracy, and digital sovereignty

Pages 963-978 | Received 26 Oct 2022, Accepted 23 Mar 2023, Published online: 18 Apr 2023

ABSTRACT

This article engages the question of how police departments internally delegitimize the use of private sector technology vis-à-vis democratic considerations. Specifically, it retraces argumentation around the discontinuation of commercial predictive policing software and the alternative development of in-house tools. Conceptually, the analysis builds on the notion of digital sovereignty, which has recently become a popular analytical trope in debates about public-private partnerships and public sector procurement – notably the question of which technologies and services state agencies should keep under their own control in order to avoid dependencies on external suppliers and ensure functionality even in times of crisis and/or breakdown. Importantly, digital sovereignty allows us to study the complex and multi-faceted power relations between the police, private sector technology suppliers, and the public. The analysis identifies three key facets of digital sovereignty around which police concerns regarding the use of private sector technology revolve: (1) control over data; (2) skepticism towards black boxes; and (3) accountability requirements towards the general public.

Introduction

The literature on technology and democracy tends to foreground the challenges to democratic norms that emerge from state agencies’ use of data-driven and algorithmically mediated tools. Scholars have in this context wondered how well established democratic political models are equipped to cope with technological innovationFootnote1 and have pointed out how increasingly data-driven forms of governance “profoundly change state–citizen relations and the way governments understand and respond to citizens.”Footnote2 Particular attention has thereby been paid to security and intelligence agencies and the transformations that interactions between state actors and citizens undergo when they come into being in ways that are mediated through data and algorithms.Footnote3 Even when formally curbed by legal frameworks, security and intelligence agencies have demonstrated a tendency to push the limits in regard to knowledge generation through data collection at scale and AI-supported means of analysis.Footnote4 Put simply, while security and intelligence agencies are supposed to safeguard democracy, by way of their technology-mediated actions they simultaneously threaten to undermine democratic principles.

This dilemma is of particular importance for the case of the police and there is a rich body of literature that engages the nexus of policing, technology, and democracy. While this literature acknowledges that the ways in which the police produce knowledge and action have always been mediated by technology,Footnote5 there is a broad consensus that digital technologies present an unprecedented challenge for established democratic control mechanisms. At the core of this argument lies the acknowledgement that data-driven and algorithmically mediated forms of knowledge production differ from traditional forms of knowledge production in scale, speed, and complexity. In this capacity, digital technologies have the potential to supersede the cognitive capacities of humans, thus presenting a new form of authority that can be hard to challenge.Footnote6 Moreover, digital technologies come with an appeal of scientific methods, impartiality, neutrality, and objectivity that has been praised as a solution for human bias and prejudice.Footnote7 Research has, however, shown that in practice, the use of data and algorithms has a tendency to reinforce and/or rationalize existing issues in police behaviour such as racism or sexism rather than to overcome them.Footnote8

In summary, the literature has so far mostly concentrated on how technologically mediated capacities could undermine the ways in which the police protect democratic principles and act in democratically acceptable ways. In this sense, the police are commonly portrayed as an actor that is mostly concerned with effectiveness and efficiency and whose main interest is how these categories could be increased through the procurement and use of new technologies. A so far underexplored aspect of the nexus of policing, technology, and democracy is, however, how the police themselves struggle with the alignment of technology and democracy. Building on empirical material collected throughout a multi-year research project on the implementation and use of predictive policing technology in Germany and Switzerland, this article analyses how police departments internally delegitimize the use of private sector technology and instead opt for the development of in-house tools with regard to democratic considerations.

To understand this turn away from commercial products and towards internal research and development activities, it builds on the concept of digital sovereignty. Digital sovereignty has recently become a popular analytical trope in debates about public-private partnerships and public sector procurement, centred on the question of which technologies and services state agencies should keep under their own control in order to avoid dependencies on external suppliers and ensure functionality even in times of crisis and/or breakdown.Footnote9 In this sense, it is regarded as a key requirement for public agencies to preserve and safeguard democratic norms. Analytically, digital sovereignty allows us to study the complex and multi-faceted power relations between the police, private sector technology suppliers, and the public. In the context of the police, considerations of digital sovereignty have particular relevance as they relate to the long-standing practice of police procurement of technological tools from external vendors.Footnote10

The article proceeds as follows. First, it reviews conceptual work on digital sovereignty and carves out the analytical benefits of a digital sovereignty perspective on the question of whether public agencies should use third-party technology or rely on their own tools. It then explicates data and methodology and discusses how the analysis presented here relates to larger trajectories of democracy and technology use in policing. The empirical part subsequently retraces how police departments justified their choices not to use, or to discontinue the use of, commercial software tools. In doing so, the analysis identifies three key themes around which police-internal arguments revolved: (1) control over data; (2) skepticism towards black boxes; and (3) accountability requirements towards the general public.

Digital sovereignty

“The police have been digitally sovereign before there was digital sovereignty.” This is how one participant of an expert panel on “digital sovereignty” at the European Police Congress in May 2022 framed the relationship between police departments and the technologies that they use. In the context of the panel debates, the statement referred to the fact that the police, in their capacity as a central state actor concerned with the production and maintenance of public order, must be rather careful as to which tasks and capacities they outsource or out-contract to private sector companies due to the sensitive nature of many police tasks and the power relations and dependencies that can emerge from reliance on commercial technologies. Debates about the police and the use of private sector tools, although a long-standing theme in the history of policing, have in fact more recently gained renewed impetus against the backdrop of predictive policing and other forms of data-driven and algorithmically mediated forms of knowledge production and intervention.Footnote11

The notion of sovereignty has been traditionally linked to statecraft and the nation-state, referring to an idea of acknowledged supreme authority over a population or a territory – which in practice were thought to be largely congruent.Footnote12 According to this logic, sovereign tasks such as taxation, the use of force, or the maintenance of public order must, although they can be delegated to non-state actors, in principle remain with the state and its institutions. More recently, sovereignty has, vis-à-vis the digitization and datafication of many of these tasks, seen a considerable renaissance in academic debates, which question whether it can in principle still be upheld and, if so, how it becomes reconfigured through new technological capacities and new actor constellations.Footnote13 Importantly, there is a direct link between digital sovereignty and democracy. As Floridi has argued, digital sovereignty reinforces the idea that in democratic forms of societal organization, governmental capacities and the population are closely entangled – with the latter legitimizing the former and the former executing the latter’s direct or indirect mandate. Since, in theory, this relationship should not be interfered with by other actors (i.e. the private sector), digital sovereignty, for him, “could deliver full democratic legitimacy and great innovative flexibility, if designed successfully.”Footnote14

It should at this point be noted that there is some terminological variance in the literature. Notions of data sovereignty, information sovereignty, cyber sovereignty, network sovereignty, computer sovereignty, or technological sovereignty make different points of emphasis, yet largely carry an overlapping basic definition of sovereignty that pertains to the protection of a particular (digital) sphere or domain that must be shielded from outside interference, i.e. kept sovereign.Footnote15 Against the backdrop of such terminological multiplicity, Couture and Toupin suggest using the term digital sovereignty as a broad category that includes technologies, infrastructures, as well as data/content.Footnote16 Digital sovereignty in this sense relates to how notions of autonomy, independence, and protection from unwanted interference are becoming renegotiated in relation to digital technologies and algorithmic forms of knowledge production.

In concrete terms, digital sovereignty covers both “the capacity for collectivities (states, communities, social movements, etc.) to innovate and/or engage in technological development” as well as “the security and/or privacy of individuals or collectives, [and] ownership and control over data related to oneself, citizens, or a state.”Footnote17 Importantly, both Couture and Toupin as well as Pohle and Thiel note that, when conceived of in digital terms, sovereignty no longer applies exclusively to the state and its institutions, but also to private companies and individuals (and by extension social movements).Footnote18 As a heuristic device, digital sovereignty can thus be used as an analytical lens to explore the constellations and power relations between state actors, the private sector, and citizens.

Shifting knowledge and power relations between the state and private actors are in fact a pertinent theme in the literature that engages with digital sovereignty. Floridi goes as far as to see the struggle between states and private companies as the pinnacle for digital sovereignty.Footnote19 As he argues, the digital sphere primarily comes into being through private actors that design, produce, sell, and maintain digital infrastructures and the tools to exploit them. States, on the other hand, largely depend on the capacities of private actors from which they buy products or expertise in the form of technologies and services. While state actors often lack the capacities to actively shape digital technologies, they are, however, capable of installing regulatory frameworks that can set (dis-)incentives and determine what is legal and what is not. In the words of Floridi, the relationship between private companies and states might thus best be described as one where “the former can determine the nature and speed of change, but the latter can control the direction of change.”Footnote20

This relationship is reflected in the domain of policing, where technological change is often triggered by the private sector, which develops novel tools for prevention and enforcement activities and sells these tools to state authorities. There is in fact a long history of private sector innovations that have transformed police capacities and in turn the ways in which society is policed, including patrol cars, two-way radio, or biometrics and forensic capabilities.Footnote21 Enhanced police capabilities by means of technology procurement from private sector suppliers have, in this context, been characterized as a form of agenda-setting/policy-making in its own right, for example in regard to drone purchases and subsequent surveillance capabilities/practices.Footnote22 As Ayling and Grabosky argue, there are “benefits and pitfalls of [business] relationships between police and the private sector” – but notably technology procurement has effects on “the desirability of institutional and procedural safeguards, and the implications of these arrangements for efficiency, equity and efficacy in public policing.”Footnote23 For Konina, procurement policies could thus be conceptualized as a point of intervention for regulation and improved protection of human rights.Footnote24

Criminological research has, however, at the same time shown that there is no default effect of private technology on the ways in which the police go about their tasks. On the contrary, numerous studies have shown that police departments have considerable leeway when it comes to the ways in which new technologies are implemented into established cultures, routines, and processes.Footnote25 Technology might, in this sense, not produce the originally intended outcomes, or even be resisted internally.Footnote26 These are important points to consider for the analytical context of this article and the ways in which police departments actively resist and/or discontinue commercial predictive policing tools.

Besides power relations between state actors and private companies, the acknowledgement of the analytical importance of the individual/public in the literature on digital sovereignty is a second key theme in the context of this article. From a conceptual perspective, citizenship has always been predicated upon the assembly and mobilization of information about the population, enabling governments and state institutions to interact with citizens in targeted and individualized ways.Footnote27 The widespread production and availability of digital data has, however, profoundly transformed the significance of such information and its circulation – with unprecedented implications for the knowability and targetability of groups and individuals.Footnote28 Digital sovereignty in this context then also relates to the individual’s “ability or entitlement to steer data flows and/or to govern informational resources”Footnote29 about them. In other words, digital sovereignty pertains to the question of who knows what about a particular person and what that person can do to control the sharing of information between state actors and private companies. This constellation, as will be explicated throughout the empirical analysis, is an important one in the context of police work, as the police by definition produce information about individuals and might by default share this information with third parties if they use private sector tools for data processing.

Data and methodology

The analysis presented in this article is based on qualitative empirical data gathered during a multi-year research project on the implementation and use of predictive policing among 12 police departments in Germany and Switzerland.Footnote30 Between 2016 and 2022, three types of data were gathered. First, a total of 69 semi-structured expert interviews with police officers (leadership, crime analysts, patrol officers) as well as industry representatives were conducted. Second, data were created through forms of participant observation, shadowing crime analysts during their work with predictive policing software and participating in several end user workshops and other professional events organized by industry companies and police departments. Third, a total of 375 documents (public and non-public domain) pertaining to legal frameworks, policy, implementation and use of predictive policing were collected. Non-public domain documents included white papers, instruction manuals, best practices guidelines, as well as personal communications.

The resulting data corpus was coded with qualitative content analysis software (MAXQDA) using an in-vivo coding approach, i.e. initial code categories were formed from the empirical material to break it down into thematic sections and subsequently re-aggregate them into cross-cutting themes and concepts. Coding was refined during multiple rounds, resulting in a code tree comprising 3,555 coded elements among 11 main categories and two levels of sub-categories. This structure was used throughout the analysis for this article to identify police-internal modes of reasoning against the use of private sector predictive policing technology. Throughout the empirical data, three main clusters of coded segments speak to this question: (1) concerns about data; (2) concerns about the lack of transparency; and (3) concerns about external accountability. The following empirical section presents a structured analysis of the viewpoints that respondents presented vis-à-vis these categories.

All empirical data were, as per agreement with research participants, anonymized in order to prevent the identification of individuals or institutions. Empirical material is throughout the text referred to as either “I” (interview), “P” (protocol), or “D” (document) and numbered according to the project internal organization scheme. Quotes have been translated from German by the author.

It should at this point be noted that not all researched police departments decided against commercial tools or discontinued their use. Out of the 12 departments, 6 had started using third-party software and 6 had decided to build their own tools from the start. Of the 6 using commercial solutions, 4 discontinued them at some point during or after the research phase. The research design (a limited number of in-depth case studies) raises the question of whether the findings presented here are representative of larger trends and whether they can be generalized. While the “multi-sited ethnography”Footnote31 approach pursued throughout the project provided situated perspectives on predictive policing and software from multiple contexts, there is usually a considerable degree of variance between different police departments in terms of organizational structure, available resources, and (politically informed) strategies and policies. This variance in most cases stems from the long history and relative autonomy of police departments, leading to largely idiosyncratic structures.Footnote32 Moreover, there is arguably some cross-country variance when it comes to the general willingness to use private sector technology, as well as in terms of data protection legislation.

Vis-à-vis the initially discussed focus of the literature on either the risks of state actors’ use of digital technologies or, more specifically, the implementation and use of data-driven and algorithmically mediated tools in police work, it remains unclear whether the case of resistance against private sector technology on democratic grounds is an outlier, whether state actors’ internal rationalizations of and engagements with technology have simply been overlooked so far, or whether we are seeing the emergence of a new phenomenon against the backdrop of increasingly widespread digital technologies in the public sector. More research will be required to substantiate or contest the findings presented here.

Staying in control of technology

Predictive policing has been one of the most prevalent innovations in police work in recent years.Footnote33 Generally speaking, predictive policing approaches seek to produce short-term operational predictions about the future occurrence of crime by identifying patterns in data about crime and society.Footnote34 Identified patterns can then in a second step be used as a basis for operational crime prevention measures, usually in the form of targeted patrols for the sake of deterrence, targeted controls, or awareness campaigns among residents or the general public. Predictive policing thus has the potential to profoundly transform the ways in which the police produce knowledge and turn knowledge into action. As such, it ties in with Terpstra et al.’s diagnosis that police organizations are increasingly becoming “abstract,” i.e. replacing individual knowledge and discretion with digitized processes and system-level decision-making procedures that bureaucratically reconfigure both internal organizational structures and the police’s relationship with the public.Footnote35

This relationship is usually considered to be a complicated one in the first place, as police forces have repeatedly been shown to cross the boundaries between legal and illegal, moral and immoral, and adequate and inadequate interventions.Footnote36 In particular regard to predictive policing, concerns have been voiced about potential new or transformed democratic deficits, for example in the form of data-based discrimination, over- or under-policing of certain areas or neighbourhoods, or the lack of transparency and accountability of complex algorithmic systems.Footnote37 When conceived through the lens of digital sovereignty, predictive policing thus poses considerable challenges for the ways in which state actors (i.e. the police) enter into dependency relationships with technology suppliers (i.e. private sector software manufacturers), and for the ways in which data about citizens are being created and shared.

In Europe, Germany and Switzerland were among the first countries to use predictive policing tools on a regular basis, with several state- and cantonal-level police departments testing or implementing predictive policing software between 2014 and 2017.Footnote38 As the idea of predictive policing was mostly pushed by the private sector, initially only one German-language software package – PRECOBS (Precrime Observation System) by German manufacturer IfmPt – was readily available for immediate implementation. The default option for police departments seeking to use predictive policing technology was thus to use private sector technology. However, throughout the research period most researched police departments at some point opted to discontinue PRECOBS and instead decided to develop in-house alternatives.

The remainder of this article empirically reconstructs police arguments against the use of private sector technology. Three key themes emerging from the empirical data will be discussed. The first theme pertains to data and the reluctance to share sensitive data with third parties. The second theme pertains to analytical processes and the ambition that these should remain understandable for the police themselves. And the third theme relates to the fact that police work should, at least in principle, remain understandable for the general public as well. Taken together, they arguably express a desire to stay digitally sovereign vis-à-vis the democratic pitfalls that data-driven and algorithmically mediated technologies present.

Data sharing

Predictive policing software runs on data. From a private sector perspective, real crime data provided by police departments offer the best chance to render their products most effective. As one industry representative explained, much like training data in machine learning, their company considered the use of actual crime data key to the successful implementation and adjustment of their product, as it allowed them to identify spaces historically susceptible to serial criminal activity and use these historical vulnerabilities as a baseline for the identification of future risk. Without tweaking the algorithm of their software based on such data, they claimed, no accurate statements about future crime risk could be computed (I 01). Moreover, as they explained, historical data analysis would need to be repeated at regular intervals due to recursive interaction effects between crime prevention and offender behaviour:

You need to adjust the software. The very moment you start using software and have officers patrol in a targeted fashion, you also change offender behavior. You get a reaction. That means near-repeat areas won’t remain near-repeat areas forever.Footnote39 You can simulate that. Our usual simulation period is three years. And as a result, you get a certain number of areas that had high crime rates. And if there were high crime rates in the past three years, then there’s a high likelihood for high crime rates in year four as well. But if you look at these areas at a different point in time – six years ago, you’ll see a difference. (I 01)
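The near-repeat logic underlying this account can be illustrated with a minimal sketch. All names, parameters, and thresholds below (`Burglary`, `near_repeat_alerts`, the 400-metre radius, the 7-day window, the trigger count) are hypothetical choices for illustration; they are not taken from PRECOBS or any other actual product, which calibrate such values against historical data:

```python
from dataclasses import dataclass
from math import hypot

# Illustrative parameters only; real systems calibrate these empirically.
NEAR_DIST_M = 400    # spatial radius of elevated risk, in metres
NEAR_DAYS = 7        # temporal window of elevated risk, in days
TRIGGER_COUNT = 2    # prior burglaries needed to flag a near-repeat area

@dataclass
class Burglary:
    x: float   # projected coordinate (metres)
    y: float
    day: int   # day index; events assumed sorted ascending by day

def near_repeat_alerts(events):
    """Return events whose surroundings show a possible near-repeat
    pattern: at least TRIGGER_COUNT earlier burglaries within
    NEAR_DIST_M metres and NEAR_DAYS days of the event."""
    alerts = []
    for i, e in enumerate(events):
        prior = [
            p for p in events[:i]
            if 0 < e.day - p.day <= NEAR_DAYS
            and hypot(e.x - p.x, e.y - p.y) <= NEAR_DIST_M
        ]
        if len(prior) >= TRIGGER_COUNT:
            alerts.append(e)
    return alerts
```

In a scheme of this kind, flagged events would mark the areas earmarked for targeted patrols; the interviewee's point about recursive interaction effects corresponds to the fact that such patrols suppress precisely the clusters the thresholds were calibrated on, which is why recalibration against fresh historical data becomes necessary.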

For the police, the use of commercial software then turns into a trade-off between effectiveness and the willingness to share crime data with private companies. For interviewees, this trade-off involved considerations that tie in neatly with the expansion of digital sovereignty to the individual, i.e. “the more or less abstract idea that individuals, specific groups, or communities should retain control over the handling of their data.”Footnote40 As part of their core task of knowledge production, the police produce large amounts of data on a daily basisFootnote41 and these data tend to be particularly sensitive as they contain personal information. Individual citizens, in other words, may be represented in police data in multiple roles: as suspects, victims, or witnesses. These links need not even be direct ones, but can also be established via proxies. In the case of residential burglary, for example, the street address or GIS coordinates in the data might more or less easily reveal who had been victimized. Many of the researched police departments thus decided that potentially sensitive crime data should not be accessible to commercial third-party actors. As one respondent forcefully argued, their department came to the conclusion that using commercial software would have been a “deal-breaker” in this context (P 30). And as another analyst framed it, their department was in fact somewhat worried about the public perception of police data sharing practices, rendering the in-house development of their own tool a viable alternative as it would not involve data sharing:

We wanted to do this without having a company looking at our data. I’m not sure whether citizens would think that’s a good idea. That’s also what we hear from other public agencies: that people are not thrilled that commercial companies work with their data. I mean [commercial software suppliers] need our data to create their models, just like we work with our data. And of course they sign data protection agreements and non-disclosure agreements. But still, maybe the better way forward is to ask whether we can’t do this ourselves. (I 31)

What is interesting about this statement is the conceptual shift in thinking about sovereignty. While it might be argued that crime data are police data in the sense that the police create and substantiate them through their work, it might just as well be argued that crime data are about citizens, rendering them the sovereign. Such a view would closely correspond with the idea that individuals should retain control of data sharing processes that go beyond the required minimum. While there might be different interpretations about who should be the sovereign in regard to data, the considerations put forward by the respondent hint at the special relationship between the police and the general public. Interviewees were clearly interested in the effectiveness of their work and the tools that they use, but they were just as concerned about their image and the trust and/or legitimacy relationships that the police seek to build with the population.Footnote42 In this context, considerations about data sovereignty and the reluctance to share crime data with third parties extend to citizens as well.

A different aspect of data sovereignty considerations put forward by interviewees concerns control about which data actually become part of analytical processes. Police departments developing their own predictive policing tools argued that there was added value in staying in control of data selection. As opposed to blindly trusting a commercial product and its data requirements, as one analyst explained, they “[k]now which data are in there, why they are in there, what works, what doesn’t work. That alone is huge for us.” (I 36). This argument is closely linked to the understandability and accountability issues that will be discussed in the following sections. It is, however, telling in itself that during research many police departments were keen on staying in control of data selection and experimentation as well – something that they arguably would not have been able to do when relying on commercial off-the-shelf solutions.

Black boxes

A second theme in police departments’ decisions to stay in control of technology pertains to the lack of retraceability of analytical processes in commercial software. Interviewees would in this sense regularly refer to the idea of “black boxes” or “black-boxed processes.” The concept of the black box, used by Latour to describe the inaccessibility of the inner workings of complex mechanisms and the societal implications of such inaccessibility,Footnote43 has become a widely used heuristic to describe and scrutinize democratic challenges vis-à-vis increasingly opaque forms of digital knowledge production.Footnote44

Among interviewees, a general assumption in this context was that commercial software would by default remain inaccessible and that analysts or patrol officers could therefore not understand the logic used to come to a given conclusion. In fact, one interviewee recounted how their department had made inquiries with major software vendors as to whether they would be willing to agree to some trials/experimentation with their products – but none of them responded positively to the requests (I 54). And another analyst recalled that a software company had not been willing to grant their potential customers an independent code evaluation, despite such evaluations being good business practice (I 55). One senior police official summarized the general predicament that their department experienced vis-à-vis black boxes as follows:

[Software] is a proprietary system. What the algorithm looks like, that’s not open source. […] I believe that software paid for by taxpayer money should be transparent. We know the effects of bad data, if you think of the US, the kind of bias that bad data can produce. And if you have at least open source code – I think that can be really beneficial for reviews. […] So personally I will insist on things to be and remain transparent. I have no idea how to ensure this in the long-term, but I’m convinced it’s the way forward. Commercial companies have an interest in creating revenue, and whether their algorithms are good or not – you can’t really argue about this as long as you don’t see the code. (I 63)

Digital sovereignty, in this context, largely meant for research participants not having to rely on something that they could not understand and retrace. As one interviewee framed it, their department had decided to embed the production of crime estimates within established routines for risk assessment, turning predictive policing into an activity that could be carried out “not with a black box that we don’t understand, but in the context of tactical crime analysis where we can retrace our analytical processes” (I 64). And another officer added that their department was not comfortable with third-party analytical processes and therefore decided to “simply analyze our data ourselves” (I 31). As they further expanded on the anticipated advantages of that decision:

If we do this ourselves, with transparent processes, then everybody benefits. Analytical principles are not secret or proprietary or otherwise protected. In fact they can be implemented rather easily and you can visualize both the results and the way towards the results. (I 31)

One senior analyst highlighted how such transparency would also be paramount for practical implementation of any analytical insights into street-level patrolling:

We did not want a black box that uses data as input and then you get a map or something. For us, it was important to have a certain transparency, something that our people understand and that they can use. Just as well, from our perspective, there’s little sense in having a scientist doing the analysis and instructing our patrol units. Because they will feel patronized. (I 02)

Especially the latter part of this statement is of analytical interest, as it explicates another major reason why commercial tools tend to be problematic in the context of policing. In police work, scholars have described an entrenched conflict between “craft” and “science,” i.e. between a form of cultural-organizational authority that rests on tradition, expertise, and experience on the one hand, and a scientific form of authority that rests on data, theories, models, and algorithms on the other.Footnote45 In the particular case of predictive policing, it has been shown how machine-generated recommendations for action (i.e. where to police based on risk estimates) can spark resistance among patrol officers, in some cases even leading to the deliberate disregard of risk areas during patrol activities.Footnote46 As one senior analyst put forward, they would “never trust a machine” and would find it “helpful if decision criteria [used by the software] were made more explicit” (I 03). The craft/science divide is likely to become aggravated when knowledge production is no longer understandable and plausible.Footnote47 As one respondent framed it: “Our colleagues have a critical attitude towards [software] and demand transparency, they want to be able to understand what is going on from a technical point of view: how risk estimates are generated and what they mean.” (I 12)

Public accountability

A final theme in the arguments against the use of commercial software revolves around the understandability of police work to outside audiences. Accountability, i.e. the ability to give an account of one’s actions, has been deemed “the hallmark of modern democratic governance” as it provides a mechanism to hold those in power responsible for “their acts and omissions, for their decisions, their policies, and their expenditures.”Footnote48 As such, it is directly linked to the ability to understand the knowledge and decision-making processes that have led to a particular intervention – and there is thus a direct connection between police skepticism towards black boxes and the democratic duty of the police to remain accountable for their actions. And while accountability in police contexts has been described as an ambivalent concept that at times tends to be obscured rather than actively fostered,Footnote49 one senior officer summarized the importance of accountability capacities in police work as follows:

Especially when we use new technologies, algorithmic decision-making: police accountability is key. When someone asks us how we came to a decision, we need to be able to answer to that in concrete terms. We can’t say: ‘The machine told me what to do.’ (P 61)

To achieve this, interviewees highlighted that their departments were looking for a “methodology that is transparent and comprehensible for third persons” (I 12). It should be noted here that methodology refers to the abstract principles (i.e. theories, models, algorithms, and data) that underpin predictive policing software, not to concrete cases of risk estimation. The latter would arguably be reserved for explication in actual accountability hearings rather than distributed proactively. Transparency about abstract principles can, however, also be seen as a trust-building measure towards the public – something that makes sense given the considerable skepticism voiced against predictive policing by NGOs and other civil society organizations.Footnote50 Notably, interviewees also considered the wider repercussions of transparency and accountability in technological systems for democratic processes. As one analyst put forward, their department was aware during software development of the implications of non-black-boxed processes for courts:

We want to make this understandable for everyone, also when it comes to court proceedings: for the prosecutor, for the judge. How does the algorithm work? How did we arrive at this conclusion? What data have been processed and why? How did we compute an increased risk? (I 31)

One evaluation report of predictive policing software even came to the conclusion that any kind of data-driven, algorithmically mediated analysis in police work should remain independent, open, and transparent – or, in other words, sovereign:

A fundamental decision [for the in-house development of predictive policing software] was to keep the methodology for prediction theory-based, within the responsibility of the police, and to use independent, open systems. […] Keeping analytical processes with [department], having a flexible system and the relevant technical and professional competencies will unlock future potential for new approaches to preventive policing. (D 142)

Accountability is in this sense seen not only as a democratic value in itself, but also as an enabler of structural developments within the department. If accountability relies on the professional capacity to build open and transparent tools, so the rationale goes, then this capacity must be built and established in the first place. Such a perspective is arguably not one that is genuinely interested in the value of accountability but rather in securing the budget to build in-house expertise. Nevertheless, it could in the long run contribute to police technology that is more in line with democratic principles.

Conclusions

Based on empirical data from a multi-year research project, this article has investigated how the police themselves frame and legitimize their choices not to use, or to discontinue using, commercial software products for predictive policing purposes and instead develop their own in-house alternatives. In doing so, the article adds a novel layer to the literature on technology and democracy by engaging the perspective of state actors. In explicating police-internal frictions and lines of argumentation, the analysis shows how police departments do not necessarily opt for the technological tools that promise to be most effective and efficient, but also take into consideration larger concerns about their own role in, and vis-à-vis, society. The analysis thus challenges the widespread notion that the use of digital technologies in public sector contexts is primarily contested by civil society, while state actors and industry unanimously push for their implementation.

Conceptually, the analysis presented here allows us to reconsider the technology choices and underlying assumptions of state actors. The traditional understanding of relationships between state actors and the private sector is shaped by the assumption that while certain tasks can be delegated, state actors formally stay in control in order to maintain sovereignty. Such delegation, and the considerations and choices that go with it, is, as the analysis presented here has shown, also crucial with regard to the tools that state actors use for their tasks. While no tools are ever “neutral,” data-driven and algorithmically mediated applications are particularly challenging as they do work that used to be carried out by humans. In doing so, they implicitly challenge the sovereignty assumptions that underpin the role of the police, i.e. the production and maintenance of public order. Moreover, they potentially undermine the link between the public accountability of state authorities and the principle of democratic control.

The analysis of how state actors (in this case: the police) attempt to stay in control of complex technological tools speaks to multiple core themes in democratic societies: autonomy/self-determination, understandability, and accountability are fundamental democratic concepts that can easily be undermined by black-boxed and irretraceable knowledge production processes. As mentioned in the beginning, there is no guarantee that state-owned technology will in fact be “better” than private sector products. It might even involve the “right” choices for the “wrong” reasons. After all, the impact of technology cannot be evaluated in a vacuum but must be studied empirically through the formation of socio-technical relations – and much more research is required at the nexus of technology and democracy.

The reluctance to use third-party technologies for analytical purposes does, however, tie in neatly with more recent conceptual advancements in regard to digital sovereignty, particularly with regard to the relation between state actors, private companies, and citizens. Moreover, it speaks closely to the importance of technology and design when it comes to safeguarding key democratic principles. And while it remains at least questionable whether “state-owned” technology would by default be more compliant with democratic principles than private sector technology, the theme brings to the fore a thus far understudied aspect of the nexus of technology and democracy: the perspective of the state actors involved. Through its actor-focused empirical perspective, this article explicates the struggles and rationales of the police rather than portraying them as either “good” (i.e. democratic) or “bad” (i.e. a threat to democracy).

Almost needless to say, the relation between technology, policing, and democracy remains an ambiguous and multi-faceted one. On the one hand, we have seen throughout the analysis how police departments, in their role as democratically legitimized actors in the maintenance of particular social orders, are concerned with how private sector technology might potentially undercut democratic principles. On the other hand, replacing private sector tools with in-house developments might resolve issues around data sharing, black boxes, and accountability capacities – but it might not resolve general concerns regarding technologically mediated police powers and their potentially larger detrimental effects on democratic societies. While this has not been the primary concern of this article, the ambiguities of both public security agencies and technological tools within democratic contexts should be kept in mind vis-à-vis questions of digital sovereignty, especially in light of the further digitization and platformization of the work of the police and other security actors.Footnote51

Acknowledgments

This manuscript has benefitted profoundly from the support of many colleagues. In particular, I would like to thank the guest editors of this Special Issue of Democratization, Irem Tuncer Ebetürk, Jelena Cupać, and Hendrik Schopmans, for extensive pre-submission feedback and notes. Further, I am indebted to the constructive engagement of two anonymous reviewers and Jeff Haynes and the editorial team at Democratization.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Matthias Leese

Matthias Leese is Assistant Professor for Technology and Governance at the Department of Humanities, Political and Social Sciences, ETH Zurich, Switzerland.

Notes

1 Bruce Bimber and Homero Gil de Zúñiga, “The Unedited Public Sphere”; Chris Marsden, Trisha Meyer, and Ian Brown, “Platform Values and Democratic Elections.”

2 Joanna Redden, “Democratic Governance in an Age of Datafication: Lessons from Mapping Government Discourses and Practices,” 3.

3 Shama Ams, “Blurred Lines: The Convergence of Military and Civilian Uses of AI & Data Use and its Impact on Liberal Democracy”; Jędrzej Niklas and Lina Dencik, “What Rights Matter? Examining the Place of Social Rights in the EU’s Artificial Intelligence Policy Debate.”

4 Zygmunt Bauman et al., “After Snowden: Rethinking the Impact of Surveillance”; David Lyon, “Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique.”

5 Stephen Ackroyd et al., New Technology and Practical Police Work; Janet Chan, “The Technological Game: How Information Technology is Transforming Police Practice.”

6 Simon Egbert and Matthias Leese, Criminal Futures: Predictive Policing and Everyday Police Work.

7 Lyria Bennett Moses and Janet Chan, “Algorithmic Prediction in Policing: Assumptions, Evaluation, and Accountability.”

8 Andrew Guthrie Ferguson, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement.

9 Bianca Herlo et al., eds., Practicing Sovereignty: Digital Involvement in Times of Crises (Bielefeld: Transcript, 2021).

10 James Byrne and Gary T. Marx, “Technological Innovations in Crime Prevention and Policing: A Review of the Research on Implementation and Impact.”

11 Catherine Crump, “Surveillance Policy Making by Procurement”; Anastasia Konina, “Promoting Human Rights in the Context of Police Procurement: A Study of Predictive Policing Instruments.”

12 Thomas J. Biersteker, “State, Sovereignty, and Territory.”

13 Herlo et al., Practicing Sovereignty: Digital Involvement in Times of Crises.

14 Luciano Floridi, “The Fight for Digital Sovereignty: What It Is, and Why It Matters, Especially for the EU,” 377; see also Ramesh Srinivasan and Peter Bloom, “Tech Barons Dream of a Better World- Without the Rest of Us.”

15 Patrik Hummel et al., “Data Sovereignty: A Review”; Stephane Couture and Sophie Toupin, “What Does the Notion of “Sovereignty” Mean When Referring to the Digital?”

16 Couture and Toupin, “What Does the Notion of “Sovereignty” Mean When Referring to the Digital?,” 2306.

17 Ibid., 2317.

18 Ibid., 2310; Julia Pohle and Thorsten Thiel, “Digital Sovereignty,” 55.

19 Floridi, “The Fight for Digital Sovereignty: What It Is, and Why It Matters, Especially for the EU.”

20 Ibid., 371.

21 Detlef Nogala, “The Future Role of Technology in Policing.”

22 Crump, “Surveillance Policy Making by Procurement.”

23 Julie Ayling and Peter Grabosky, “When Police Go Shopping,” 667.

24 Konina, “Promoting Human Rights in the Context of Police Procurement: A Study of Predictive Policing Instruments.”

25 Ackroyd et al., New Technology and Practical Police Work; Chan, “The Technological Game: How Information Technology is Transforming Police Practice.”

26 Peter K. Manning, “Technological Dramas and the Police: Statement and Counterstatement in Organizational Analysis.”

27 James C. Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed.

28 Arne Hintz, Lina Dencik, and Karin Wahl-Jorgensen, Digital Citizenship in a Datafied Society.

29 Hummel et al., “Data Sovereignty: A Review,” 10.

30 Egbert and Leese, Criminal Futures.

31 Marcus, “Ethnography In/Of the World System.”

32 Sylvia Marlene Wilz, “Die Polizei als Organisation”; Egbert and Leese, Criminal Futures: Predictive Policing and Everyday Police Work, 7.

33 Walter L. Perry et al., Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations.

34 Mareile Kaufmann, Simon Egbert, and Matthias Leese, “Predictive Policing and the Politics of Patterns.”

35 Jan Terpstra, Nicholas R. Fyfe, and Renze Salet, “The Abstract Police: A Conceptual Exploration of Unintended Changes of Police Organisations.”

36 Ferguson, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement.

37 Bennett Moses and Chan, “Algorithmic Prediction in Policing: Assumptions, Evaluation, and Accountability.”

38 Egbert, “Siegeszug der Algorithmen?”; Leese, “Predictive Policing in der Schweiz.”

39 “Near repeat” refers to the assumption that there is an increased likelihood that residential burglary offenses will trigger further offenses in the close vicinity, see Michael Townsley, Ross Homel, and Janet Chaseling, “Infectious Burglaries: A Test of the Near Repeat Hypothesis.”

40 Hummel et al., “Data Sovereignty: A Review,” 1.

41 Mark Maguire, “Criminal Statistics and the Construction of Crime.”

42 James Hawdon, “Legitimacy, Trust, Social Capital, and Policing Styles: A Theoretical Statement”; Tom R. Tyler, “Trust and Legitimacy: Policing in the USA and Europe.”

43 Bruno Latour, Science in Action: How to Follow Scientists and Engineers Through Society; Trevor J. Pinch, “Opening Black Boxes: Science, Technology and Society.”

44 Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information; Matthias Leese, “The New Profiling: Algorithms, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union.”

45 James J. Willis and Stephen D. Mastrofski, “Improving Policing by Integrating Craft and Science: What Can Patrol Officers Teach Us About Good Police Work?”; Jerry Ratcliffe, Ralph B. Taylor, and Ryan Fisher, “Conflicts and Congruencies Between Predictive Policing and the Patrol Officer's Craft.”

46 Egbert and Leese, Criminal Futures: Predictive Policing and Everyday Police Work.

47 Terpstra, Fyfe, and Salet, “The Abstract Police: A Conceptual Exploration of Unintended Changes of Police Organisations.”

48 Mark Bovens, “Public Accountability,” 182.

49 Trevor Jones, “The Accountability of Policing.”

50 Rosamunde van Brakel, “Pre-Emptive Big Data Surveillance and its (Dis)Empowering Consequences: The Case of Predictive Policing”; Mark Andrejevic, “Data Collection Without Limits: Automated Policing and the Politics of Framelessness.”

51 Simon Egbert, “Predictive Policing and the Platformization of Police Work.”

Bibliography

  • Ackroyd, Stephen, Richard Harper, John A. Hughes, Dan Shapiro, and Keith Soothill. New Technology and Practical Police Work. Buckingham/Philadelphia: Open University Press, 1992.
  • Ams, Shama. “Blurred Lines: The Convergence of Military and Civilian Uses of AI & Data Use and its Impact on Liberal Democracy.” International Politics, online first (2021). doi:10.1057/s41311-021-00351-y.
  • Andrejevic, Mark. “Data Collection Without Limits: Automated Policing and the Politics of Framelessness.” In Big Data, Crime and Social Control, edited by Aleš Završnik, 93–107. Milton Park/New York: Routledge, 2018.
  • Ayling, Julie, and Peter Grabosky. “When Police Go Shopping.” Policing: An International Journal of Police Strategies & Management 29, no. 4 (2006): 665–690.
  • Bauman, Zygmunt, Didier Bigo, Paulo Esteves, Elspeth Guild, Vivienne Jabri, David Lyon, and R. B. J. Walker. “After Snowden: Rethinking the Impact of Surveillance.” International Political Sociology 8, no. 2 (2014): 121–144.
  • Bennett Moses, Lyria, and Janet Chan. “Algorithmic Prediction in Policing: Assumptions, Evaluation, and Accountability.” Policing and Society 28, no. 7 (2018): 806–822.
  • Biersteker, Thomas J. “State, Sovereignty, and Territory.” In Handbook of International Relations, edited by Walter Carlsnaes, Thomas Risse, and Beth A. Simmons, 2nd ed., 245–272. Los Angeles/London/New Delhi/Singapore/Washington DC: Sage, 2013.
  • Bimber, Bruce, and Homero Gil de Zúñiga. “The Unedited Public Sphere.” New Media & Society 22, no. 4 (2020): 700–715.
  • Bovens, Mark. “Public Accountability.” In The Oxford Handbook of Public Management, edited by Ewan Ferlie, Laurence E. Lynn Jr., and Christopher Pollitt, 182–208. Oxford: Oxford University Press, 2005.
  • Byrne, James, and Gary T. Marx. “Technological Innovations in Crime Prevention and Policing: A Review of the Research on Implementation and Impact.” In Technology-led Policing, edited by Evelien de Pauw, Paul Ponsaers, Kees van der Vijver, Willy Bruggeman, and Piet Deelman, 17–40. Antwerpen/Apeldoorn/Portland: Maklu, 2011.
  • Chan, Janet. “The Technological Game: How Information Technology is Transforming Police Practice.” Criminology & Criminal Justice 1, no. 2 (2001): 139–159.
  • Couture, Stephane, and Sophie Toupin. “What Does the Notion of “Sovereignty” Mean When Referring to the Digital?” New Media & Society 21, no. 10 (2019): 2305–2322.
  • Crump, Catherine. “Surveillance Policy Making by Procurement.” Washington Law Review 91, no. 4 (2016): 1595–1662.
  • Egbert, Simon. “Predictive Policing and the Platformization of Police Work.” Surveillance & Society 17, no. 1/2 (2019): 83–88.
  • Egbert, Simon. “Siegeszug der Algorithmen? Predictive Policing im deutschsprachigen Raum.” Aus Politik und Zeitgeschichte 67, no. 32–33 (2017): 17–23.
  • Egbert, Simon, and Matthias Leese. Criminal Futures: Predictive Policing and Everyday Police Work. London/New York: Routledge, 2021.
  • Ferguson, Andrew Guthrie. The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement. New York: New York University Press, 2017.
  • Floridi, Luciano. “The Fight for Digital Sovereignty: What It Is, and Why It Matters, Especially for the EU.” Philosophy & Technology 33, no. 3 (2020): 369–378.
  • Hawdon, James. “Legitimacy, Trust, Social Capital, and Policing Styles: A Theoretical Statement.” Police Quarterly 11, no. 2 (2008): 182–201.
  • Herlo, Bianca, Daniel Irrgang, Gesche Joost, and Andreas Unteidig, eds., Practicing Sovereignty: Digital Involvement in Times of Crises. Bielefeld: Transcript, 2021.
  • Hintz, Arne, Lina Dencik, and Karin Wahl-Jorgensen. Digital Citizenship in a Datafied Society. Cambridge: Polity Press, 2018.
  • Hummel, Patrik, Matthias Braun, Max Tretter, and Peter Dabrock. “Data Sovereignty: A Review.” Big Data & Society 8, no. 1 (2021): 1–17.
  • Jones, Trevor. “The Accountability of Policing.” In Handbook of Policing, edited by Tim Newburn, 693–724. Cullompton/Portland: Willan Publishing, 2008.
  • Kaufmann, Mareile, Simon Egbert, and Matthias Leese. “Predictive Policing and the Politics of Patterns.” British Journal of Criminology 59, no. 3 (2019): 674–692.
  • Konina, Anastasia. “Promoting Human Rights in the Context of Police Procurement: A Study of Predictive Policing Instruments.” McGill GLSA Research Series (forthcoming).
  • Latour, Bruno. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge: Harvard University Press, 1987.
  • Leese, Matthias. “The New Profiling: Algorithms, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union.” Security Dialogue 45, no. 5 (2014): 494–511.
  • Leese, Matthias. “Predictive Policing in der Schweiz: Chancen, Herausforderungen, Risiken.” In Bulletin zur schweizerischen Sicherheitspolitik, edited by Christian Nünlist and Oliver Thränert, 57–72. Zurich: Center for Security Studies, 2018.
  • Lyon, David. “Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique.” Big Data & Society 1, no. 2 (2014): 1–13.
  • Maguire, Mark. “Criminal Statistics and the Construction of Crime.” In The SAGE Handbook of Global Policing, edited by Ben Bradford, Beatrice Jauregui, Ian Loader, and Jonny Steinberg, 206–244. London/Thousand Oaks/New Delhi/Singapore: Sage, 2016.
  • Manning, Peter K. “Technological Dramas and the Police: Statement and Counterstatement in Organizational Analysis.” Criminology: An Interdisciplinary Journal 30, no. 3 (1992): 327–346.
  • Marcus, George E. “Ethnography In/Of the World System: The Emergence of Multi-Sited Ethnography.” Annual Review of Anthropology 24 (1995): 95–117.
  • Marsden, Chris, Trisha Meyer, and Ian Brown. “Platform Values and Democratic Elections: How Can the Law Regulate Digital Disinformation?” Computer Law & Security Review 36 (2020): 1–18.
  • Niklas, Jędrzej, and Lina Dencik. “What Rights Matter? Examining the Place of Social Rights in the EU’s Artificial Intelligence Policy Debate.” Internet Policy Review 10, no. 3 (2021): 1–29.
  • Nogala, Detlef. “The Future Role of Technology in Policing.” In Comparisons in Policing: An International Perspective, edited by Jean-Paul Brodeur, 191–210. Aldershot: Avebury, 1995.
  • Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press, 2015.
  • Perry, Walter L., Brian McInnis, Carter C. Price, Susan C. Smith, and John S. Hollywood. Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations. Santa Monica: RAND Corporation, 2013.
  • Pinch, Trevor J. “Opening Black Boxes: Science, Technology and Society.” Social Studies of Science 22, no. 3 (1992): 487–510.
  • Pohle, Julia, and Thorsten Thiel. “Digital Sovereignty.” In Practicing Sovereignty: Digital Involvement in Times of Crises, edited by Bianca Herlo, Daniel Irrgang, Gesche Joost, and Andreas Unteidig, 47–67. Bielefeld: Transcript, 2021.
  • Ratcliffe, Jerry, Ralph B. Taylor, and Ryan Fisher. “Conflicts and Congruencies Between Predictive Policing and the Patrol Officer’s Craft.” Policing and Society 30, no. 6 (2020): 639–655.
  • Redden, Joanna. “Democratic Governance in an Age of Datafication: Lessons from Mapping Government Discourses and Practices.” Big Data & Society 5, no. 2 (2018): 1–13.
  • Scott, James C. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven/London: Yale University Press, 1998.
  • Srinivasan, Ramesh, and Peter Bloom. “Tech Barons Dream of a Better World - Without the Rest of Us.” In Practicing Sovereignty: Digital Involvement in Times of Crises, edited by Bianca Herlo, Daniel Irrgang, Gesche Joost, and Andreas Unteidig, 23–46. Bielefeld: Transcript, 2021.
  • Terpstra, Jan, Nicholas R. Fyfe, and Renze Salet. “The Abstract Police: A Conceptual Exploration of Unintended Changes of Police Organisations.” The Police Journal 92, no. 4 (2019): 339–359.
  • Townsley, Michael, Ross Homel, and Janet Chaseling. “Infectious Burglaries: A Test of the Near Repeat Hypothesis.” British Journal of Criminology 43, no. 3 (2003): 615–633.
  • Tyler, Tom R. “Trust and Legitimacy: Policing in the USA and Europe.” European Journal of Criminology 8, no. 4 (2011): 254–266.
  • van Brakel, Rosamunde. “Pre-Emptive Big Data Surveillance and its (Dis)Empowering Consequences: The Case of Predictive Policing.” In Exploring the Boundaries of Big Data, edited by Bart van der Sloot, Dennis Broeders, and Erik Schrijvers, 117–141. Amsterdam: Amsterdam University Press, 2016.
  • Willis, James J., and Stephen D. Mastrofski. “Improving Policing by Integrating Craft and Science: What Can Patrol Officers Teach Us About Good Police Work?” Policing and Society 28, no. 1 (2018): 27–44.
  • Wilz, Sylvia Marlene. “Die Polizei als Organisation.” In Handbuch Organisationstypen, edited by Maja Apelt, and Veronika Tacke, 113–131. Wiesbaden: VS Verlag für Sozialwissenschaften, 2012.