
Data Matters: The Politics and Practices of Digital Border and Migration Management


Data matter more than ever in the regulation of borders and migration. An apt illustration of how movement is enabled or restricted by data collection and analytics was recently reported by Eyal Weizman, founding director of the London-based research agency Forensic Architecture, which specialises in the production and analysis of evidence about human rights violations by state and corporate actors. Prior to a business trip to Miami, where he was supposed to open Forensic Architecture’s first major exhibition in the US that, among other things, displayed investigations into a CIA drone strike in Pakistan and police killings of black US citizens, Weizman was notified that his visa waiver request had been denied and that he would not be allowed to enter the United States.

Upon further inquiry at the US embassy in London, he was informed that he had been flagged as a ‘security threat’ by an algorithm looking for suspicious patterns in applicants’ data. While officials at the embassy could not tell Weizman what exactly had triggered the unfavourable judgement by the algorithm, they suggested that “it could be something [he] was involved in, people [he] was in contact with, places to which [he] had travelled (had [he] recently been in Syria, Iran, Iraq, Yemen, or Somalia or met their nationals?), hotels at which [he] stayed, or a certain pattern of relations among these things” (Weizman Citation2020, n.p.). Weizman was subsequently encouraged to provide the US Department of Homeland Security with details on individuals or connections that could point in the direction of terrorism or organised crime in order to purify himself and eventually be able to travel again. Since the digital records that had prompted the algorithm to flag Weizman as a security risk concerned personal and professional networks and connections that informed investigative work into human rights violations – including those committed by US institutions and their allies – Weizman declined to provide this information.

Weizman’s case forcefully illustrates how the “datafication of mobility and migration management” (Broeders and Dijstelbloem Citation2016) reconfigures what Hyndman (Citation2012) calls the “geopolitics of migration and mobility”, i.e. the creation and maintenance of differential access to and experiences of mobility and related life opportunities along intersecting lines of culture, class, capital, gender, race, and political orientation. Digital devices and the data they produce preconfigure who gets to be a member of the privileged “kinetic elites” and who is sorted into the “kinetic underclasses” (Adey Citation2006). At the current conjuncture of more and more data being produced and mobilised for purposes of border control and migration management worldwide, there is an urgent need to empirically engage with the power dynamics and effects of processes of datafication and to analyse their implications – not only for operational logics and practices of borders, but also for the wider geopolitics of mobility.

This is the aim of this Special Issue (SI) on ‘Data Matters’. As the contributions to the SI demonstrate, data are informed by and carry different political rationales (e.g., security, public health, economics), they are a testament to different forms and temporalities of political regulation and intervention (e.g., prevention, reaction, long-term planning), they involve and connect different actors (e.g., intelligence services, humanitarian organisations, mobile individuals), and they help to enact and regulate various types of mobility (e.g., business travel, tourism, migration). By and large, data determine how we can or cannot move through the world and whether we are considered to be threats, risks, victims, or assets – and none of us are exempt from data politics and practices (Bigo, Isin, and Ruppert Citation2019; Scheel, Ruppert, and Ustek-Spilda Citation2019).

Data also play a key role in the proliferation of borders beyond geopolitical demarcation lines, constituting delocalised and multi-sited assemblages that have been described as “deterritorialized” (Lahav and Guiraudon Citation2000), dispersed (Huysmans Citation2011) or “virtual” (Vukov and Sheller Citation2013) borders. These transformations go hand in hand with shifts in border functions. In many regards, border control has become a practice of data-driven knowledge production that serves to facilitate processes of social sorting, risk assessment, and prevention, thereby informing the differentiation of variegated forms of movement. Based on analytics, today’s borders seek to accelerate and facilitate certain forms of mobility (e.g., business travel, tourism) while decelerating and inhibiting others (e.g., asylum seekers, ‘irregular migration’, security risks). Data are thereby mobilised for the identification and classification of individual subjects, often long before they start their journeys (Amoore Citation2011; Bigo and Guild Citation2005; Leese Citation2014).

This SI contributes to current debates in border and migration studies with innovative conceptual work and new empirical insights. Conceptually, it explores how we can grapple with the proliferation of data for purposes of border and migration management in theoretical terms. It investigates how the analytical vocabulary from fields such as critical data studies, media studies, and science and technology studies (STS) can be productively integrated into the study of borders and migration. In empirical terms, it sets out to systematically investigate the socio-technical composition, operational logics, practices, and politics of datafied border control and migration management. We propose to do so along three lines of inquiry that foreground how digital data and technologies transform the government of mobility: (1) By rendering data as a matter of concern and questioning their ontological status and political effects; (2) by asking how data come to matter in knowledge production and decision-making; and (3) by interrogating the matter of data, i.e. the material artefacts and infrastructures facilitating their production, cleaning, storage, sharing, and analysis. Working through these three registers, in this introduction we elaborate on our understanding of ‘Data Matters’, situate it in existing literature, and identify avenues for future research.

Data as a matter of concern

In this first section we turn to a problematisation of data themselves. In recent years, reflexive approaches in data and software studies have criticised understandings of data as mere representations of pre-existing, external realities. Instead, scholars subscribing to reflexive accounts argue that data enact – that is, bring into being and perform – the realities they allegedly only account for and describe (Kitchin Citation2014a; Ruppert Citation2011; Scheel, Ruppert, and Ustek-Spilda Citation2019). Such a perspective accounts for the fact that data are always actively produced in particular ways, which is why they can in fact never be “raw” (Gitelman Citation2013) but are always already “cooked” (Kitchin Citation2014b, 26). This renders the cooking process – the recipes, the cooks, the kitchenware, and the ingredients – central for analyses and interventions into the role of data in the constitution and government of realities. Speaking with Latour (Citation2004), we should not approach data as a settled matter of fact, but rather unsettle them and treat them as a “matter of concern”. Scholars have thus called for studying and explicating the often invisible or invisibilised practices and infrastructures through which data are produced, cleaned, circulated, certified, imputed, linked, and matched (Bowker et al. Citation2010; Ruppert and Scheel Citation2021; A. MacKenzie Citation2017).

Treating data as a matter of concern means to challenge the widespread “data-realism” (Glouftsios and Scheel Citation2021) in border control and migration management, i.e. the belief that data held about migrants and travellers objectively represent their identities, bureaucratic trajectories, and past and future itineraries. Attending to the practices, networks, and associations through which data are produced, shared, and put to use is also important in political terms, as it allows scholars to move towards a mode of analysis and critique that attends to the performative effects of data. In brief, the idea of performativity posits that not only speech acts, but also modes of presenting oneself in public space (Butler Citation1993) or knowledge practices in economics, medicine, and other scientific fields (Latour and Woolgar Citation1986; Mol Citation2002; D. MacKenzie Citation2006) help to constitute and perform the very realities to which they refer. In the case of border control and migration management, performativity implies that “the ontologies of mobile bodies (i.e. the meanings and identities attached to them) are performed by control practices enacted through digital means” (Glouftsios and Scheel Citation2021, 10).

A situated analysis of the socio-technical data practices and related assemblages through which data are produced and analysed highlights how different data practices produce different data and thus different accounts of the real (Scheel, Ruppert, and Ustek-Spilda Citation2019). Focusing on data and related knowledge practices as primary research objects, as Glouftsios (Citation2018) puts it, allows us to “observe that what is conventionally understood as a pre-given singular reality […] is in fact not singular at all: it multiplies”. Data practices thus closely correspond with what Mol (Citation2002) calls “ontopolitics”, i.e. a politics of the real that is concerned with the question of how mobility realities are enacted as objects of government and political intervention (cf. Scheel Citation2021). The performative dimension of data requires scholars to take seriously the processes of data production, circulation, and analysis, which must not be reduced to administrative or technical questions but are politically and ethically charged. Paying attention to the generative powers of data allows for showing that data and politics can in fact not be separated (Bigo, Isin, and Ruppert Citation2019).

Consequently, it is important that scholars turn data into a matter of concern, particularly in those instances where the debate seems to be closed and where data tend to be accepted as indisputable, objective matters of fact that could speak for themselves. To push Latour’s (Citation2004) argument for a constructivist mode of critique somewhat further, approaching data as a matter of concern invites scholars to enter the thing and become part of, and interfere in, the gatherings that produce, shape, and make up data in specific ways and with particular effects. Scholars in fields like data and software studies, media studies, or cultural studies have developed a range of concepts and approaches that have not yet been fully exploited by border and migration scholars, but that prove helpful to turn data into a matter of concern and expose them to political debate.

For example, recent works in data feminism (D’Ignazio and Klein Citation2020; Leurs Citation2017) embrace an intersectional feminist perspective to investigate how particular data practices help to maintain and reinforce existing power asymmetries and inequalities in terms of age, (dis-)ability, class, race, and gender. Data feminist analyses ask who benefits from the production, circulation, and analysis of data and in which ways, who owns data, who shapes related production processes, and who in turn becomes silenced, monitored, or exploited by data produced in one way or another. In light of these questions it is no coincidence that feminist thought constitutes a central source of inspiration for research on data justice (Dencik et al. Citation2019; Taylor Citation2017). It allows for an approach to data as a social justice concern that takes the lived experiences and social struggles of marginalised and vulnerable communities as entry points to assess digital technologies and data practices in terms of the injustices and power asymmetries they help to enact and sustain. Such a feminist-inspired approach does not risk “neutralising issues of social justice to questions of technology, effectively promising technological solutions to social problems” (Dencik, Jansen, and Metcalfe Citation2018, n.p.), but rather politicises them and opens them up for intervention.

For instance, the operators of border and migration databases, such as the European Union’s Agency for the Operational Management of Large-Scale IT Systems in the Area of Freedom, Security and Justice (eu-LISA), register the refusal of data subjects such as asylum seekers to provide comprehensive information about themselves only in terms of noise and “dirty data” (Steyerl Citation2019) that have to be identified and cleaned from their systems. Hence, the political claims for data justice carried by such practices of refusal get erased in a technical language of data quality or failure-to-enrol rates. The lived experiences of marginalised data subjects offer, in contrast, a valuable political-analytical counter-point that allows scholars to avoid the epistemic trap of getting ensnared in the technocratic and depoliticising language of policy reports and to expose relations of domination and oppression upheld by border security-related data practices and infrastructures in order to enact them as unsettled, contested matters of concern.

Driven by a concern for such data (in)justices, Metcalfe (Citation2021, this issue) advances debates on the “autonomy of migration” (Mezzadra and Neilson Citation2013; Scheel Citation2019), an approach that calls for analysing border regimes from migrants’ perspective with a focus on their border struggles. By combining this approach with Castoriadis’ work on the imaginary institution of society, Metcalfe demonstrates how alternative imaginaries of fingerprints animate and inform asylum seekers’ practices of subversion in Greece. While emphasising the subjective dimension of migration, the autonomy of migration literature has so far not sufficiently considered the role of alternative imaginaries in migrants’ border struggles. Based on extensive fieldwork in Greece, Metcalfe shows how many asylum seekers try to keep their fingerprints “small” as a way to avoid a future deportation back to Greece under the Dublin regulation, which is enforced with the help of the biometric Eurodac database. On this basis, Metcalfe uses asylum seekers’ alternative imaginaries, and the practices of resistance animated by them, as a source to enact both Eurodac and the Dublin system as matters of concern.

Other important sources for enacting data as a matter of concern are discussions about “technocolonialism” (Madianou Citation2019), “data colonialism” (Couldry and Mejias Citation2018) and alternative “data epistemologies” (Ricaurte Citation2019). The notion of colonialism highlights how the appropriation of data for purposes of government and profit facilitates new forms of exploitation, domination, and dependency. Starting from the observation that, unlike oil, data are not a substance that can simply be extracted from nature but have to be appropriated from people, Couldry and Mejias (Citation2018, 337) argue that data colonialism involves a “new type of appropriation that works at every point in space where people or things are attached to today’s infrastructures of connection”. While they develop their argument in relation to commercial platforms like Facebook, the concept of data colonialism is also useful to grasp data relations in the context of border control and migration management, where it is, for instance, routinely taken for granted that people’s biometric data are readily available and can be used for purposes of identification by state authorities (van der Ploeg Citation2011).

Based on an analysis of the delivery of financial assistance to refugees in Greece via cash cards, Tazzioli (Citation2020, this issue) shows that asylum seekers feature as both passive surfaces of data extraction and active interlocutors requested to participate in data production. This twofold role resonates with the paradoxical injunction to which asylum seekers are subjected by the cash card system: On the one hand, they are called upon to act as autonomous subjects (i.e. as consumers equipped with financial means provided via the cash card), and, on the other hand, to accept the spatial and disciplinary restrictions imposed on them by the eligibility criteria of the cash card system (i.e. to stay in Greece and refrain from ‘secondary movements’ to other EU member states). Through a careful analysis of the processes of data extraction and circulation in the cash card system, Tazzioli explicates how asylum seekers and migrants are turned into a source of value extraction by state and corporate entities.

What is illustrated by Tazzioli’s contribution is an important aspect of data colonialism that has only received scant attention so far. As many scholars have noted, innovative digital technologies tend to be piloted and tested on populations in the global South, often under humanitarian pretexts and slogans such as “data for good” or “tech4good” (Broeders and Taylor Citation2015; Jacobsen Citation2015; Jacobsen and Sandvik Citation2018). However, the imperial, (neo-)colonial impetus of the datafication of border control and migration management also shows itself in related projects and initiatives in the global North that primarily target people coming from the global Souths (Glouftsios and Scheel Citation2021). Aradau (Citation2020, this issue) mobilises the lens of experimentality to investigate how the deployment of rather mundane digital devices in borderzones, such as the use of Skype for the pre-registration of asylum claims in Greece, reconfigures social relations between authorities and mobile subjects. Experimentality emerges as a rationality of government that keeps mobile subjects in a debilitating state of precarity through the “injection” of mundane (digital) devices, such as mobile apps or cash cards. Importantly, these devices do not just recompose the relations between authorities and refugees, but they also insert them within wider circuits of “platform capitalism” (Srnicek Citation2016), as they enable the production, extraction, and exploitation of “surplus data” (Aradau Citation2020, this issue) by private companies.

These examples show that scholars in data and software studies, as well as in neighbouring fields such as media studies, cultural studies, and STS, have developed numerous approaches and concepts that allow us to frame and enact data as a matter of concern. Rather than getting dazzled by claims of data realism and the promises of high-tech solutions for social and political problems, scholars should take their cue from these lines of thought and follow Haraway’s (Citation2016) dictum “to stay with the trouble” in the analysis of datafied border control and migration management.

How Data Come To Matter

In this section we engage with the question of how data and analytics reconfigure decision-making and the production of ‘truth’ in border control and migration management. Butler’s (Citation2004) figure of the “petty sovereign” has long served as a popular trope to analytically grapple with decision-making at the border, as it resonates closely with the discretionary power of border guards (Hall Citation2012; Heyman Citation2009; Salter Citation2008). For Butler (Citation2004, 56), petty sovereigns are bureaucrats or mid-level officials who perform a type of quasi-sovereign power and render “unilateral decision[s]” within otherwise largely governmentalised structures. The border guard as a petty sovereign is the one who makes a final decision as to who gets to cross the border unbothered and who is stopped and interrogated as a possible security threat (Amoore Citation2006; Epstein Citation2008; Jones Citation2009). Decision-making in border control and migration management has in this sense been read against the notion of Agamben’s (Citation2005) state of exception, where, in the de facto legal vacuum of the border zone, the authority of the state official must be surrendered to unconditionally (Salter Citation2012).

More recently, scholars have however drawn attention to how decision-making processes at the border become transformed through data and analytics (Ajana Citation2015; Metcalfe and Dencik Citation2019; Pötzsch Citation2015). A major rationale of mobilising data at the border is to enhance situational awareness about flows of mobility and the potential risks associated with them, and to support decisions about border crossings. As Leese (Citation2020, this issue) shows in his analysis of the governmental rationales behind the EU’s interoperability framework, data are politically conceived as key for the identification and assessment of travellers. Based on information about an individual’s biographic features (e.g., name, sex, country of origin) as well as previous encounters with authorities, they are used as a means to render mobile populations knowable and governable. Deciding with and through data thereby inevitably transforms decision-making processes and restructures them as entanglements of humans and data.

Building on recent dialogue between critical border and security studies and STS (Bellanova, Jacobsen, and Monsees Citation2020; Evans, Leese, and Rychnovská Citation2021; Scheel, Ruppert, and Ustek-Spilda Citation2019), we contend that approaches, insights, and analytical sensitivities from STS can provide border and migration scholars with a rich methodological and conceptual resource to investigate how data reconfigure the production of knowledge and authority. Starting from the notion that the social and the technical should not be conceived as separate spheres but rather as constitutive of sociotechnical systems that are shaped by interactive dynamics between human and non-human elements (Law Citation1991), STS literature has proposed to analyse the interplay of data, organisational structures, and human experience and expertise through concepts such as “co-production” (Jasanoff Citation2004), “actor-network” (Latour Citation2005a), or “intra-action” (Barad Citation2007). What these approaches have in common is the assumption that there is no predefined epistemic (and analytical) prioritisation of any element, but that knowledge and authority emerge in entangled, distributed, and at times messy ways. While we see increasing analytical attention to the interplay between humans and data in the constitution and maintenance of border control and migration management, there is still a shortage of research into how data come to matter within concrete decision-making frameworks at the border.

A notable exception is Hall’s (Citation2017) study of data-supported decision-making at the UK border. Hall shows how human discretion is transformed through data analytics that provide an assessment of the traveller to the border guard. In doing so, she demonstrates how knowledge production through data is turned into a visual practice that largely preconfigures human assessments of a situation or an individual. Although no actual decision is made by the analytics and human discretion remains formally in place, data exert a strong form of persuasive power that might well nudge the decision rendered by a border guard in a specific direction. Hall’s analysis neatly ties in with findings from other domains such as drone warfare (Chamayou Citation2015; Walters Citation2014) or policing (Egbert and Leese Citation2021; Ferguson Citation2017) where data-driven decision-support has also been shown to have strong effects on human decision-making. Hence, processes of datafication highlight the need to go beyond anthropocentric understandings of discretion and to recast it as a socio-technical issue (Ustek-Spilda Citation2019) in order to grasp how border guards make decisions in relation to the capacities, constraints, and affordances of digital technologies and data.

Loukinas’s (Citation2021, this issue) analysis of the use of drones for data production on migrant movements across the Mediterranean serves as a starting point for further inquiry into how different socio-technical constellations emerge around different use-cases of drones and the production of knowledge for separate domains (e.g., human rights; repressive security measures). What he describes as the constitution of the “multipurpose drone” indeed speaks closely to specific modes of seeing, knowing, and acting that become pre-structured through the interaction between human actors, technology, and data. For Loukinas, the alleged “versatility” of drone technology can thus be mobilised as an analytical lens to reconstruct how diverging types of knowledge and ensuing action can be constituted from a single body of data, depending on the specific relations that data form with their surroundings.

Besides novel socio-technical constellations that are able to nudge decision-making in a certain direction, the use of data analytics for purposes of border policing brings about another political complication: the opacity of knowledge production. When complex, mutable algorithms analyse large amounts of data, the ways in which recommendations for specific actions (e.g., let this person pass; single out and further question that person) come into being can become difficult or even impossible to understand (Amoore Citation2011; Côté-Boucher Citation2010; Leese Citation2014). A lack of insight into analytical processes in turn undercuts the possibility of challenging and contesting decisions. While discretionary decisions made by border guards are certainly also difficult to challenge, in ‘analog’ settings at least there is a shared understanding of how human reasoning works. The mobilisation of data at scale does, however, fundamentally transform the discursive and legal space of the border: as discretion dissolves into black-boxed algorithmic processes and decisions come into being in distributed ways through interactions between humans and machines, it becomes increasingly difficult to establish accountability. Careful empirical analysis of how knowledge and authority are co-produced by humans and machines within wider assemblages of digital borders that comprise diverse human and non-human actors such as databases, algorithms, back-office analysts, and front-line border guards is thus much needed.

A second major theme in how data come to matter in border control and migration management is the constitution of novel “regimes of proof” (Sriraman Citation2018), both for purposes of identification and for the determination of credibility. Data are increasingly used as evidence that supports or falsifies claims made by travellers or asylum seekers. Whereas such claims would in the past need to be manually investigated and their acceptance or rejection would be left to the discretion of border guards or bureaucrats (Magalhães Citation2018), data are now considered to be telling an unquestionable truth. They constitute, as Amoore (Citation2009) puts it, a “digital alter ego” that can be invoked by border control authorities as a reference for inquiries into legitimate/illegitimate forms of mobility. The most notorious example of such truth-telling through data is the proliferation of biometric databases as an allegedly infallible proof of identity in border control and migration management. Biometric recognition systems operate on the (very much contested) assumption that certain bodily features such as fingerprints, iris patterns, or DNA are unique to a specific person and allow for the unambiguous identification of that person, thus establishing truth through the datafication of the body (Epstein Citation2007; Muller Citation2005; Leese Citation2020, this issue).

Cheesman (Citation2020, this issue) attends to another example of new forms of evidence for identity claims. She examines the implications of nascent ‘self-sovereign identity’ (SSI) schemes for border and migration management in order to discuss the promises and emancipatory potentials that are attributed to this innovative technology in the field of refugee protection. In the humanitarian sector, SSI – a blockchain-based method of identification – is advocated as a means to potentially empower marginalised groups like refugees by providing them with credentials for official, certified identities that offer individuals ownership over their identity information. Since they rely on shared databases that distribute authority across a network of nodes, distributed ledger technologies like blockchain make it possible, at least in theory, to dispense with the established model of a centralised (state) authority that validates identity claims. However, drawing on an analysis of discourses in the aid industry and documentation of pilot projects in Thailand and Jordan, Cheesman shows that SSI is still a contested, embryonic technology whose promise of providing data sovereignty to refugees sits in tension with the claimed monopoly of nation-states to assign and verify officially recognised identities. Nevertheless, the experimentation and testing of ‘SSI solutions’ for refugees shows that we are witnessing profound changes in methods of identification and related regimes of proof that are likely to transform the organisation and operational logics of border control and migration management.
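
The decentralised verification logic invoked by such schemes can be made more tangible with a minimal sketch. It does not describe any deployed SSI system and uses invented field names throughout: an issuer signs a credential with its private key, and any verifier who holds the corresponding public key (which SSI designs typically anchor on a distributed ledger) can check the credential without querying a central registry. The sketch relies on Python’s cryptography package.

```python
# Minimal sketch of the credential-signing pattern behind 'self-sovereign
# identity': verification needs only the issuer's public key, not a query
# to a central authority. Field names and payload are invented.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# 1. The issuer (e.g., an aid organisation) creates a key pair and signs a credential.
issuer_key = Ed25519PrivateKey.generate()
credential = json.dumps({"subject": "did:example:holder-123",
                         "claim": "registered_household_member",
                         "issued": "2020-05-01"}).encode()
signature = issuer_key.sign(credential)

# 2. The holder stores credential + signature (e.g., in a wallet app)
#    and presents them to a verifier.

# 3. The verifier checks the signature against the issuer's public key,
#    which SSI schemes typically resolve via a distributed ledger.
issuer_public_key = issuer_key.public_key()
try:
    issuer_public_key.verify(signature, credential)
    print("credential accepted: signature matches the issuer's public key")
except InvalidSignature:
    print("credential rejected: altered or not signed by this issuer")
```

The sketch makes a narrow point: verification can, in principle, be decentralised, while the political question of who counts as a trusted issuer in the first place remains untouched.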

Apart from purposes of identification, data are also used to investigate the credibility of migrants’ accounts, for example in the context of asylum claims. Data are in this context regarded as a means of consolidation (or falsification) of narratives that would otherwise be difficult to validate from the point of view of a bureaucrat in the country where an asylum application was filed. In order to establish the ‘truthfulness’ of individual accounts, data sources such as social media or the smartphones of migrants are supposed to reveal details about their countries of origin, their journeys, social relations, and activities (Latonero and Kift Citation2018; Noori Citation2020). The credibility of data thus becomes valued more highly than that attributed to human storytelling, particularly in contexts of “institutionalized distrust” (Griffiths Citation2012; Scheel Citation2020) that tend to prevail in asylum and migration procedures. Similar conceptions of credibility have also been observed in other contexts, for example when a hit in a biometric database such as the Visa Information System (VIS) is considered as more truthful than the identity claims of the person concerned (Glouftsios and Scheel Citation2021).

In summary, data inform and transform decision-making in border control and migration management in multiple ways. Independently of whether they are mobilised for decision support, for identification, or for the investigation of credibility, data shift the ways in which knowledge and authority are brought into being in technologically mediated ways. While not formally doing away with discretion, data have the capacity to nudge border and migration practices in specific directions and to surpass professional expertise and experience – for good and for bad. Analytically, this implies a need to investigate the specific ways in which data come to matter in border and migration regimes. The contributions in this SI present a much needed first step in this direction and should, in the best case, be considered as an inspiration for further inquiry.

The Matter of Data

In this final section we engage with the need to consider the material aspects of data and analytics. While early accounts of digitisation in border control and migration management predicted that the border would likely lose some of its significance as a physical site (Koslowski Citation2005; Lyon Citation2005; Vaughan-Williams Citation2010), its material footing very much persists. More than that: apart from considerations of the physical and architectural characteristics of border demarcations and crossing points, the turn towards data and analytics has brought new materialities to the centre of attention. Digital bordering and the data it generates rely on a widespread network of technical devices, databases, and networked connections. More recently, scholars have started to acknowledge and conceptualise these “data assemblages” (Kitchin Citation2014b) that underpin borders and migration management.

For Kitchin, data assemblages are composed of multiple, mutable elements that are entangled in multifarious, shifting ways. They include modes of thinking, artefacts of knowledge production such as reports, graphs, or statistical tables, business models (of IT-providers and platform owners), laws and legal regulations, technical affordances, standards and system requirements, software interfaces, end-users at computer work stations, related communities of practice, routinised ways of doing, and so forth. Equally, assemblage thinking highlights the processes and practices of assembling and re-assembling datafied border control and migration management. In other words, it shifts our analytical attention to what Walters (Citation2011) has called “technological work”, i.e. the design, implementation, and maintenance activities that are required for databases to operate continuously and without breakdown (Glouftsios Citation2020).

In the context of borders and migration, data assemblages have been described as “heterogeneous and open‐ended groupings of material and semiotic elements that do not form a coherent whole” but rather interact in dynamic and contradictory ways without following an overarching logic, thereby illustrating the “ontological multiplicity of borders” (Sohn Citation2016, 184). To conceive of and research contemporary border control and migration management as data assemblages opens up a whole range of new research questions as it shifts the attention to the tools and infrastructures through which data are produced and put to use. For example, if data that shape decision-making at the border are produced by an instrument such as a fingerprint scanner and a biometric matching system, interpreting the data and ‘hits’ produced by the system requires an understanding of the instrument and other devices that it is connected with: What do the sensors of fingerprint scanners detect, and under which conditions? In which instances do they fail to enrol a subject? What are the false acceptance and false rejection rates of the biometric matching system? And are frontline staff aware of the fact that the two correlate and that every biometric recognition system inevitably generates ‘false positives’ and ‘false negatives’ (Noori Citation2018)?
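
To make these questions, and the correlation between the two error rates, more tangible, the following minimal sketch uses invented similarity scores rather than output from any real biometric system: a single match threshold simultaneously determines how many impostors are falsely accepted and how many genuine subjects are falsely rejected, so lowering one error rate raises the other.

```python
# Illustrative sketch only: how a single match threshold produces both
# 'false positives' (impostors accepted) and 'false negatives' (genuine
# subjects rejected). Scores are invented; real systems use comparison
# scores produced by a biometric matching algorithm.

genuine_scores = [0.91, 0.84, 0.77, 0.68, 0.95, 0.59]   # same person vs. stored template
impostor_scores = [0.32, 0.41, 0.55, 0.62, 0.28, 0.47]  # different person vs. stored template

def error_rates(threshold):
    false_rejections = sum(s < threshold for s in genuine_scores)
    false_acceptances = sum(s >= threshold for s in impostor_scores)
    frr = false_rejections / len(genuine_scores)    # false rejection rate
    far = false_acceptances / len(impostor_scores)  # false acceptance rate
    return far, frr

for threshold in (0.4, 0.5, 0.6, 0.7):
    far, frr = error_rates(threshold)
    print(f"threshold={threshold:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
```

Whichever threshold an operator chooses thus trades one error against the other, which is why the error rates of a biometric matching system are as much a political parameter as a technical one.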

As this example suggests, the materiality of data can be studied through the devices that are used to produce, store, exchange, and process information about mobility (Amicelle, Aradau, and Jeandesboz Citation2015). Border crossing points have been equipped with technical devices that read and capture fingerprints, irises, or facial characteristics; with document scanners that extract data stored on the chips of electronic travel documents; and with hardware and user interfaces that assist border guards in managing the queues in front of them (Sontowski Citation2018). In the background, databases at the local, national, and supranational level relate different bodies of data to each other and make them remotely accessible. All of these devices rely on some kind of material connection: servers stay connected via secured communication networks; fingerprint scanners, cameras, and other sensors depend on glass fibre cables that transmit template images and enable comparison with existing data; and computer workstations make sure that data can be accessed in a timely and reliable fashion.

Vis-à-vis these developments, scholars have over the past years started to explore the ways in which technologies of border control are being designed and developed (Bourne, Johnson, and Lisle Citation2015; Lisle Citation2018; Noori Citation2018) and how they become implemented into already existing infrastructures and policies (Glouftsios Citation2018; Jeandesboz Citation2016; Leese Citation2018). A particular focus has been put on large-scale information systems and interoperability (Broeders Citation2007; Broeders and Hampshire Citation2013; Leese Citation2020, this issue), reinforcing the diagnosis that borders are multiplying and extending their functionalities towards the inside and the outside of the territory (Bigo Citation2008; Follis Citation2017; Walters Citation2002). The European databases for border control and migration management are an apt example of these tendencies. Consisting of EU-wide systems that collect data from all member states, they are connected to national databases in each participating country, paired with scanning and capturing technologies deployed at all border crossing points. Input for EU databases, however, is not only produced and accessed at border check posts, but also in consulates across the globe (e.g., for the processing of visa applications), in national migration administrations or by law enforcement authorities – and increasingly also in ‘mobile’ settings such as during border checks in trains and on cruising vessels, or in ad-hoc asylum registration facilities or refugee camps.

The study of databases closely resonates with a conceptual turn to infrastructure in border and migration studies. Infrastructures can be defined as the “built networks that facilitate the flow of goods, people, or ideas and allow for their exchange over space” (Larkin Citation2013, 328). They include the likes of “roads, railroads, education systems, computer networks” and other “routinized media through which information and commands are transmitted” (Mann Citation2008, 358) that provide the grounds for the seamless functioning of everyday life. Transferred to the realm of border control and migration management, the concept of infrastructure hence allows us to tease out the role of material elements that facilitate the production, storage, sharing, and processing of data.

Infrastructures, as Star (Citation1999) has put it, are in many ways “singularly unexciting” and they often remain below the threshold of (analytical) attention unless there is infrastructural breakdown that renders technologies and services unusable. There is, however, a distinctly political element to the construction and implementation of infrastructures. For Mann (Citation2008, 335), infrastructures are key for questions of power and government, as they allow for the enactment of policy across the territory of a state, thus providing “the capacity of the state to actually penetrate civil society and implement its actions”. Recent research on data infrastructures has particularly highlighted the shifts in power relations that emerge once data are channelled into the networks of national and supranational bureaucracies. Studying the registration and identification ‘hotspots’ at the EU’s external borders, Pollozek and Passoth (Citation2019) as well as Pelizza (Citation2020) have for example shown how emerging IT infrastructures create and process data in order to sort and channel refugees and asylum seekers, but also how they perform statehood and difference.

Bellanova and Glouftsios (Citation2020, this issue) engage with the politics of data infrastructures by showing how the Schengen Information System (SIS) and the data it stores emerge as objects that require maintenance and care. Through their focus on the potential shortcomings of data infrastructures, for example in terms of data quality and technical components, they draw our attention to the low-level and often invisible, yet constant efforts that are needed to keep infrastructure up and running. What their analysis highlights is both the fragility of data infrastructures and the importance of maintenance and repair in the make-up of borders, thus depriving digital border control apparatuses like the SIS of their alleged power and omnipotence. It thus closely resonates with Latour’s (Citation2005b, 49) dictum that “endurance is what has to be obtained, not what is already given by some substrate, or some substance”. Rather than as smooth and monolithic structures, border and migration databases should in fact be understood as fragile and susceptible to failure and breakdown (Sontowski Citation2018).

The perspective introduced by Bellanova and Glouftsios also speaks to the idea of “infrastructural inversion” as proposed by Bowker (Citation1995). For Bowker, explicating the largely invisible work of infrastructures can serve as an analytical angle for rendering them an object of inquiry, but also for exposing them as objects of multiple processes of challenge and contestation. Infrastructures not only connect and relate different actors and as such perform a key function in the constitution of data assemblages; in light of the truth claims associated with data and the knowledge produced with and through them, studying the material underpinnings that mobilise data in the first place also provides us with an important tool for questioning the (politically) assumed infallibility of data and analytics.

In summary, attending to the material dimension of datafied border control and migration management yields several analytical benefits. Firstly, it shows that digital border arrangements tend to be fragile and prone to failure and breakdown. Secondly, a focus on the instruments and infrastructures through which data are produced and put to use allows us to highlight how data do not pre-exist their generation. And thirdly, paying attention to the materiality of the encounters between border crossers and authorities shows that these encounters are not only mediated by various human actors who are not directly involved in the border encounter (i.e. the back-office staff and IT specialists mentioned above), but also by a range of non-human actors that equally shape decision-making at sites of border control. Hence, paying attention to the matter of data offers multiple points of (analytical) entry and ample opportunities for negotiation and resistance vis-à-vis digital forms of mobility governance that are much less consistent and infallible than their proponents want to make us believe.

Conclusions

We have proposed here to engage with ‘Data Matters’ in contemporary border control and migration management through three connected registers: (1) Through the enactment of data as a matter of concern by questioning the constitution and performative effects of data; (2) through inquiries into the specific ways in which data come to matter in knowledge production and decision-making processes, i.e. unravelling their role in risk assessment, identification, and regimes of proof; and (3) through a focus on the matter of data, i.e. the material underpinnings of data and analytics. The contributions to this SI engage with these three registers in different ways and through a variety of empirical cases. What unites them is that by taking data matters seriously, they push the boundaries of current understandings of the role of data in contemporary forms of mobility governance.

At the same time, the contributions to this SI draw attention to the fact that inquiries into datafied border control and migration management raise methodological questions. Understanding borders and migration as digital assemblages requires us to attend to a multiplicity of actors that are involved in the regulation of mobility, including data scientists, back-office analysts, or maintenance workers. Engaging with their professional worlds and tools requires a good deal of technical, organisational, and bureaucratic understanding. Moreover, the involvement of private companies as vendors, suppliers, or contractors arguably further complicates matters, as they tend to have little interest in engaging with researchers (and are under no legal requirement to do so). And finally, even if those issues can be resolved, a crucial question remains how to study algorithmic modes of analysis that constantly change – and are also shaped and reconfigured by those who work with them – with the effect that it becomes difficult to retrace how exactly knowledge comes into being.

With regard to suitable concepts and theory-building, we have argued that border and migration studies can benefit from the vocabulary of neighbouring disciplines that have laid the groundwork for engagement with digital data and their implications. Drawing on literature from data and software studies and STS will, we contend, be particularly fruitful for conceptual reconsideration of contemporary forms of border control and migration management in light of the continuing onslaught of data. Thinking about the impact of data in terms of their performativity, in terms of their socio-technical fabric, and in terms of their infrastructural embeddings will enable border and migration scholars to enrich and go beyond existing debates.

The contributions to this SI demonstrate that empirical inquiry into digitised border control and migration management is sorely needed. As they show, data have considerable impacts, either directly or in mediated ways, on the lives and life chances of individuals. For some, as in the example of Weizman’s denied visa waiver request, such interference might be nothing more than a nuisance that can quickly be addressed and resolved (or, in his case: ignored). For others, though, data prestructure decisions about the division between the inside and the outside, i.e. whether asylum claims are considered credible, whether an identity is confirmed to be ‘true’, whether a return decision will actually result in a deportation, or simply whether one is considered a threat or an asset. In light of the massively expanding amounts of data produced and mobilised in border control and migration management, normative concerns are thus no longer limited to dwindling levels of privacy and intimacy or to legal issues of data retention periods. Rather, they strike at the very core of how we as individuals appear before state authorities and how we come to be empowered to or excluded from moving through the world.
