
People’s strategies for perceived surveillance in Amsterdam Smart City

Pages 1467-1484 | Received 16 Apr 2018, Accepted 30 Apr 2019, Published online: 15 May 2019

ABSTRACT

In this paper, we investigate people’s perception of datafication and surveillance in Amsterdam Smart City. Based on a series of focus groups, we show how people understand new forms of hypervisibility, what strategies they use to navigate these experiences, and what the limitations of these strategies are. We show how people tried to discern between public and private sector actors, to differentiate whom they trusted by building on the existing social contract. People also trusted the objectivity of data in relation to prior experiences of social contexts and discrimination. Lastly, we show how the experiences of some of the inhabitants in our study who were most vulnerable to hypervisibility highlight the limits of strategies based on the neutrality of data. By asking about perceived surveillance rather than emphasising actual practices of surveilling, we show differentiated contexts and strategies, providing empirical grounds to question the dominant technical framing of smart cities.

Introduction

In the smart city of Amsterdam, when you ask how long it takes to get somewhere, the answer will usually be given in the number of minutes it takes to get there by bicycle. With the world-renowned cycling infrastructure, as well as the entrenched social norms to treat cyclists with equal priority to cars, bicycles are a major element of what life in the city is like. It is common knowledge that it is incredibly frustrating and disruptive when one’s bicycle will, inevitably, get stolen. How people choose to keep their bicycle safe, then, makes visible the ways in which they navigate their experience of the city.

In our discussion with EU immigrants, Pietro felt safer parking his bicycle in an area with CCTV cameras, but Daniel¹ “wouldn’t have wanted a camera watching him”. This paper outlines many contradictions and nuances in people’s relationship towards surveillance in the city. People were very uncertain about how new forms of data are being produced and integrated for urban governance and profit, unsure of what exactly was going on and whether they had influence in responding. When surveillance is experienced differently, how should we understand how people may respond to smart city strategies?

In recent years much has been made in the literature about the phenomenon of, and narratives around, smart cities and the datafication and digitization of urban governance, yet comparatively little research has engaged directly with people’s own experience. In this paper, we use empirical findings from our research on people’s subjective perspectives to argue for a “thicker” debate around smart-city datafication and surveillance. Based on a series of focus group discussions, we show how people understand and experience new forms of hypervisibility, what strategies they use to make sense of and navigate those experiences, and what the limitations of these strategies are, before connecting these experiences to wider debates. We begin the paper with a brief discussion of the Amsterdam Smart City initiatives, then place the smart urbanism literature in context, before elaborating on our empirical methods and findings.

By intentionally seeking the perspectives of citizens – and thus asking about perceived surveillance rather than placing the emphasis on actual practices of surveilling – our empirical research uncovers the extent of residents’ awareness and agency in the processes of urban datafication. The diversity of people’s views shows that these new forms of urban governance inherit trust from the historical social contract between people and their government. Even though cracks in the system offer an opportunity for people to disentangle increasingly overlapping information flows, people are not clear as to what the implications of datafication are. Our focus on experience offers an understanding of differentiated contexts and strategies, and questions the dominant technical framing of smart cities. A better understanding of these different practices of meaning-making advocates for a contextualized use of datafication in governance, and a contextualized understanding of data governance itself.

We contribute to the urban studies literature by bringing together an account of the processes and aims of urban datafication with accounts of the lived experience of that datafication. Our project sought to understand how people understand their “right to the datafied city” (Shaw & Graham, 2017), in the context of this historical background of protest. As Holvast stated in an interview for this project, during the protests of 1970 people referred to “our privacy”, but in the 2000s this has shifted to “my privacy”. We offer an account of what it means to experience, or lose, privacy in contemporary urban space, but also make the case for demanding a thicker account of people’s autonomy in relation to urban surveillance, which can encompass both privacy and the mutual shaping of people and city through technology, and resistance to it.

Amsterdam Smart City

Amsterdam was Europe’s first municipality to launch a Smart City program, building on earlier Digital City projects (Dameri, 2014). The city also has a long history of activism in relation to digitization. One example of this is the debate that began in the late 1960s, recorded by the historian and activist Jan Holvast, in relation to the extension of the Dutch census questionnaire in combination with the first computerization of its analysis (Holvast, 2013). The activists realized that the expanded and digitized census would be more intrusive than previous censuses: it would be possible to cross-tabulate results in ways that would identify people as undocumented, for example. The census used punchcards, the same technology used to categorize the Jewish population and facilitate the Holocaust. Amsterdam led a national census boycott, with the result that the Dutch census was abandoned and replaced with a municipal registration system. This combination of a municipal register with the datafication of city functions and infrastructure is worth exploring because of these past tensions.

Amsterdam has continued to be a city that shows some of the most sophisticated debates about the implications of digitization of city functions and services for everyday urban life, and during the period of this study (2014–15) those debates were intensifying and becoming more visible in the public sphere due to increased investment by the municipality in digitization. The Amsterdam Smart City program is a public-private initiative of the Amsterdam Economic Board (AEB), which articulates its mission as “increasing prosperity and well-being” (AEB, n.d.) through connecting the private sector with knowledge institutes and government. The AEB’s mission focuses on six priority areas, all through an economic lens and with a logic of economic growth: the circular economy, digital connectivity, energy, health, mobility and “jobs of the future” (AEB, n.d.). Addressing social questions such as diversity and inequality, or shaping and understanding the social impacts of datafication, is not part of the stated portfolio of the driving institution of Amsterdam’s smart city project. The smart city is run by an Economic Board, not a technical or a social one, and its articulated purpose is encouraging business engagement, supported by funds and knowledge from the public sector. One of the main organizations operationalizing the AEB’s policy is the Amsterdam Metropolitan Solutions (AMS) center, an initiative composed of business partners and a core of computer science and economics professors who are at the forefront of “engineering the future city” (TUDelft, n.d.). AMS’s discourse frames the smart city as a complex adaptive system to be engineered, where “The city and partners share urban data and Amsterdam allows the researchers to use their city as a living lab and testbed”.

These conditions epitomize the divide between “smart city” policies in urban planning and governance, where datafication is framed as a purely economic and technical phenomenon, and social policy, which frames problems as social and political. By taking Amsterdam as a case study, this paper aims to interrogate this dichotomy by identifying the points at which people encounter the effects of smart city policies and understanding how lived experiences of urban datafication align, or diverge from, “smart” visions of the city.

Critical perspectives on smart city surveillance

The dominant framings of “smart city” development point to a post-political space of technological solutionism (Graham, 2002; Morozov, 2014), where processes of datafication and digitization have been promoted for their promise to increase efficiency, convenience, safety, and in some cases, citizen participation (Vanolo, 2014; Verrest & Pfeffer, 2018). The vision evokes orderly precision and control, optimization of resource allocation, and uninhibited flows of information, people, and materials (Luque-Ayala & Marvin, 2016). As these are all facilitated by embedded sensor networks, frictionless flows of data, and analytics, there is a strong current for facilitating private sector involvement in the creation of urban governance through surveillance (Kitchin, Lauriault, & McArdle, 2015; Sadowski & Pasquale, 2015).

The smart city ideal and related modes of implementation have been critiqued for de-politicizing the urban (Greenfield, 2013). When society is conceived of in terms of systems thinking, shocks to the “system” are naturalized and thus the space is no longer up for discussion (Welsh, 2014); on the contrary, the only possible solutions are technological (Morozov, 2014) and by extension invite a dominance of private investment for development (Söderström, Paasche, & Klauser, 2014). The shift from data-informed to data-driven urbanism has reinforced an instrumental rationality and realist epistemology which naturalizes surveillance (Kitchin, 2016) and further obscures transparency in decision-making processes (Brauneis & Goodman, 2017). The transfer of power from the public sector to the commercial-municipal nexus has the effect of removing the role of public contestation in shaping the city’s development, assuming consensus, and furthering urban entrepreneurialism (Harvey, 1989). The efficiency framing of much smart city discourse positions the citizen as the taxpayer whose funds must be conserved through the application of technology, rather than the person to be represented and served within the city.

Murakami Wood (2017) has described how smart cities constitute surveillant assemblages. Aside from claims of efficiency and innovation, these assemblages also serve the function (as Amsterdam’s census boycott highlighted) of sorting and classifying people within the city. This function has led to critiques of the new dynamics of discrimination inherent to database classification systems, and the risk of large-scale surveillance at the cost of losing individual and group privacy (Barocas & Selbst, 2016; Bowker & Star, 1999; Datta, 2015; Gitelman, 2013; Greenfield, 2013; Murakami Wood, 2017). From this perspective, the aim of creating frictionless mechanisms of social ordering and control at best sidetracks, and at worst destroys, the city as a site of political and cultural mingling and emergence (Sadowski & Pasquale, 2015; Taylor & Richter, 2017).

The return of the social is present in smart city debates: in particular, Verrest and Pfeffer (2018) present a new call for critical smart urbanism, arguing that the scholarly field deconstructing the “smart city” as a policy concept would benefit from re-engaging with the literature on critical urbanism. Relevant issues include how publics are constructed in urban policy (Cowley, Joss, & Dayot, 2018), and how the “smart citizen” may be discursively deployed to justify policy (Shelton & Lodato, 2019), in ways that fail to address inequality, given that cities still face “how to deal with the widening problem of social inequalities in part caused by their own processes” (Hollands, 2008). Another possibility is to find a framing that includes the non-citizen as well as the citizen, in order to explore a variety of experiences. Rather than paraphrasing technical solutionism and legitimizing dominant policy framings by investigating how technology can improve the urban way of life, critical smart urbanism shines a light on how the technological actively shapes and creates what it means to live in the city. Doing so requires acknowledging both how the socio-political context constructs urban problems and their solutions, and how the urban is constructed of non-material flows that also extend beyond local administrative boundaries. Our analysis, therefore, seeks to understand the lived experience of urban datafication: how it is shaped through specific modalities such as urban cycling and walking, and how this information and urban design might be brought to mutually shape each other.

Methodology

The empirical basis for this paper comes from a project conducted at the University of Amsterdam during 2014 and 2015.² The project, focusing on Amsterdam’s residents and their experiences of urban datafication in the context of “smart city” programs, employed an iterative ethnographic strategy for data collection and analysis. This involved two initial strands: first, observation at events sponsored by the city for technology vendors, such as the Smart City Event (2015). Second, we conducted interviews with 20 experts who were either scientists collaborating on the AEB’s and Amsterdam Metropolitan Solutions’ portfolio, local activists, or academic experts on privacy and urban datafication. We then conducted a scenario mapping exercise with the entire team of seven researchers to generate ideas of possible futures for urban datafication in Amsterdam, based on these two strands of data collection.

From those scenarios we generated questions for eight focus group discussions, selecting participants based on characteristics that we expected would place them in a critical or supportive relationship to urban datafication strategies. We did not aim for a representative sample of Amsterdam residents, but formed the topics for the focus groups inductively based on our findings from the first phase of the research, and invited people to participate through intermediaries including activist groups, representative organizations, and local contacts of the team. The eight groups were: technology developers; people who might be subject to ethnic profiling; non-citizens; older people; schoolchildren; self-employed people; sex workers; and people who did not use a smartphone. Participants chose freely to participate based on information provided by our contacts or intermediary organizations.

In the focus groups we aimed to find out what people were aware of in terms of digital data collection in urban space, how they understood Amsterdam’s “smart city” program, what kind of data collection they felt comfortable with in urban space and what kind of communication they would like to have with the city about its datafication processes. In the course of the focus groups questions were also raised by participants to both the researchers and to each other about the nature of datafication in the city, for example about mandates and responsibilities of governmental agencies, and the types of data being collected and how these may be used. As far as possible the researchers informed and educated the participants regarding these questions. However, we often also did not have clear answers. This aspect of the focus groups illustrated the need to know more about the structures of data governance (or lack thereof), and is itself indicative of the general sense of “navigating through the dark” that we discuss and illustrate in the next section.

While the study itself focuses on Amsterdam and focus group participants are all residents of the city, the discussion went beyond the administrative boundaries of the municipality in two ways. First, and naturally, in discussing experiences both participants and researchers made references to experiences from other cities, which illustrated well the point that the participant was making. One place referred to, for instance, was Stratumseind in Eindhoven, one of the main streets and exemplary “living labs” in the Netherlands for experimenting with smart city technologies. Second, and this is again indicative of the research subject itself, datafication and digitalization, as well as their perceived effects, cross established administrative and jurisdictional boundaries; immigrant focus groups discussed how their data, and the effects of the data’s interpretation and viewing, cross international boundaries. Each focus group lasted between one and three hours, and was recorded and then transcribed. We then conducted a thematic analysis of the focus group transcripts, coding for main themes, for “pivot points” where people recounted a particular insight, and for narratives that explained how people’s views of datafication were formed. Each researcher focused on at least two transcripts, sharing findings to check for inter-coder reliability.

Citizens’ perspectives on surveillance in Amsterdam

Our analysis of the focus group discussions revealed five themes. The first two relate to how datafication of the city is perceived: as a general feeling of uncertainty and hypervisibility, on one hand, but also in the form of cracks in the presumably smooth data flows, on the other. These cracks offer entry points, of sorts, for people to try and discern between public and private sector actors in order to differentiate whom they trust. The third and fourth themes relate to the strategies that people use to navigate the opaque territory of digital data networks that they engage with. The first strategy people use in response is to inherit trust from the existing social contract. The second strategy people showed was building an understanding based on trusting the objectivity of data in relation to prior experiences of social contexts and discrimination. Lastly, we show how the experiences of some of the most exposed inhabitants, sex workers, highlight the limits of strategies based on the neutrality of data.

Uncertainty and hypervisibility

As we asked people what they thought, conversations across all focus groups were tinged with a distinct uncertainty and unease. When reflecting on datafication and its uses, people felt hypervisible, particularly through the amounts, frequency, and diversity of information posted online:

“I feel extremely visible: check ins on Facebook, everything you post on Twitter, Google, that knows through your phone every step you take pretty much, everything you post using Gmail. I am pretty sure everything is scanned and collected and aggregated” [Professional from a large energy company]

While the above quote refers to hypervisibility through online media, the sense of hypervisibility also extended to interactions with the government. This was particularly prominent around the civic registration number³; for example, one immigrant stated:

‘Everybody already knows it. Everywhere you go and say your name or your birthdate, everyone knows and you get the feeling this is the number you have tattooed on you like a cow’ [EU immigrant group].

The sense of hypervisibility, alongside uncertainty about who sees and when, finds expression in the frequent use of “every” and “everyone.” The hypervisibility was often spoken about with tinges of fear and sadness. Rooted in a sense of uncertainty, these feelings were only amplified by fears around cybersecurity. Increased datafication and dependence on technology was seen as insecure because everything could be leaked and hacked:

‘Even if Facebook doesn’t share it, then there are hackers that can’ [Ethnic profiling group].

People did not know what was possible with data; neither what was technically feasible nor what was actually legal. What was striking was that focus group conversations were active discussions amongst participants to figure out what exactly the state of datafication in the city was, and what they understood data flows to be. It was not always explicitly stated as such, but the process of discussion reflects this unsettling awareness of not-knowing. For example, we provide here a snippet of a focus group discussion between five European immigrants and an interviewer, as they reflect on surveillance and which agencies know what about them.

“ [1] If you think about it it’s beyond that, government knows about your movement. Because of the OV chipkaart.⁴

[2] But is it necessarily the government?

[3] No, that’s not the government.

[…]

[5] For instance what I am wondering about., for example the tax office they do know what is on your bank account and in what you’re spending it and they actually check your income and expenditure for irregularities..etc. You do know that indeed nobody tells you to which degree this is happening and concretely how it is used and how the data is used.”

In discussion, people were trying to understand the processes of surveillance and the agencies involved. Amidst this searching, on one hand people were resigned, on the other people were aware of the lack of anonymity. Broad and cross-scalar surveillance leads to what Haggerty and Ericson (2000) refer to as the “disappearance of disappearance”, where anonymity is no longer possible. People were aware of the tenuousness of anonymization as a safeguard, and worried that processes for data protection could also be overridden:

“Anonymization: it doesn’t really exist, right? There is always a way to go back and find who is that person even if the data is anonymized and there is no identification.” [Tech developers group]

This is significant because the processes of surveillant data analysis in public space are often legitimized by the powerful claim of anonymity. Yet people’s wariness undermines the validity and acceptance of the discourse, and begins to shift the power balance between citizens and data analytics.

Together, uncertainty, hypervisibility and the feeling of powerlessness in the face of developments happening at a level way over their heads meant many people effectively gave up. People felt resigned, in the same way that they said they accept cookies when visiting websites because they have no option to opt out:

“If there is CCTV all over. In reality you’re always being watched. I mean if you’re uncomfortable I think you should stay home pretty much.” [EU immigrants group]

This suggests that no clear story is being told, at least in Amsterdam, about the extent to which data derived from an instrumented environment is being used to influence and make people governable – and that similarly there is no way for individuals to check how data is being used. Contrary to the image of a smooth-running, convenient, safe urban machine, from the perspective of citizens, then, moving through the digitized and datafied landscape of the city is akin to navigating a dark, unknown territory with a spotlight in one’s eyes. The only reliable information comes from moments of intervention by the authorities, but even then it is hard for an individual to check whether a particular intervention is data-driven or based on existing policy directions.

As the processes of datafied governance are changing, it is unclear to the city-dweller what the implications are or who the responsible actors are. The opacity of data collection and processing, and of data-driven decision-making, meant that our interviewees had little information about how their data doubles were being used. People do not have a clear grasp of what is going on, and instead (where they do know they are being monitored) feel distinctly uncertain about who is watching, while at the same time feeling hypervisible. The combination of feeling hypervisible, on one hand, and the uncertainty about how the data structures and links are organized in the smart city, on the other, creates a general sense of moving and navigating through opacity.

Discernment in opacity

We have seen that once people expressed their awareness and discomfort, the space of conversation often turned to people trying to discern and differentiate within the opacity of surveillance. This is akin to a decision-making process, largely based on considerations of whom and how to trust.

First, participants often spoke about “the government” as a monolithic entity without distinguishing between different departments, agencies or municipalities, and without seeing the myriad of bureaucratic checks and balances implemented to protect them. Datafication and digitization of government presented an integrated vision of a smart city governing them, a new form of a black box, where data were flowing in new ways and being connected in order for them to be made more visible, without a clear narrative of what the purposes of data processing were.

However, when data flows across institutional boundaries were not so seamless, the veneer of the smart city image began to crack, and people’s experience with bureaucracy began to introduce the possibility of discerning what was behind government platforms. For instance, several recounted how, while living in rented accommodation and requesting housing benefits (a relatively common practice in the saturated housing market), they discovered that over 20 people were registered at, but not living at, the same address. As a result, one stated:

‘I’ve always idealised the system as being perfectly integrated and communicating across each other and I have no evidence to really know if that’s true or not, but I’ve got the feeling that they are maybe not as seemingly integrated as I might have liked to think that they were’ [non-EU immigrant group].

From the perspective of the data analyst, data collection is far from seamless across institutions. There is no single “data” department of the city municipality, and in some cases, the municipality does not have the technical capabilities and so must outsource some of the analytics. For a private analyst seeking to build on the data flows of the smart city, it is a networked process of permissions from different institutions that must be navigated in order to be able to collect data from public space.

In response to this demand for negotiation, data analysts are seeking new streamlined ways to enable data collection. For example, we spoke with research institutes working on innovative solutions for urban management by mapping traffic flows, including pedestrian traffic. They stated that it is easier to collect data at privately-run public events, because in this case, the event organizer gives permission, rather than the city council. A privately sponsored event such as a concert or sports match effectively generates what has been termed “pseudo-public space” (Mitchell, 1995), which looks public but is regulated and policed as private space.

Despite these blurred distinctions, some participants did have an awareness that “the government” was certainly not a cohesive monolith and that there was a differentiation between the different organizations. The technology developers group, in particular, was highly aware of their own data flows, partly because of their technical knowledge, but also because as freelancers, they have to manage their own online visibility as part of their business. In the following excerpt, a participant reflects an understanding of the opacity of data flows for credit ratings as being embedded in contextual trust relations.

“If you are a tax payer in the NL you get a number in their system and depending on this number, it’s like a rating, they think you are more possible., it’s rating the possibilities that you commit fraud or stuff like that. The problem with the tax is that they don’t tell you where they get this information from and it’s even not written in the law what kind of things they are allowed to do […] The police is more transparent than the tax office and not a lot of people know this. I think most people trust the government but I don’t know if everybody trusts the tax office.” [Technology developers group]

In this example, the individual discerns different levels of opacity for different institutions. Suggesting that transparency correlates with trust would be speculating beyond the limits of the participant’s own words. However, for the scope of this paper, it is interesting that reflection on discerning differences in opacity is almost immediately tied with thinking about the different levels of trust in different institutions. This social context of levels of trust in different institutions remains one point of reference with which to navigate the changing landscape of urban datafication. Given people’s insecurities and the opacity of the actual state of hybrid data flows, these points of reference become the seeds of strategies to respond to this changing landscape. In the following sections, we turn to how these seeds grow and how different people navigate digital urban governance in their processes of meaning-making.

Trusting the social contract

How, then, do people navigate these feelings? The first strategy people used was trusting the historical social contract. Part of the way in which the increasing hybridity is experienced is through the dissolving of old categories – for instance, the differentiation between government/corporate and public/private space. Throughout conversations about datafication in the city, despite their uncertainties, people first tried to discern which social actors were involved, and then drew on known, historical sense-making devices, such as the distinction between government and private sector, to evaluate who and what to trust.

In particular, people made a distinction between the government and corporations, only discussing the latter as “Big Brother”. In general, people were rather trusting of the city government, as citizens or otherwise. Even as people were conscious of trading their information for convenience and services, and sometimes felt unsettled by it, there was an understanding that more data and the integration thereof leads to better service provision, and they trusted the government to do good.

‘I was very surprised how the system is organised and that every government/administrative unit knows everything about me starting from my bank account, my transport history and sometimes even they can know what kind of diet I consume because your purchase is directly related to your bank account. Sometimes I felt that I was being invaded all the time. But when I also saw what kind of services I get without having to go through a lot of hassle I appreciate and understand why things had to be structured and organised in that way. Because there are people coming in and picking up my garbage every 3 days’ [non-EU immigrant group].

From this perspective, increasing datafication would only be an extension of the historical practice of governance by nation states, whereby people accept civic registration in exchange for the provision of public services. Whilst it is somewhat hyperbolic to suggest the linking of administrative data to personal bank accounts, the quotation nonetheless suggests that by giving data, people were aware that the locus of control shifts more and more into the hands of the government. For some, this meant public safety, and was appreciated as reassuring:

‘It feels like there is a higher safety. It seems like for the fact that you know who lives where is sort of more under control and I like it more’ [non-EU immigrant group].

As well as valuing increased datafication for public safety as an end in and of itself, some people felt that public safety legitimized the instrumental value of the digitization of government. As focus group discussions turned to the limits and boundaries of privacy and what that would mean, even those at risk of being ethnically and religiously profiled and discriminated against were quite firm in their willingness to share information for public safety and anti-terrorism purposes:

‘As long as it doesn’t have a profit motive, they can even know my shoe size, they can really have all my information, as long as it only is about safety.⁵’ [Ethnic profiling group]

In trusting the government to provide services, safety, and welfare for its people, residents follow the logic of ideal-type democratic governance, whether consciously or otherwise. In return for votes, the elected politician, together with a Weberian bureaucracy, will distribute value across society in a fair manner in exchange for knowing details about its population. Dutch citizens, in particular, began to reflect upon their relationship with the state and their broader position in society. For example:

“The nature of the Dutch is indeed to trust the Government and the things they do and maybe is changing now a bit but not enough I think. […] Sometimes the Dutch are very furious about something, but when they get something in return or they find out the voting machines are very cheap/cheaper than counting manually, then people are willing to give their option for control in return. So it’s also a financial thing I think. The Dutch are very.., willing to give up a lot if they get something in return.” [Technology developers group]

The trust in government is therefore also dependent on historical regimes; as a European welfare state, the polity is historically geared towards mistrust of corporations in favor of the government, which would be quite the opposite in an American context. As the quote above clearly demonstrates, there is a strong cultural element. Thus, informed by cultural and national backgrounds, people draw on historical points of reference to make sense of datafication processes.

Across the focus groups, people generally trusted the government in exchange for services and especially safety and security, whether as an end in and of itself or in its instrumental value. For example, a data science student stated: “I have agreed to [being monitored in Stratumseind] because I have a Dutch passport. The government can do anything it wants with my information.” Here, there is an embedded assumption that the government has a natural right to citizens’ data, as opposed to the citizen having rights and data for which the state must barter in exchange for services. Secondly, there is an assumption that as Stratumseind is public space, it must be the government doing the surveilling. However, Stratumseind is actually a corporate living lab. The logics of being governed are thus transferred to where the logic no longer holds, and the role of the state in the digital is renegotiated to become an intermediary in trust relationships. Yet the situation is not so clear-cut; the patchworked hybridity of governance arrangements was also experienced and discussed by people as they became increasingly aware of the problems that such a historical translation entails in smart environments. This awareness may explain the second strategy people draw on to make sense of and navigate the changing smart environment.

Drawing on experiences of social profiling

The second strategy which people used to navigate their experiences of urban surveillance was to draw on their previous experiences of social context and discrimination. Because understanding is a social construction tempered by people’s context and experience, our findings reinforce the point that the ways in which people give meaning to the processes of datafication and profiling depend on pre-existing contexts. As such, whilst urban solutions may be using analytics and profiling as an efficient form of management for governance intended to be neutral, they are interpreted depending on social experience.

Concretely, the conversation around datafication for public safety purposes was nuanced by pre-existing structural biases. On the one hand, people felt their privacy was protected in data analytics for service provision, because the analysis was only done on the group level and was more about the number of people as a whole, rather than individual characteristics. For instance, tracking the number of people on trains in order to see which ones are busy was completely acceptable. Yet on the other, tracking in the realm of safety, anti-terrorism and predictive policing raised flags of discrimination. Especially those at risk of ethnic and religious profiling felt that processing data at the individual level was better, in order to distinguish between members of a shared group identity at risk of discrimination, such as Muslim men with beards.

‘I think it’s better we deal with individual cases, because if we talk about groupings, then I know for sure, it is simply the media, they influence people, and then people with beards are the first to be stamped as terrorists, and then these are the first to be followed’ [Ethnic profiling group].

The idea that individual targeting was preferred to group-based decisions found echoes in other places, particularly with one technical researcher, who preferred more data to be collected so that he could be profiled more accurately:

‘I’m scared of being quarantined in a data-poor environment. I don’t know if I’m more scared of city planning in a data poor environment or in a data rich environment. I don’t know which is weirder. Probably the poorer environment is again more problematic, because yeah, it’s less accurate. … so this might also be the incentive for people that they know if they give away their data, that they’re at least privileged in some respects, know that companies and municipalities etcetera have the correct data and they’re not on a false positive list of some kind.’ [Privacy Law researcher]

Whilst both these individuals agree that increased datafication will help to overcome bias and false positives, they do so for very different reasons. The man who fears his beard will mark him out as a potential security threat sees the people behind the process. He questions the assumptions that drive categorizations, based on his previous experience of social discrimination. The technology researcher, on the other hand, trusts that increased datafication will lead to better, more accurate profiling, overcoming the structural inaccuracies of insufficient training data.

People make sense of the new in light of their previous experiences; earlier we found that trust in government and accountability structures were carried over into hybrid arrangements, and here people base their interpretations of analytics on the experiences they have had as members of specific groups with memories of discrimination.

Going beyond the nuance of the ways in which people interpret datafication, some people’s experience also questions the assumption that data itself is more objective than other forms of representation in governance. That such an assumption is not sustainable, regardless of the quantity or detail of the data in question, and that there are pitfalls and risks embedded in giving precedence to one form of representation over another, are often seen first and foremost by those groups of people whose lives are characterized by frequent experiences of discrimination as a result of categorization.

Limits to the neutrality of data

Despite a recurring narrative that more data is instrumental to safety and convenience and may eliminate bias on the parts of both analysts and residents, data remains inherently political. The logic that more data is correlated with public safety fails at the fringes, where increased datafication may actually exacerbate problems for groups already vulnerable to visibility.

The most dramatic experiences of the changing politics of data were reported by some of those who are marginalized within the smart, governable city: sex workers. The group we spoke to were by far the most educated, opinionated and activist of all our focus groups, possibly because they are the focus of public health, spatial planning, law enforcement, and public security policy, and therefore frequently have to respond to both data-driven and traditional policy-driven interventions.

In the Netherlands, sex work is regulated at least partly as a legal profession, meaning that datafication plays a major role in legal and accounting administration. However, in order to avoid exposure sex workers tend to keep a distinction between their sex worker persona and their “wallet-name”, separating their data into silos to minimize crossover and the resulting risk of being “outed” and vulnerable to violence. The risk involved in juggling work and private identities is highlighted by reports that, by aggregating data from different parts of sex workers’ lives that they had taken great pains to keep separate, Facebook’s “people you may know” algorithm was revealing sex workers’ “vanilla” identities to sex work clients, and inversely outing them to family and friends (Hill, 2017).

Municipal administrative procedures do the same, by demanding registration of sex workers (though not of their clients) at every point in the process of legal sex work. For example, the recent Dutch Prostitution Regulation Law (WRP) has generated much discussion about licensing as a strategic move against human trafficking. Although a license is no longer mandatory for individuals, several opt for registration anyway in order to prevent their partners and family members dependent on their income from being suspected of human trafficking or exploitation. The very real risk of human trafficking makes it easy to use as an excuse for surveillance:

‘It’s that huge hypocrisy. When it’s about protecting our data it’s like, you want to be like any other job? And when it’s about violating our right it’s like ‘you’re special’ [..] One of our politicians has said that it’s OK for all our data to be there if it saves one human trafficking victim. It doesn’t matter if all sex-workers in the country do not have any privacy anymore, basically’ [Sex workers group].

These adverse consequences of the processes of datafication clearly highlight the uneasy tension between protecting trafficking victims and protecting sex workers, and the demands being made of sex workers are often contradictory in that, rather than providing protection, they actually create new vulnerabilities. In this way, the people who are already at the fringes and vulnerable show the limits of the neutrality of data.

Interpretation of people’s perceptions and strategies

In this study, we set out to understand how the smart city is perceived and experienced from the point of view of residents. In the process of these conversations, we also learned the kinds of points of reference people use to navigate through the digital environment, to make sense of it and to give it meaning.

Overall, the discussions reflected a sense of uncertainty about being seen and about the nature and structure of this new environment. From the perspective of residents, the smart city is in many ways an uncharted, opaque, and also risky place, a territory that lacks clear boundaries or reliable coordinates for navigation and trusted engagement with the environment. This aspect of our respondents’ perceptions resonates with larger-scale theoretical concerns about “cyberspatial disruptions” and territories losing their hold (Hildebrandt, 2017). Digitization and datafication form interactive processes which change both the knowing of the city, where an individual can become informed about or use digital services, as well as the governing of the smart city, where the individual is known to the municipality. Datafication here constitutes the messy space at the interface of networked space (Hildebrandt, 2017), where data are the flows that enable individuals to be seen by the city.

The emerging pattern is a phase of governing facilitated by technological experts beyond the limits of the nation-state, where few can keep a sense of overview of what kinds of data are flowing where, and therefore make informed choices as to how to respond. This opens the door to the risk of a further “rendering technical” of the processes of the city (Murray Li, 2007), where socio-political tensions are simplified into a technical narrative to make the image of the smart city appear much more coherent than it actually is. In fact, the general sense of hypervisibility may not only be rooted in experiences of being taken by surprise when one finds out that CCTVs have been watching one’s shopping trips or that data posted for a given purpose has been repurposed for another. It is likely also evoked through the technocrats’ own visions of “seamless data flows” and “full integration of data structures” across institutional boundaries. Particularly paradoxical in the current state of affairs is people’s reaction to the opacity of the urban surveillant assemblage. People spoke of datafication practices as a single mass, finding it difficult to distinguish between public and private, much less between different government authorities, whilst at the same time not only using those categories to differentiate their trust, but also starting to see the cracks in these practices of assembling through experiences of imperfect data flows, such as when registering for a house. These disruptions thus provide an opening for insight and understanding. These moments of breakdown of the imagined infrastructure (Star, 1999) form points of reference for navigation and sense-making for our respondents.

One response of people to the uncertainty is to draw on known, historical sense-making devices, using categories like government vs. private as an ordering principle to navigate the changing smart landscape. In our case study, people operated on assumptions of government ownership and oversight, but this is no longer exclusively the case. Especially where the relations between public and private actors are increasingly intertwined and therefore unclear to a layperson, hybrid arrangements inherit trust from citizen–government relationships. This is problematic for two reasons. First, leveraging the blurred distinction between public and private space may become the preferred approach of urban science researchers in the future, because it enables data collection processes that are either impossible or unwieldy in public space. The creation of pseudo-public space occurs not only through events, but also through changing patterns of land ownership. For the case of London, for example, Shenker (2017) has mapped the rise of pseudo-public space, demonstrating that the regulations which govern these spaces are most often opaque and hidden by a culture of secrecy, with landowners more often than not refusing to make these regulations public. These kinds of public-private hybrids, either temporary as research interventions or more fixed as architectural changes, morph the logic of how data can be gathered from public space and used for data analytics. Second, the blurring between public and private in the realm of agencies and actors is problematic because people do not have the same sort of social contract or relationship of political accountability with data processors or corporate entities as they do with their government. As the distance between people and tangible means of influence and redress grows, people are at risk of becoming separated from their political community, and are further individualized through the process of governing through data doubles, which enables interventions at the individual level (see Richter et al., 2018). While on one hand relying on the remnants of a social contract, people also perceive and experience that these categories are increasingly blurred, and hence less useful as navigation coordinates. The surveillance arrangement is hybrid and fragmented, at the same time leaving individuals on their own to make decisions.

Our evidence shows that people do see value in the potential of data analytics for more objective decision-making. However, the flows in datafied, algorithmic governance mean that people’s characteristics are extracted, distanced from their social context and then later reassembled to create meaning in new ways for new purposes. This is inherently difficult for, if not irreconcilable with, a politically and culturally sensitive form of governing the city, if it is the sole mechanism of governing. The epistemological stance of datafied representations presenting the “objective” truth is most strongly questioned by those whose daily lived experience of discrimination underscores that pure neutrality is unachievable, and rather that urban society is political, made up of multiple subjectivities vying for recognition. The group of sex workers remind us – perhaps most strongly – that the unquestioned hegemony of data as the objective truth and as the dominant form of representation is a dangerous path to follow. Together with the willingness to share all their data for security purposes, those experiencing discrimination in society are essentially in the paradoxical position of being on the back foot, sharing data as a defense mechanism to prove their merit and innocence, while that sharing of data may make them more visible in other ways. Here, the way in which protecting the most vulnerable becomes a loophole in data protection echoes experiences in humanitarian data, where protecting from or dealing with crisis legitimizes surveillance and the invasion of privacy (Taylor, 2016).

As data doubles are connected to processes of identification, in some cases they are created intentionally, such as the civic registration number or the licensing of sex workers. In other governance processes, data doubles are a by-product, such as when people use an app – say, a map or a bus schedule – that is aimed at providing a particular service rather than at providing the government with data. In “benevolent” forms of state-based registration, such as to receive welfare benefits, the person providing the data receives benefits in return. In the new data markets the person may receive nothing, while those who collect the data profit from it. As data flows criss-cross, lived experiences become part of a new value chain in data markets, and as exclusion/inclusion through classification is increasingly ‘decided’ by algorithms, the recurring political question of being able to point out who benefits becomes increasingly important.

Conclusion

In the first instance, our analysis points to the need to provide citizens with a means to better see the emerging landscapes of the smart city. Citizens are concerned about the uses of the data they provide (knowingly or unknowingly; voluntarily or involuntarily), but they are also very much “in the dark” about where the data goes, how it is integrated across agencies, and what the pros and cons of such data uses are, perhaps more hoping than trusting in the objectivity that data can inject into decision-making. This is even more problematic in light of their need to largely self-manage their engagement with the digital environment, as far as this is possible, in a rather individualized manner, where “my” privacy has taken precedence over “our” privacy. Both the sense of uncertainty about the city’s new shape and the unease about being extremely visible call for more concerted efforts on the part of technology developers and agencies responsible for the implementation of data technology to chart out these landscapes for citizens, making visible the nodes of data collection, their purposes, and how these data are integrated.

The potential of data analytics for more objective decision-making must be modulated with an understanding that these processes are made meaningful depending on the social context, which argues for a balanced and judicious use of data analytics and algorithms for governance. Data doubles alone do not provide a coherent and effective way to govern the city, particularly when people are profiled less accurately than they would like. As a new form of virtual self, data doubles extend the identities of individuals in ways they have no control over, which can lead to inaccuracies, exploitation, and unjust decision-making. People do not feel free to determine their own identities, and at the same time, have an unsettling awareness of being hypervisible.

The participants of our research often left the boundaries of Amsterdam in their narrative and discussion, and this is a result of the very questions we discussed. What then is our study a case of (Lund, 2014)? We would argue that the perceptions and strategies of our respondents indicate challenges in data governance that go beyond the idea of the “smart city” as a municipal unit. As in the past, the “urban” (Brenner, 2009; Sassen, 2008; Townsend, 2001; Verrest & Pfeffer, 2018) here points towards more global trends in the digitalization and datafication of society. Administrative boundaries are losing their meaning in the context of digitalization and datafication; and at a generalized level, our study shows that the boundaries around both government and citizenry are dissipating. The discussions with residents of Amsterdam raise questions about the role of government in the future, and about what kinds of citizenship and citizens can and will emerge from the multiplicity of engagements between people and their digital environments (Hacking, 2007; Pelizza, 2017). Given the relevance of these questions, the current state of accountability structures, including the role of government vis-à-vis citizens, needs to be revisited and revised in light of, and perhaps against, the vision of the city as a machine empowered by data fuel.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek [KIP 13759].

Notes

1. Names have been changed.

2. STW Klein Innovatief Project, No. 13759.

3. In Dutch, the Burger Service Nummer, or BSN.

4. The OV chipkaart is the Dutch public transport card used to pay for tickets for trains, buses, and metros.

5. The Dutch word is veiligheid, which can be translated as safety and/or security.

References