Special Section: Climate change and insurance

Insurance and the temporality of climate ethics: Accounting for climate change in US flood insurance

Abstract

How is knowledge about future climate change operationalized in governance of the present? This paper addresses this question by examining efforts to repurpose the US National Flood Insurance Program (NFIP) for climate change adaptation. Policymakers and officials initially imagined the challenge to be principally a technical one of accounting for uncertainty in risk assessments and insurance tools. But the conduct and outcome of their efforts reflected instead politically charged normative tensions related to the temporality of climate ethics. NFIP policyholders, constituted as a ‘risk public’ by the instruments of flood insurance, exposed these tensions in mobilizations targeting practices of risk governance. The case shows that practices of ‘accounting for’ climate change and governing it through insurance work out—in however tentative or provisional a fashion—larger moralized disputes over the distribution of burdens, benefits and responsibilities over time.

When Hurricane Sandy barrelled into New York City in October 2012, the city’s ‘flood insurance rate maps’ then in effect captured only 54 per cent of the flooded area in Queens and 67 per cent of the flooded area in Staten Island, two of the hardest-hit boroughs (Shaw, 2013). These maps, produced by the Federal Emergency Management Agency (FEMA) and used to price flood insurance through its National Flood Insurance Program (NFIP), had not been updated in years and most of the underlying data were from 1983. About 80 per cent of people who suffered flood damage from Sandy did not have flood insurance, many because they lived outside of the ‘official’ high-risk zones (Chen, 2018).

New York City’s municipal government had long been requesting updated flood insurance rate maps. Five years before the storm, as part of his PlaNYC initiative to make New York City ‘greener’ and ‘greater’ in the face of climate change, Mayor Michael Bloomberg called on FEMA to update the old maps, which served as the base dataset for the city’s own maps of future sea level rise, used for planning. Those new flood insurance rate maps finally arrived in the months after Sandy, a storm many connected to climate change. They indeed indicated that the city’s flood risk had changed in recent decades; the maps showed larger flood zones that closely tracked the Sandy inundation line. Yet upon receiving this long-awaited update, the city filed a formal appeal, arguing that FEMA had in fact overestimated the flood risk. After over a year of consultation and negotiation, FEMA and the city reached a surprising compromise, arriving not at one revised flood map, but two. Rate-making in flood insurance for individual homeowners would be based on a flood insurance rate map of current risk, incorporating the city’s own analysis. This revised flood insurance rate map made individuals pay for the assessed flood risks to their properties, in the near-term. The other map—a new, ‘non-regulatory’ map—would show future risk, depicting projections of sea level rise at varying degrees of uncertainty. The ‘future-looking’ map made climate change the official responsibility of ‘the city’ as a whole, over the longer term. Such a map could be used for planning large-scale infrastructural interventions, like flood walls, or for siting new developments. This approach would also provide a more general model for how the NFIP could deliver information about climate change and flood risk to communities nationwide. The two-map solution attempts to stabilize the climate-changed present and to parse that present from the climate-changed future in practices of governance, even as risk governance always implies an orientation to the future.

Here we have a case of policymakers, officials and citizens grappling with how knowledge about future climate change can or should shape governance of the present. How do we understand this outcome, and what can this dispute about insurance and maps tell us about the political morality of climate risk governance? Many accounts understand the central problem of climate risk governance to be the authority of expertise and the management of uncertainty (Jacques et al., 2008; Michaels, 2008; Oreskes & Conway, 2010; Powell, 2011): is the science believed by policymakers and the public and will it factor into governance decisions? Or do we have our ‘heads in the sand’ (Washington & Cook, 2011)? The apparent inefficacy of scientific truth underpins widespread moral outrage: those empowered to act do nothing, or they fail to act ambitiously enough, even in the face of overwhelming evidence of the urgency and severity of the problem. However, this is not the core moral problem in the case of flood insurance and climate change; there is a basic consensus among authorities that the climate is changing and that this ought to shape policy and planning. Neither is the issue a practical one related to a lack of technical sophistication or a search for methods that satisfy experts; the experts in this story arrive at a shared understanding of how best, from a technical standpoint, to ‘tame’ uncertainty. Instead, in this case, efforts to govern potentially rapidly changing environmental conditions (and changing understandings of them) turn centrally on questions of responsibility and accountability and, more specifically, tensions related to the temporality of climate ethics. These tensions pertain to (a) which ‘adaptations’ are justifiable in the present to secure a climate-change-resilient future, and for whom; and (b) who should pay for climate change’s effects, and when. At stake is the fundamental question of what it means to ‘take a risk’ in relation to future climate change, to make responsible decisions today for risks that will unfold in the future.

Following the mobilization of NFIP policyholders, constituted as a ‘risk public’ by the instruments of flood insurance, these politically charged normative tensions presented practical problems for the experts, local and federal officials, environmentalists and elected politicians involved in reworking an institution and instrument that authorities use to govern flood risk. With the ‘two-map’ solution arrived at in this case, we see how technical debates, over whether or how to pull apart the climate-changed present and future, reflect contestation regarding the legitimate arrangement of benefits and burdens over time. Risk assessments, which can operate at multiple temporal scales, are calibrated to different political operationalizations to forge alignments with distinct temporalities of responsibility. Fights over the adequacy and accuracy of insurance prices and risk estimates produce judgments about the practically possible and politically acceptable temporality of climate ethics.

The analysis here is based on qualitative data derived from a multi-sited, mixed methods study of the NFIP. I combine data of various kinds to examine what Neale (2016) has called ‘calculative rearticulation’: moments in which ‘figurations of the future are rebooted, reconstructed or recalibrated’ (p. 2028) as actors attempt to make an old programme meet emergent threats. For my account of reforms to the NFIP, undertaken with a view to climate change adaptation, I provide evidence from Congressional transcripts and reports, as well as reports by FEMA’s Technical Mapping Advisory Council (TMAC) and web-based observation and published summary notes of its meetings. At the urban scale, the analysis focuses on New York City because, due to the coincidence of a major flood event, a legislative reform to the NFIP, and an ongoing remapping exercise, it provided a test case for how changes to FEMA’s mapping could be managed in other cities and communities. In New York City, I interviewed diverse users of the NFIP’s flood insurance rate maps on the ground; most informative for the present analysis were interviews with city officials, FEMA’s regional officials, flood zone homeowners, legal aid lawyers, insurance professionals, engineers and flood risk modellers. I collected ethnographic data at public meetings and town halls related to flood insurance; professional meetings of architects, planners and construction managers; and meetings of legal clinics and housing organizations. I also derive evidence from legislation, reports and technical documents by federal and city agencies, think tanks, and non-profits, as well as media accounts.

Insurance and the temporality of climate ethics

Insurance institutions are key arenas in which climate ethics, in a practical sense, are worked out. Most discussions of climate ethics deal with questions of who has caused climate change, who is affected, and how collective actors can be held to international agreements around emissions reductions. However, in its mundane and often opaque operations, insurance sets the terms of whether or to what extent individual actors can be made to bear the costs of climate change’s effects. It does this through the way it distributes risks and responsibilities. The particular terms of insurance, set through the negotiations of insurers, risk experts, regulators, policyholders, politicians and others, establish which or whose risks can be pooled; allocate financial accountability to different parties; and designate, and incentivize, particular actions as ‘responsible’ vis-à-vis risk (Baker & Simon, 2002; Ewald, 1991; Heimer, 2003). To illustrate, insurance premiums that reflect risk can make a homeowner ‘take responsibility’ for the costs of owning a home in a flood-prone location. As Collier (2014) notes, flood insurance can thus be analyzed as a political technology, intelligible in relation to the prevailing moral economies it both reflects and enacts.

In any debate about climate ethics, the temporal question of when particular actions must take place is central. Again, this dilemma is most salient in debates around mitigation. For my purposes, however, the temporality of climate ethics pertains to the unsettled question of when different actors are or can be made responsible for managing climate change’s effects. Controversies surrounding the temporality of climate ethics are informed by the complex temporalities of climate change itself. What we call ‘climate change’ is a bundle of spatially diffuse and variable effects, which will be realized along different timelines. When is climate change here? Precisely how and when will our present (in)actions create new or more intense problems in the future? And how can we or ought we to account for future climate change in our governance of particular hazards like floods, storms and wildfires today? Conventional forms of actuarial insurance specify a temporality of responsibility for managing many of the risks associated with climate change (e.g. flood), in which knowledge of current risk, based on data about the past, is meant to motivate rational behaviour oriented toward future expectations (Hacking, 1975; Lane et al., 2011). The ‘temporality of risk calculation’ brings the uncertain future into the domain of agency in the present (Reith, 2004, p. 396), simultaneously and often implicitly enacting contestable designations of who is obligated and empowered to act, how and when (Elliott, 2017). Climate change unsettles status quo designations, provoking questions about whether or how such responsibilities should be reapportioned. Does someone who moved into a house 20 years ago have to take responsibility for the risks now associated with climate change, particularly those that are projected in the future rather than the present? Which elements of a climate-changed future can or should individuals pay for, and which ought to be the task of governments and private industries, today or tomorrow?

The answers to such questions are negotiated on the terrain of efforts to (re)develop insurance techniques and instruments. Risk assessments, made possible by things like flood maps, can forge and operate at multiple temporal (and spatial) scales (Ferry, 2016). Arriving at a particular scale requires decision-makers (engineers, modellers, actuaries, government officials and policymakers) to make judgments regarding the parameters to include or exclude, the selection of data, the operationalization of assumptions and, in the case of insurance, the lengths of contracts. These judgments are in turn often based on values, feelings and objectives (Weinkle & Pielke, 2017) and may be contested by those so classified (Zeiderman, 2016). As Liz Koslov (2019, p. 3) writes in her ethnographic account of the temporal politics of flood risk mapping in Staten Island, New York, the official ‘flood-zone temporality’ may be ‘out of sync with other ways of knowing and navigating the past, present, and future’. This gives rise to conflicts over how to create and interpret maps. As in many instances where science interacts with policymaking, in flood risk assessment, there is a large body of knowledge ‘whose components can be legitimately assembled and interpreted in different ways to yield competing views of the “problem” and of how society should respond’ (Sarewitz, 2004, p. 389). For these reasons, in the words of Beck (2006), ‘Risk definition is, essentially, a power game’ (p. 333) with implicated actors struggling to assert their own legitimate definitions in ways that may codify their own interests (Lakoff & Klinenberg, 2010; Porter & Demeritt, 2012). Technical disputes over risk assessment are in this way simultaneously debates about the nature of particular problems and what various actors are expected to do about them.

Technical disputes do not necessarily take place among predefined collectivities. Rather, insurance may constitute publics or groups with common concern. Flood risk assessment is not inherently controversial; it is made so by the activism of a group of people whose common cause is forged and made salient through the very operations of the map: ‘flood zone homeowners’. The calculations of risk assessment result in lines drawn around areas at high risk, grouping together people who are then subject to the same kinds of insurance pressures and requirements. One of the mechanisms through which risk definition is made a ‘power game’ is the constitution of new interests through processes of risk definition and delimitation, as well as the mobilization of pre-existing ones (Bulkeley, 2001). The broad outlines of such constitution are similar to those described by Callon (2007) in his observation that marketization tends to spawn new ‘matters of concern’ and ‘emergent concerned groups’. The normal functioning of markets ‘triggers the appearance of issues’ from the way calculations are framed, or from the ways in which calculative agencies are equipped, included or excluded. In relation to these issues, markets produce new identities, and the actors who take on those identities then strive to be recognized and considered on that basis (Callon, 2007, p. 159; see also Fourcade & Healy, 2013; Zeiderman, 2015). Where matters of concern are taken up by emergent concerned groups, ‘these issues can revive the process of politicization which then requires procedural treatment’ (Callon, 2007, p. 140). Callon does not make these points in reference to risk assessment processes, but other scholars of risk and insurance have observed that group-based actuarial classification works to forge solidarities that, when mobilized, may be politically potent (Dean, 1999; Garland, 2003; Lehtonen & Liukko, 2015).

This paper deploys an analytical framework that uses flood insurance to investigate how knowledge about future climate change is operationalized in governance of the present. The case provides an opportunity, first, to examine efforts to forge politically satisfactory alignments between temporalities of risk assessment and responsibility as social actors reform insurance techniques and instruments and, second, to consider how these alignments hold different parties accountable in response to political mobilization and contestation.

The NFIP, flood insurance rate maps and responsibility for flood risk

In the United States, flood insurance for virtually all homes and small businesses is provided by the federal government through the NFIP, a public programme established by Congress in 1968, long before climate change was a regular feature of public discourse or a target of policymaking. At the time of its establishment, legislators representing flood-prone areas wanted to reduce reliance on disaster relief and loans through indemnification, but the private market refused to underwrite flood risk, believing it to be too difficult to assess accurately or underwrite profitably. The federal government, however, could socialize the risk more broadly over time and space, creating a risk pool that joined residents in urban and rural floodplains, in coastal and riverine areas. The experts advising policymakers on the design of such a federal programme (principally floodplain geographers and economists) worried about creating problems of moral hazard, whereby individuals and local authorities would continue to develop and move into hazardous areas once they were financially protected from losses there. To avoid this, they proposed that flood insurance, if it were to work to reduce flood losses over time, needed to be priced at actuarial (risk-based) rates at the level of individual policies: higher risk would mean higher rates. The property-level ‘price signal’ of an actuarial rate would make calculative decision-making the responsibility of individual residents, real estate developers and municipalities, who would have to assess the relative costs and benefits of occupying or building in the floodplain (Collier, 2014). The price of insurance would ostensibly incentivize various kinds of risk-reducing actions that both prevent future losses and lower insurance premiums in the present, e.g. elevating properties above the expected flood elevations or relocating from or avoiding the floodplain altogether (Bergsma, 2016; Knowles & Kunreuther, 2014; Moss, 2004). After 1974, flood insurance was made compulsory through the addition of a ‘mandatory purchase requirement’ for all at-risk properties with federally backed mortgages, though this requirement has been weakly enforced historically. The design of the NFIP individuated policyholders as owners of calculated shares of the risk, but also asserted that governing on the basis of such individual responsibilities would produce desirable collective outcomes, namely transformations to patterns of land use towards less risky development (Collier, 2014). As a condition of accessing the programme, participating communities would have to adopt further land use and flood management measures – again, requirements that have been weakly enforced. The provision of flood insurance, at least as designed, would pair individual responsibility with requirements for collective action.
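The price-signal logic can be pictured in schematic form. The sketch below shows one conventional way a risk-based premium might be built up, as an expected annual loss plus a loading for expenses and contingencies; the figures and the simple rating formula are hypothetical illustrations of the general principle, not the NFIP’s actual actuarial methodology.

```python
# Illustrative sketch of a risk-based ('actuarial') premium as a price signal.
# All numbers and the rating formula are hypothetical; the NFIP's actual
# rating methodology is considerably more complex.

def expected_annual_loss(annual_flood_prob: float, expected_damage: float) -> float:
    """Expected annual loss: chance of flooding in a given year times expected damage."""
    return annual_flood_prob * expected_damage

def risk_based_premium(annual_flood_prob: float, expected_damage: float,
                       loading: float = 0.3) -> float:
    """Premium = expected annual loss plus a loading for expenses and contingencies."""
    return expected_annual_loss(annual_flood_prob, expected_damage) * (1 + loading)

# A home in a high-risk zone (1% annual chance of flooding, the '100-year' standard)
high_risk = risk_based_premium(annual_flood_prob=0.01, expected_damage=150_000)
# The same home elevated above the expected flood level, reducing expected damage
elevated = risk_based_premium(annual_flood_prob=0.01, expected_damage=40_000)

print(f"High-risk premium: ${high_risk:,.0f}/year")  # $1,950
print(f"Elevated premium:  ${elevated:,.0f}/year")   # $520
```

On this logic, the gap between the two hypothetical premiums is precisely the incentive to elevate, relocate or avoid the floodplain that the programme’s designers had in mind.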

To make precise, risk-based insurance rating possible at the level of individual properties, the federal government had to get into the business of assessing and mapping flood risk on an ambitious scale. Its flood insurance rate maps, or ‘FIRMs’, depict the nation’s official high-risk flood zones. These zones are determined on the basis of historical information about climate and earlier flood events, which is often of varying degrees of completeness and quality, and current topographical and development conditions at the time of mapping. Flood risk is meant to be regularly reassessed, with maps updated to capture changing conditions as well as to leverage developments in risk science and technology. Such updates are required in order to produce an ‘accurate’ view of the risk that can form the basis of actuarial rate-setting. These maps, however, are time- and resource-intensive to produce and have proven difficult to keep updated since virtually the start of the programme (Knowles & Kunreuther, 2014). In the interest of increasing participation during the early years of its operation, many communities were brought into the NFIP in an ‘emergency phase’ before detailed assessment and mapping of flood zones could take place. Without information for actuarial rate calculations, insurance was offered at subsidized rates and land-use regulations, a condition of accessing the programme, were relaxed (Bergsma, 2016). Though policymakers at the NFIP’s start believed there would be ‘natural attrition’ of subsidized policies from the programme as homes flooded and presumably were not rebuilt, such subsidies have proven politically resilient (King, 2013). After over four decades of operation, an estimated 20 per cent of over five million NFIP policies were still subsidized. A grandfathering provision has also allowed homeowners to keep old rates when maps were updated to reflect higher risk.

By the early 2000s, when climate change entered discussions about the appropriate current and future role of flood insurance, the NFIP had run into precisely the problems that advocates of risk-rated insurance had sought to avoid. The combination of continued premium subsidization and a general lack of enforcement of land use and flood management conditions allowed development of the nation’s floodplains to continue apace. The population living in coastal shoreline counties grew 40 per cent from 1970 to 2010 (Knowles & Kunreuther, 2014). Many flood insurance rate maps were years out of date. Individual policyholders encountered a patchwork of different kinds of premiums—some ‘risk-based’, some subsidized, some grandfathered—which bore an ambiguous relationship to the underlying hazards. The federal government had taken on responsibility for indemnifying the worst flood risks and the NFIP could not ‘defensively underwrite’ by dropping those policies, meaning the programme was paying claims to rebuild some of the same ‘repetitive loss properties’ again and again, properties that would be facing some of the most acute risks from climate change.

This was the version of the NFIP in place when Hurricane Katrina hit in 2005. The catastrophic flooding from the storm quickly plunged the NFIP $16 billion in debt to the US Treasury. The programme’s total premiums were not sufficient to cover claims and it had not built up a reserve fund to deal with such catastrophic events. With the expectation of more frequent and severe catastrophic floods due to climate change, in the decade prior to Hurricane Sandy, reformers sought to turn the NFIP’s staggering liability into an overhaul of the programme, the most dramatic since its founding.

NFIP reform: Climate change adaptation as a technical challenge

The ways in which climate change complicates the functioning of a risk-rated system of insurance were raised in the context of NFIP reform, through which policymakers worked to harness the programme to the governance of climate change’s effects. In doing so, they proffered multiple orientations to the question of what or who precisely needed ‘adapting’ and when. These orientations were articulated variously by the key coalition members advocating for reform: environmentalists, conservative and libertarian groups, and insurance and reinsurance lobbyists. For their part, in Congressional hearings and lobbying efforts, environmentalists argued that a reformed NFIP could facilitate society’s adaptation to climate change by discouraging further development in high-risk areas over time, leaving intact the wetlands, barrier islands and marshes that provide natural protection from floods. In other words, the NFIP could be used to adapt. One way to do this was to govern on the basis of ‘accurate’ risk-based prices, dependent on more regularly updated maps of current conditions and the removal of longstanding subsidies and grandfathering, essentially making the programme do what it had been expected to do when it was first established. Here was where environmentalists aligned with conservative and libertarian coalition members, who wanted to abolish government subsidies, but out of a general preference for greater ‘personal responsibility’. These groups emphasized a different notion of adaptation: the NFIP itself needed adapting in order to bring in more premiums to reduce future burdens on taxpayers (Elliott, 2017), particularly if catastrophic losses increased due to future climate change.

Both framings of the need for ‘adaptation’—adapting society and adapting the NFIP—led policymakers to one key item for reform: flood insurance rate maps nationwide needed rapid and regular updating. If present risks were aligned with premiums, the NFIP’s calculus of risk and responsibility could better respond to the new problems of climate change. Environmentalists went further, proposing that the NFIP also account for climate change in another fundamental and potentially transformative way: by incorporating scientific knowledge of future climate change into the way the programme maps risk. As mentioned above, FIRMs have always been based on data related to current and historical conditions. Yet, developments in climate science were allowing scientists, risk modellers and (re)insurers to produce maps and other visualizations of different future scenarios defined by climate change. Some reformers believed the NFIP ought to do the same as FEMA updated the nation’s maps. In 2011 Congressional hearings, environmentalists insisted that any reasonable flood insurance reform ought to reorient the NFIP away from a narrow focus on historical loss and cast an eye toward the future. In June 2011 Senate hearings, the National Wildlife Federation testified:

The climate is changing and we are experiencing more intense storms, sea-level rise, and extreme flooding … We are already seeing an upsurge in the number of heavy rainstorms and many other impacts. As this Committee looks to reform the NFIP, it is important that we look to the future and not in the rear-view mirror. We need to prepare America, and FEMA needs to plan for and factor in the increased risk the way private insurers and reinsurers already are. (US Senate, 2011, p. 89)

Professionals in the risk industry have developed new models for climate change in part due to the expectation that, with further disruptions to the climate system, past averages and trends, upon which the NFIP and other insurers premise risk assessment, mapping and pricing, will be less useful for predicting the future. The reformers sought to model the NFIP’s governance of flood risk on the technical innovations of the private insurance sector in taming some of the uncertainty associated with climate change. As the reform developed in Congressional hearings, this idea took shape as a demand that the NFIP incorporate ‘the best available climate science’ into the way it mapped and modelled risk. In the last Senate hearings before flood insurance reform was put to a vote, The Nature Conservancy (TNC) testified:

Information on future changing climate conditions must also be incorporated to enable individuals, communities, and regional and State Government entities to sufficiently plan to mitigate their flood risks. The Senate NFIP bill accomplishes this by requiring the incorporation of the most accurate science on current conditions and future conditions by assessing the best available climate science related to flood risks including the impact of sea level rise and other future conditions. (US Senate, 2012, p. 78)

In effect, TNC and other environmental groups proposed that FEMA change the assemblage of techniques, data, calculations and instruments relevant to flood risk assessment. Aside from the technical complexities of making the NFIP ‘look to the future’ (e.g. defining baseline conditions, assessing expected impacts and their timing, interpreting existing socio-economic scenarios, among other adjudications), left unspecified in the legislation was the relationship such an approach would bear to the economization of flood hazards, that is, whether or how these future-oriented calculations would be rendered into insurance premiums, impacting the costs facing individuals. Thus, it did not address the moral calculus of risk and responsibility in flood insurance, what it would mean for policyholders to ‘take a risk’ with respect to climate change. Whether adapting the programme to bring in more premium revenue over the long-term or using the programme to adapt the nation’s floodplains to future hazards expected due to climate change, it was unclear what such ‘adaptation’ might mean for policyholders, who would find themselves subject to updated and/or new kinds of flood maps, as well as bear the brunt of any new premiums, in the near-term.

The final reform bill, the Biggert-Waters Flood Insurance Reform Act (henceforth ‘Biggert-Waters’), passed with overwhelming bipartisan support in July 2012. Biggert-Waters approved an actuarial shift for the entire programme, mandating the elimination of subsidized policies and grandfathering. It also mandated the reconvening of a Technical Mapping Advisory Council (TMAC) that would prepare a set of recommendations to FEMA on how to ‘ensure that flood insurance rate maps incorporate the best available climate science to assess flood risks’ (P.L. 112-141, 2012). Going forward, as maps were updated and potentially transformed to account for future uncertainty per the TMAC’s recommendations, policyholders would shoulder more of the financial responsibility than before. These two major initiatives would surely interact, but policymakers had not clearly indicated how or on what timeline.

Remapping and risk publics in New York City

New York City became a test case for how the NFIP would interface with climate risk governance because Hurricane Sandy accelerated an ongoing regional flood risk remapping process, which would have new implications under Biggert-Waters, passed three months before the storm. This coincidence of events also happened to take place in a context in which there was political will, at the local level, for adapting to climate change’s effects and mitigating its attendant risks. In the months after Sandy, FEMA disseminated a series of draft map products, in consultation with city officials, in the interest of getting the ‘best available’ data about flood risk and insurance to homeowners impacted by the storm as they contemplated whether or how to rebuild. The city’s ‘preliminary FIRMs’ (pFIRMs), ultimately disseminated in December 2013 and the version that would enter a formal adoption process, put almost 400,000 New Yorkers in the city’s high-risk flood zones, making it the nation’s largest flood zone by population. The pFIRMs more than doubled the number of structures in the flood zones from around 35,000 to 71,500. The base flood elevation (the expected height of floodwaters) across the city went up an average of around two feet; in some areas, it increased more than five feet (Department of City Planning, 2014).

These maps did not incorporate climate change; they did not even incorporate Sandy. Even so, the changes were dramatic and generated political and moral debates set off by how policyholders in the present experienced ‘adaptation’ when it involved new visualizations of flood risk and higher insurance premiums, key planks of Biggert-Waters and the effort to make the NFIP account for climate change. As new maps were disseminated, outcry arose from homeowners who would be experiencing new financial burdens from some combination of changing flood zones and elevations, now connected, under Biggert-Waters, to actuarial rating without the subsidies and grandfathering that had made flood insurance affordable for many New Yorkers. The pFIRMs would create new costs for those ‘mapped in’ to the flood zones for the first time. In cases where the financial pressures of flood insurance motivated relocation out of the flood zones, homes with high flood insurance premiums would also become harder to sell for any kind of return. And if the current residents left and their neighbourhoods were redeveloped to be more ‘flood resilient’, who would move in? Who would get to ‘adapt’ in place and who would be priced out? Even for families that could bear the immediate costs, the long-term effects on their economic security were worrisome; people were worried not only about future flooding, but also about future finances (Elliott, 2019). While updated maps and higher premiums might make the NFIP more ‘resilient’ to future catastrophes like those expected with further climate change, current policyholders experienced these developments as in some ways ‘maladaptive’ and contradictory, increasing conditions of economic vulnerability even as they promised to make residents physically safer from floods by steering them away from risky areas of the city (Barnett & O’Neill, 2010).

The pFIRMs themselves made possible a political response to these insecurities and contradictions. Residents shared the designation of being ‘flood zone homeowners’ on the new maps. Flood insurance, and specifically the technology of the flood map, constituted a ‘risk public’ that mobilized not principally on the basis of shared exposure to the hazard, but rather on the basis of shared exposure to the insurance instruments that construct and govern that hazard as risk. What homeowners in flood zones now had in common was that they were being made to pay for risks they believed they were not responsible for creating in any meaningful way, certainly not if flood risk was attributable to climate change (see also Checker, 2017; Koslov, 2019). After Sandy’s floodwaters receded, these homeowners targeted their representatives on the City Council and in the state assembly and protested at town halls, in local newspapers, and on the evening news, demanding something be done about flood insurance. A New York State assembly member who represents Rockaway, Queens, said: ‘[F]lood insurance was the great equalizer … flood insurance was like a brick to the head, brought us all back to reality and put everyone back on the same team’.[1] This new political entity, a ‘team’ made possible by the very maps and insurance being protested, ultimately grew into a nationwide force, calling itself ‘Stop FEMA Now’. In September 2013, on the eve of protests in nine states, including in the streets of Queens, the founder of Stop FEMA Now, a Sandy-affected New Jersey resident, warned: ‘People in New York, New Jersey and Louisiana know what’s coming because we’re getting the new flood maps. Flood maps in other states are coming later, some not until 2017, and we need to wake them up. Coming to a theatre near you’ (quoted in Alpert, 2013).

Though it was homeowners who were protesting most visibly, more powerful economic interests were also implicated in the changes to flood maps and insurance. For real estate developers, building codes inside the flood zones are more stringent and costlier than outside, such that, according to a former head of the NFIP, ‘Moving that line just a couple of blocks further out means spending tens, if not hundreds of millions of dollars’ (quoted in Elliott & Rush, 2017). The city, in turn, was concerned about devaluing areas that it wanted to continue to make available for future real estate development and tax revenue, even as further effects of climate change took hold. One city official told me that the biggest problem with the new maps was not the mutiny of the homeowners; it was that the maps might signal to the private insurance market that rates could go up and compromise major commercial property values in the city.[2] Real estate interests, including the National Association of Realtors and the National Association of Home Builders, also backed the national lobbying efforts of Stop FEMA Now.

From one map to two: Accounting for climate change in the NFIP

Remapping and repricing flood risk in New York City, it was clear, would not be a straightforward matter of collecting new data, re-running models, and publishing revised estimates. Doing so did not resolve the dilemma of what homeowners were or could be made responsible for with respect to flood risk, particularly as that risk intensified with further climate change. The controversies that remapping New York City sparked would not only influence the course of action taken by the city, but also the efforts of the federal government to ‘adapt’ the NFIP itself and to use the NFIP as an instrument for adapting the country to climate change’s effects.

To become ‘regulatory’, i.e. to be used to rate insurance and establish purchase requirements, the FIRMs have to be formally adopted by a local community. Where communities have the resources and the political will, the review process allows them to file technical appeals if they want to argue that FEMA has misestimated local flood risk and, as a result, move the lines around the flood zones or the height of the base flood elevations. As many communities had done before, New York City embarked on such a technical appeal. In January 2015, the city hired a consulting firm to reassess the local flood risk and produce its own maps of the flood zones and elevations across the city. Six months later, when the results were in, the Office of the Mayor issued a formal appeal of FEMA’s pFIRMs. The appeal argued, on the basis of a rival risk assessment costing millions of dollars, that FEMA’s models had overestimated water levels by over two feet and unnecessarily mapped 26,000 buildings and 170,000 residents into high-risk zones. In pursuing the appeal, the city had to proceed carefully. It had, after all, long been requesting a map update. And again, the boundaries of the flood zones in FEMA’s pFIRMs nearly mirrored the Sandy inundation line, which would suggest that Sandy was a 100-year storm, the standard that defines high-risk flood zones for the purposes of the NFIP. In the appeal’s cover letter to FEMA, Dan Zarrilli, the Director of the Office of Recovery and Resiliency, insisted: ‘The City takes flood risk very seriously’ (Mayor’s Office of Recovery and Resiliency, 2015, p. 1). He cited the city’s various climate change initiatives. But this endeavour had to be premised on ‘accurate’ risk assessments and maps. Zarrilli wrote:

The City’s goal continues to be to ensure that FEMA’s flood maps provide a representation of current flood risk based on sound scientific and technical analysis … To ensure that we invest wisely in the areas of the City at greatest risk and reach the City’s resiliency goals, the City must have a scientifically accurate assessment of flood risk. This assessment starts with accurate FEMA FIRMs. (Mayor’s Office of Recovery and Resiliency, 2015, p. 2, emphasis original)

The technical approach of the appeal was to highlight the epistemic uncertainty around storms that produce floods, even before considering or incorporating any future conditions. In different flood insurance studies, this uncertainty can arise from a lack of understanding of events and processes, or from a lack of data, or both (Lane et al., 2011). In the case of US coastal flood risk, there is a highly limited historical record of storm events. In order to produce a flood risk profile, modellers have to generate a synthetic record of representative hurricanes and nor’easters, which they run at randomized tidal cycles. The city’s appeal argued that FEMA’s models got it wrong: they used the wrong number of storms; had a disproportionate number of them washing ashore close to high tide; over-relied on a particular 1950 nor’easter; and mishandled a ‘drag law’ that showed wind speeds applying too much force to the water. The city’s consultants created new models, based on different drag laws, and ran them at multiple tides over a larger series of events. Divergent modelling strategies, each credibly assessing flood risk on the basis of historical events and losses, reflect specific interpretations of the relevant information needed for resolving the question of risk (Lakoff & Klinenberg, 2010; Weinkle & Pielke, 2017). In the city’s version of the maps, the flood zones shrank.
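The sensitivity of such estimates to modelling choices can be illustrated schematically. The sketch below is a hypothetical Monte Carlo exercise, not a reconstruction of FEMA’s or the city’s actual models; it simply shows how an assumption about tidal timing, one of the parameters disputed in the appeal, shifts the estimated 1%-annual-chance (‘100-year’) flood elevation and, with it, where the lines of a flood zone would fall.

```python
# Hypothetical illustration of how modelling choices move the estimated
# 1%-annual-chance flood elevation. Not FEMA's or New York City's actual model.
import random

def simulate_annual_maxima(years: int, storms_per_year: float,
                           bias_toward_high_tide: bool) -> list[float]:
    """Generate synthetic annual maximum water levels (feet above a datum)."""
    random.seed(42)  # same synthetic storm set for both runs; only the tide logic differs
    annual_maxima = []
    for _ in range(years):
        n_storms = max(1, int(random.gauss(storms_per_year, 1)))
        levels = []
        for _ in range(n_storms):
            surge = random.lognormvariate(1.0, 0.5)            # storm surge component
            tide = 2.5 if bias_toward_high_tide else random.uniform(-2.5, 2.5)
            levels.append(surge + tide)
        annual_maxima.append(max(levels))
    return annual_maxima

def flood_elevation_100yr(annual_maxima: list[float]) -> float:
    """Water level exceeded in roughly 1 per cent of simulated years."""
    ordered = sorted(annual_maxima)
    return ordered[int(0.99 * len(ordered))]

# Two defensible-looking parameterizations yield different '100-year' elevations,
# and therefore different flood zone boundaries on a map.
a = flood_elevation_100yr(simulate_annual_maxima(10_000, 3, bias_toward_high_tide=True))
b = flood_elevation_100yr(simulate_annual_maxima(10_000, 3, bias_toward_high_tide=False))
print(f"Assuming storms strike near high tide: {a:.1f} ft")
print(f"Randomizing tidal phase:               {b:.1f} ft")
```

The point of the sketch is not the particular numbers but that each parameterization can be defended as a credible reading of a thin historical record, which is what made the appeal possible.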

Following the announcement of the appeal, Zarrilli told the press that inclusion in a flood zone ‘can have a devastating impact on neighbourhoods’. Queens City Councilman Donovan Richards observed: ‘It was necessary for the city to do it, to try to keep that affordability for homeowners … But we also have to be cautious, and not shrink the map to the extent that if another storm comes, these homeowners would not have been in the flood zone’ (quoted in Ramey, 2015). The careful language of the appeal put the city on the side of residents needing access to affordable coverage today, while committing to responding to risks expected to worsen due to climate change in the future—based on more ‘accurate’ maps than those initially provided by FEMA.

As the backlash intensified in New York City and spread to other floodplain communities around the country, the TMAC (that group of experts Biggert-Waters brought together) began meeting. A major part of its mandate was to develop recommendations for incorporating climate science into flood insurance studies and maps, which happened against this backdrop of intense public scrutiny of the NFIP. The TMAC comprised 21 floodplain managers, planners, engineers, NFIP officials and risk managers. The TMAC met 11 times between September 2014 and February 2016, when its Future Conditions Risk Assessment and Modeling Report was published. The situation on the ground reverberated up to the TMAC. In public comments during the TMAC’s first meeting, a representative from the National Association of Realtors (NAR), which backed Stop FEMA Now, offered a plea to TMAC members that they remember the people who would be impacted by their decisions and recommendations. Maps were a ‘flashpoint’ and insurance affordability related to new maps would affect community property values. Talk of future conditions ‘often strikes fear into the heart of homeowners and NAR members’.[3] New York City also actively participated in the TMAC, with the Deputy Director for Planning in the New York City Mayor’s Office of Recovery and Resiliency sitting on the Council.

The report included extensive discussion of how ‘observed climate change’ had already directly impacted flood risks, in both coastal and riverine areas (Technical Mapping Advisory Council, 2016, section 3, pp. 23, 14, 19). The TMAC concluded that FEMA’s existing modelling framework could be used for calculating and mapping future conditions but would require inputs of additional information and data about future natural and manmade changes. As the environmentalists involved in Biggert-Waters had suggested, the incorporation of this data would reorient flood mapping to ‘account for a potential future that is not based on the past’; ‘the rules of stationarity … will no longer be valid’ (Technical Mapping Advisory Council, 2016, p. 26). The TMAC therefore recommended that FEMA use a ‘scenario approach’ of the type ‘often used to analyze problems that are characterized by large uncertainties with large potential consequences’ (Technical Mapping Advisory Council, 2016, section 5, p. 3). The New York City Panel on Climate Change’s own sea level rise mapping efforts were included as an ‘example case study’. Such an approach, ‘built from’ the NFIP’s existing flood hazard analyses, would produce a range of different scenarios to provide visualizations of multiple ‘plausible future states’ (Technical Mapping Advisory Council, 2016, p. 14). This would amplify the role of uncertainty in governing flood risk:

In the case of future conditions … projected trends and variabilities will be based on some combination of data and modelling, both of which magnify uncertainty. Uncertainties will be even greater for future conditions than those associated with modeling and mapping existing conditions … (Technical Mapping Advisory Council, 2016, p. 9)
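The scenario idea can be pictured in simplified form: base flood elevations of the kind shown on a FIRM are combined with a range of sea level rise projections to yield several plausible future flood elevations rather than a single regulatory number. The sketch below is a schematic illustration under assumed figures; the projections, horizon and labels are placeholders, not the methodology FEMA and the city later developed or the New York City Panel on Climate Change’s actual estimates.

```python
# Schematic illustration of a 'scenario approach' to future flood conditions.
# Sea level rise figures and the time horizon are placeholders for illustration only.

current_bfe_ft = 11.0  # base flood elevation at a hypothetical coastal location

# Hypothetical sea level rise scenarios (feet) for a mid-century horizon
slr_scenarios_2050s = {"low estimate": 0.7, "middle range": 1.5, "high estimate": 2.5}

future_bfe = {name: round(current_bfe_ft + rise, 1)
              for name, rise in slr_scenarios_2050s.items()}

print(f"Regulatory (current conditions) BFE: {current_bfe_ft} ft")
for name, elevation in future_bfe.items():
    print(f"  2050s scenario, {name}: {elevation} ft (advisory, non-regulatory)")
```

Presenting a spread of elevations rather than one number is precisely what makes such products useful for planning and, as the next passages show, awkward as a basis for pricing insurance.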

The Council noted that even assessments of current risk, based on historical averages and recent trends, were characterized by uncertainty associated with estimates, whether acknowledged or not. The pFIRM appeal in New York City was a recent and dramatic example of how such uncertainty could be operationalized to subject the processes of risk governance to scrutiny. The Council was aware that the introduction of greater uncertainty into flood hazard assessment and mapping had political implications, particularly if this were to form the basis of insurance rating, a matter that Biggert-Waters had not clearly adjudicated. Though questions about insurance implications came up repeatedly during Council meetings, the final report was clear that the TMAC was not ‘getting out of our lane’, in the words of a TMAC member at a December 2015 virtual meeting,[4] declining, at least at that stage, to discuss directly the cost implications of transforming FEMA’s mapping:

The [Biggert-Waters] mandate for this report directs the TMAC to outline the best available methodologies for considering the impacts of sea level rise and future development on flood risk, not to dictate how that information is used. (Technical Mapping Advisory Council, 2016, section 6, p. 2)

If the scenario-based approach became ‘regulatory’, i.e. was used to price the flood insurance that many homeowners were required to buy, the TMAC noted, ‘future conditions modelling introduces additional uncertainty to calculations and the potential for additional appeals should be considered’ (Technical Mapping Advisory Council, 2016, section 6, p. 2). FEMA might succeed in transforming its mapping to account for climate change, but if communities facing higher costs and threats to property values and real estate development appealed every time a map was updated, the new maps might be dead letters. Controversies regarding fairness were also likely: ‘If future conditions become linked to mandatory insurance requirements, an analysis of the impact to property owners may need to be conducted. Issues of equity and affordability associated with insurance premiums need to be considered’ (Technical Mapping Advisory Council, 2016, section 6, p. 3). As a result, the TMAC recommended that all future conditions flood risk products and information be ‘non-regulatory’, meaning for ‘advisory’ purposes at the level of NFIP administration. Local communities, the report recommended, should be allowed discretion to adopt future conditions flood hazard products, tools and information (Technical Mapping Advisory Council, 2016, p. 7). In public comments at a December 2015 meeting, a representative from the Association of State Floodplain Managers flagged the persistent ambiguity surrounding the implications of the TMAC’s effort despite this official parsing of ‘regulatory’ and ‘non-regulatory’ uses: ‘I think the explanation for the recommendation is unintentionally misleading and would have rather the report stayed silent on the topic until a deeper analysis could be done’.[5] At a second December 2015 virtual meeting, convened to discuss potential focus areas for the TMAC’s next annual report, the Deputy Director for Planning from New York City also asked a representative from FEMA for more information about the ‘subsequent transformation of insurance pricing’ following any changes to mapping. The FEMA representative demurred, responding that FEMA would not make detailed connections between the TMAC’s recommendations and insurance pricing; this was a ‘longer-term consideration’.[6]

FEMA’s consideration of New York City’s pFIRM appeal continued for a further eight months after the publication of the TMAC’s Future Conditions report. In October 2016, FEMA and the New York City Office of the Mayor issued a joint press release announcing the resolution of the appeal. Going forward there would be not one flood map, but two. Insofar as the NFIP was concerned, current flood risk in New York City would be economized and governed on the basis of revised FIRMs; FEMA would redo the coastal storm surge analysis on the basis of the city’s appeal. But the future would be depicted on an entirely new ‘future-looking’ map product ‘reflecting future conditions that account for climate change’ (Federal Emergency Management Agency, 2016). Following from the recommendations of the TMAC, FEMA and the city would be co-developing a ‘new methodology’ that would result in ‘a new set of flood maps for planning and building purposes that better accounts for the future risk of sea level rise and coastal storm surge’ (Federal Emergency Management Agency, 2016, emphasis original). These maps would not turn flood hazard into an NFIP insurance premium for property owners. They would not be ‘regulatory’ by triggering insurance requirements. But they would tame some of the uncertainty around future conditions in order to guide capital planning, as well as make clear that the city was not in denial about its flood risk or climate change. In the press release, Zarrilli was quoted:

FEMA’s decision to redraw New York City’s flood maps, and to work with us to produce innovative, climate-smart flood maps, allows us to begin separating the calculation of annual insurance premiums against current risk from the necessary long-term planning and building we need to do as a city to adapt to rising seas and climate change.

The FIRMs, capturing ‘current’ flood risk, would be used to price homeowner insurance policies, while the maps of ‘future’ risk would guide long-term planning projects, for things like flood barriers and other major infrastructure projects. These interventions have their own temporalities. Policies insure for a year at a time. Flood walls, once built, last for a long time.

The implications of the two-map agreement extend beyond New York City. According to a FEMA spokesperson, it represents the first in ‘a series of demonstration projects’ for a new, national strategy that separates the calculation and depiction of current flood risk from the calculation and depiction of uncertain scenarios defined by the future conditions of climate change. According to the spokesperson, going forward, ‘new products will be delivered to NFIP communities separately from the FIRMs’.[7]

Conclusion

In this episode in the life of the NFIP, we see New York City authorities and federal experts and officials grapple with how to deploy knowledge of climate change in the governance of the present through its incorporation into the instruments of flood insurance. The challenge was not one of establishing the legitimacy or authority of climate science as such, nor of convincing policymakers to take scientific assessment of climate change seriously in strategies of governance. Nor was it principally a technical challenge, as the environmentalists involved in Biggert-Waters had imagined it. Instead, making the programme’s instruments ‘look to the future’ proved considerably more difficult due to the ambiguities of economizing hazards under these protocols and the ethical controversies such ambiguities gave rise to. FEMA might develop the knowledge base, with the recommendations of the TMAC, to assess and map flood hazards in state-of-the-art ways. But if these scenario-based models were economized through flood insurance, would or could people pay for it? Should they? The solution that emerged, recommended by the TMAC and enacted in New York City as a model for the rest of the country, separated two orientations to flood in order to manage the political and moral implications, preserving a regulatory role for insurance, premised on historically based calculations of current risk, and articulating a planning role for scenario-based models, incorporating different degrees of uncertainty. The uncertainty of the future could be organized, quantified and visualized—‘tamed’, but not completely transformed into a priceable risk that could be passed on to individual homeowners. The science provided a new and potentially more appropriate temporality of risk assessment—scenario-based methods that dealt with the stationarity problem—but the corresponding temporality of responsibility was unsettled and unsettling. The outcome was necessarily, and significantly, a politically acceptable one. There is a ‘looping’ effect (Hacking, 2004) here, in which calculating risk does not simply represent an underlying, objective reality of hazard, but rather generates new kinds of real effects, which loop back into the knowledge practices through which risks are calculated, visualized, economized and distributed in public insurance (or not).

The constitution of new political collectivities is itself one such real effect, as well as the source of further effects on the methods and outcomes of governing climate risk. By grafting flood zones onto the waterfront communities of New York City and across the country, flood insurance creates a new category of political identification: ‘flood zone homeowner’. On this basis, networks like Stop FEMA Now emerged and articulated a set of shared experiences and interests, which they asserted as relevant to adjudications of how floodplains will be governed by insurance. Resembling Callon’s (2007) ‘emergent concerned groups’, born from the way calculations are framed, Stop FEMA Now organized otherwise unmobilized individuals who then strove to ‘revive the process of politicization’ (p. 140) of not only technical mapping processes, but also of the actuarial shift of the NFIP. Congress responded to Stop FEMA Now by paring back Biggert-Waters in March 2014 in order to slow down rate increases and study premium affordability. This new collective political actor set boundaries of its own: boundaries of solidarity and responsibility, making common cause more with real estate interests (which have in many cases acted historically to put people in harm’s way) than with other ‘at risk’ populations like renters and public housing residents (Elliott, 2017). These boundaries could of course be drawn differently (Jamieson, 2013); to the extent that current flood risk indeed reflects in part ‘observed climate change’ (in the words of the TMAC), the collective actor responsible for climate change risk could be framed as the current and previous generations of resource-voracious rich-world citizens, which would implicate New York City homeowners along with other parties and activities associated with high carbon emissions. Insurance, however, facilitates a different kind of political imaginary with its flood zones and risk pools.

The specific dilemmas identified and described here illustrate the more general ways in which insurance acts as a key site where the political morality of climate adaptation is being worked out. The risk assessments that underpin insurance can operate at multiple temporal scales, and the two-map solution dramatizes how such assessments are politically calibrated to find a way through normative issues related to the temporality of climate ethics: what kinds of adaptations can be demanded, and which people should be made to pay, and when. The ‘regulatory’ map of current risk, used to rate insurance, does not make climate change’s effects per se the responsibility of insured individual homeowners, though this mobilizes an implicit understanding that climate change is still a problem of the future, not the present. It provides a reference frame for a political morality of risk in which these residents are not made financially responsible for climate risks in any explicit way (FEMA says it currently accounts for ‘future conditions’ through general contingency loading, but there is no explicit allocation of the load for this specifically; see Technical Mapping Advisory Council, 2016, section 2, p. 8). The ‘non-regulatory’ maps of future conditions provide a reference frame for a political morality of risk in which communities, governance authorities in particular, have an imperative to embrace their future risk (Baker & Simon, 2002), to give at least the appearance of control through evidence-gathering, representation and planning (Webb, 2011). These new maps perform a vulnerability to climate change, under multiple scenarios, that is economically viable in the face of hazards. As Webber (2013) argues, vulnerability to climate change is not a latent status, but rather is enacted and performed as the emergent effect of an assemblage of facts, expert actors and objects. Producing these new maps will treat some urban areas as ‘sites of potential catastrophe’ (Collier & Lakoff, 2008, p. 8), but the vulnerability performed in them does not disqualify further urban development by putting a prohibitively high price tag on risk. Instead, the maps can guide the expenditure of capital, identifying where money can be spent to redevelop housing and infrastructure in more ‘resilient’ ways. Governing on the basis of these maps represents the uncertainty posed by climate change as a source of current and future opportunity—for some. And even in ‘sites of potential catastrophe’, while there may indeed be ‘little doubt that climate change impacts will trigger massive devaluations in the built environment’ (Johnson, 2015, p. 2503; also Sayre, 2010), with insurance as the mediating institution, we should expect policymakers and officials to feel pressure to blunt the force, or curtail the reach, of these devaluations. They can do this not only in the context of public insurance arrangements, as in the case here, but also by investing in public works and preparedness measures, as well as by intervening in the regulation of private insurance, which they are often called to do (Weinkle & Pielke, 2017; Gray, forthcoming), and which has taken place elsewhere in the United States (e.g. Florida), the United Kingdom and Germany (Krieger & Demeritt, 2015).

This distinction between ‘current’ and ‘future’ risk elides the fact that risk always implies an orientation toward the future, at different horizons established by how frequently risks are reassessed and repriced and by the way contracts are written. But the distinction works as a politically practical one. The actors here find a way to treat the climate-changed present and the climate-changed future separately; pulling them apart is simultaneously impossible and necessary. Doing so allows the actors involved to get on with the business of governing. Contestation over risk definitions produces an understanding of the maps, and the models that underpin them, as essentially different, as useful for doing different kinds of work. In the course of this contestation, relationships between past, present and future change. In the view of flood insurance reformers, the past is no longer a trustworthy source of information for setting expectations for the future. But a future that ‘accounts for’ climate change, at least a version of it represented as a set of possible scenarios, is a politically difficult basis upon which to demand certain kinds of adaptations and responsibilities in the present. For the residents of these communities, the two-map arrangement generates ambiguous demands regarding the practices with which they are expected to govern themselves. They need to heed the flood insurance rate maps (FIRMs) to adapt to the near-term requirements of insurance, but does climate adaptation proper require the simultaneous consideration of the future scenarios depicted in this new, second set of maps? And if the city uses the future-oriented map to build a flood wall, can residents rely on that wall to protect their homes and lower the costs of their insurance? The resilient insurantial subject now has at their disposal multiple instruments for understanding and acting upon historical trends, current conditions and future expectations.

Disputes about insurance make legible the strains that arise around risk and responsibility in the face of climate change. These strains are as much political and moral as they are technical and financial, part of a larger problem of how states will manage and distribute the increasing costs of natural hazards and disasters going forward. In such matters, the ethical dilemmas never go away, but they are renegotiated in part through changes to the implicit and explicit political contracts around risk and responsibility enacted by insurance.

Acknowledgements

I would like to thank Stephen Collier, Turo-Kimmo Lehtonen, Savannah Cox, Ian Gray, Kevin Grove, Leigh Johnson, Sarah Bracking and Zac Taylor for their comments on earlier drafts of this paper. I would also like to thank the organizers, co-panelists and audiences of the Green Economy Contradictions mini-conference at the annual meeting of the Society for the Advancement of Socio-Economics, as well as of the Financialization of the City workshop at the Max Planck Institute for the Study of Societies, for their comments on the work. Parts of this paper elaborate on data that appear in Underwater: Loss, flood insurance, and the moral economy of climate change in the United States (Columbia University Press). Finally, I would like to thank the anonymous referees and the journal’s editorial board for their careful and constructive comments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Rebecca Elliott

Rebecca Elliott is Assistant Professor in the Department of Sociology at the London School of Economics and Political Science. Her research examines the intersections of environmental change and economic life, as they appear across public policy, administrative institutions, and everyday practice, with a particular focus on the governance of climate change.

Notes

1 Interview, 21 March 2014.

2 Interview, 20 March 2014.

3 Technical Mapping Advisory Council Meeting 30 September – 1 October 2014, certified summary notes, published 1 October 2014.

4 Virtual meeting of the TMAC, field notes, 10 December 2015.

5 Virtual meeting of the TMAC, field notes, 9 December 2015.

6 Virtual meeting of the TMAC, field notes, 10 December 2015.

7 Email from FEMA spokesperson to author, 4 November 2016.

References

  • Alpert, B. (2013, September 25). Rallies planned Saturday in nine states to protest flood insurance premium hikes. The Times-Picayune.
  • Baker, T. & Simon, J. (2002). Embracing risk: The changing culture of insurance and responsibility. University of Chicago Press.
  • Barnett, J. & O’Neill, S. (2010). Maladaptation. Global Environmental Change, 20(2), 211–213.
  • Beck, U. (2006). Living in the world risk society. Economy and Society, 35(3), 329–345.
  • Bergsma, E. (2016). Geographers versus managers: Expert influence on the construction of values underlying flood insurance in the United States. Environmental Values, 25(6), 687–705.
  • Bulkeley, H. (2001). Governing climate change: The politics of risk society? Transactions of the Institute of British Geographers, 26, 430–447.
  • Callon, M. (2007). An essay on the growing contribution of economic markets to the proliferation of the social. Theory, Culture & Society, 24(7-8), 139–163.
  • Checker, M. (2017). Stop FEMA Now: Social media, activism and the sacrificed citizen. Geoforum, 79(February), 124–133.
  • Chen, D. W. (2018, January 7). In New York, drawing flood maps is a ‘game of inches’. New York Times.
  • Collier, S. J. (2014). Neoliberalism and natural disaster: Insurance as a political technology of catastrophe. Journal of Cultural Economy, 7(3), 273–290.
  • Collier, S. J. & Lakoff, A. (2008). Distributed preparedness: The spatial logic of domestic security in the United States. Environment and Planning D: Society and Space, 26(1), 7–28.
  • Dean, M. (1999). Risk, calculable and incalculable. In D. Lupton (Ed.), Risk and sociocultural theory: New directions and perspectives (pp. 131–159). Cambridge University Press.
  • Department of City Planning. (2014). Coastal climate resiliency: Retrofitting buildings for flood risk. New York City: Department of City Planning.
  • Elliott, R. (2017). Who pays for the next wave: The American welfare state and responsibility for flood risk. Politics & Society, 45(3), 415–440.
  • Elliott, R. (2019). ‘Scarier than another storm’: Values at risk in the mapping and insuring of US floodplains. British Journal of Sociology, 70(3), 1067–1090.
  • Elliott, R. & Rush, E. (2017, June). Stormy waters: The fight over New York City’s flood maps. Harper’s Monthly.
  • Ewald, F. (1991). Insurance and risk. In G. Burchell, C. Gordon & P. Miller (Eds.), The Foucault effect: Studies in governmentality (pp. 197–210). University of Chicago Press.
  • Federal Emergency Management Agency. (2016, October 17). Mayor De Blasio and FEMA announce plan to revise NYC's flood maps [News release].
  • Ferry, E. (2016). Claiming futures. Journal of the Royal Anthropological Institute, 22(S1), 181–188.
  • Fourcade, M. & Healy, K. (2013). Classification situations: Life-chances in the neoliberal era. Accounting, Organizations and Society, 38, 559–572.
  • Garland, D. (2003). The rise of risk. In R. V. Ericson & A. Doyle (Eds.), Risk and morality (pp. 48–86). University of Toronto Press.
  • Gray, I. (Forthcoming). Hazardous simulations: Constructing estimates of climate risk in US coastal insurance markets. Economy and Society.
  • Hacking, I. (1975). The emergence of probability: A philosophical study of early ideas about probability, induction and statistical inference. Cambridge University Press.
  • Hacking, I. (2004). Between Michel Foucault and Erving Goffman: Between discourse in the abstract and face-to-face interaction. Economy and Society, 33(3), 277–302.
  • Heimer, C. (2003). Insurers as moral actors. In R. V. Ericson & A. Doyle (Eds.), Risk and morality (pp. 284–316). University of Toronto Press.
  • Jacques, P. J., Dunlap, R. E. & Freeman, M. (2008). The organization of denial: Conservative think tanks and environmental scepticism. Environmental Politics, 17(3), 349–385.
  • Jamieson, D. (2013). Jack, Jill, and Jane in a perfect moral storm. Philosophy and Public Issues, 3(1), 1–17.
  • Johnson, L. (2015). Catastrophic fixes: Cyclical devaluation and accumulation through climate change impacts. Environment and Planning A, 47(12), 2503–2521.
  • King, R. O. (2013). The National Flood Insurance Program: Status and remaining issues for Congress. Congressional Research Service.
  • Knowles, S. G. & Kunreuther, H. C. (2014). Troubled waters: The National Flood Insurance Program in historical perspective. Journal of Policy History, 26(3), 327–353.
  • Koslov, L. (2019). How maps make time: Temporal conflicts of life in the flood zone. City, 23(4-5), 1–15.
  • Krieger, K. & Demeritt, D. (2015). Limits of insurance as risk governance: Market failures and disaster politics in German and British private flood insurance. Working Paper: Centre for Analysis of Risk and Regulation.
  • Lakoff, A. & Klinenberg, E. (2010). Of risk and pork: Urban security and the politics of objectivity. Theory and Society, 39(September), 503–525.
  • Lane, S. N., Landström, C. & Whatmore, S. J. (2011). Imagining flood futures: Risk assessment and management in practice. Philosophical Transactions of the Royal Society A, 369, 1784–1806.
  • Lehtonen, T. & Liukko, J. (2015). Producing solidarity, inequality, and exclusion through insurance. Res Publica, 21(May), 1–15.
  • Mayor’s Office of Recovery and Resiliency. (2015). Appeal of FEMA’s preliminary flood insurance rate maps for New York City. New York City Office of the Mayor.
  • Michaels, D. (2008). Doubt is their product. Oxford University Press.
  • Moss, D. A. (2004). When all else fails: Government as the ultimate risk manager. Harvard University Press.
  • Neale, T. (2016). Burning anticipation: Wildfire, risk mitigation and simulation modelling in Victoria, Australia. Environment and Planning A, 48(10), 2026–2045.
  • Oreskes, N. & Conway, E. M. (2010). Merchants of doubt. Bloomsbury Press.
  • P.L. 112-141. (2012). Flood Insurance Reform and Modernization Act. Government Printing Office.
  • Porter, J. & Demeritt, D. (2012). Flood-risk management, mapping, and planning: The institutional politics of decision support in England. Environment and Planning A, 44(10), 2359–2378.
  • Powell, J. L. (2011). The inquisition of climate science. Columbia University Press.
  • Ramey, C. (2015, August 18). New York disputes FEMA on flood risk. Wall Street Journal.
  • Reith, G. (2004). Uncertain times: The notion of ‘risk’ and the development of modernity. Time & Society, 13(2-3), 383–402.
  • Sarewitz, D. (2004). How science makes environmental controversies worse. Environmental Science & Policy, 7(5), 385–403.
  • Sayre, N. F. (2010). Climate change, scale, and devaluation: The challenge of our built environment. Washington and Lee Journal of Energy, Climate, and the Environment, 1, 93–105.
  • Shaw, A. (2013, December 6). How well did FEMA’s maps predict Sandy’s flooding? ProPublica.
  • Technical Mapping Advisory Council. (2016). Future conditions risk assessment and modeling. Federal Emergency Management Agency.
  • US Senate. (2011). Committee on Banking, Housing, and Urban Affairs. Hearing: Reauthorization of the National Flood Insurance Program. 112th Cong., 1st sess., 9 June and 23 June. Government Printing Office.
  • US Senate. (2012). The Committee on Banking, Housing, and Urban Affairs, Subcommittee on Economic Policy. Hearing: The National Flood Insurance Program: The need for long-term reauthorization and reform. 112th Cong., 2nd sess., 9 May. Government Printing Office.
  • Washington, H. & Cook, J. (2011). Climate change denial: Heads in the sand. Earthscan.
  • Webb, J. (2011). Making climate change governable: The case of the UK climate change risk assessment and adaptation planning. Science and Public Policy, 38(4), 279–292.
  • Webber, S. (2013). Performative vulnerability: Climate change adaptation policies and financing in Kiribati. Environment and Planning A, 45(11), 2717–2733.
  • Weinkle, J. & Pielke Jr., R. (2017). The truthiness about hurricane catastrophe models. Science, Technology, & Human Values, 42(4), 547–576.
  • Zeiderman, A. (2015). Spaces of uncertainty: Governing urban environmental hazards. In L. Samimian-Darash & P. Rabinow (Eds.), Modes of uncertainty (pp. 182–200). University of Chicago Press.
  • Zeiderman, A. (2016). Prognosis past: The temporal politics of disaster in Colombia. Journal of the Royal Anthropological Institute, 22(S1), 163–180.