
Process matters: a framework for conducting decision-relevant assessments of ecosystem services

Pages 190-204 | Received 01 May 2014, Accepted 12 Sep 2014, Published online: 05 Nov 2014

Abstract

Ecosystem Service Assessments (ESAs) have become a popular tool for science-based policy. Yet, there are few guidelines for developing an ESA to inform a decision-making process. This is an important area of inquiry since the process of conducting an ESA is likely to affect the quality of results and their influence on decisions. Drawing on the lessons of conducting ESAs around the world, we propose a set of enabling conditions and a framework for carrying out ESAs that foster high-quality results and drive action. Our framework includes an emphasis on iterative stakeholder engagement, advancing science to address policy needs, and capacity-building through six general steps: (1) scope the process, (2) collect and compile data, (3) develop scenarios, (4) analyze ecosystem services, (5) synthesize results, and (6) communicate knowledge. Our experience indicates that using this framework to conduct an ESA can generate policy-relevant science and enhance uptake of information about nature’s benefits in decisions.

1. Introduction

Increasing awareness of the links between natural resource management and human well-being has spurred a growing call to incorporate ecosystem services (ES) into decision making and policy to create better outcomes for nature and people (Foley et al. 2005; Millennium Ecosystem Assessment 2005; Daily et al. 2009; Tallis & Polasky 2009; Braat & de Groot 2012; Cardinale et al. 2012; Goldstein et al. 2012; Guerry et al. 2012). As a result, an increasing number of institutions, policies, and agreements call for ecosystem service assessments (ESA) (Vihervaara et al. 2010; IPBES 2014; TEEB 2014). Yet, the difficulty of applying environmental research and assessment to decisions is well documented (e.g. Knight et al. 2008; Laurans et al. 2013; Neßhöver et al. 2013; MacDonald et al. 2014).

Researchers have identified important elements or criteria for high-quality ESAs (Seppelt et al. 2011, 2012; Ruckelshaus et al. 2013; McKenzie et al. 2014). However, these studies do not provide actionable guidelines for how to embed an ESA into planning and decision making. Practitioners and researchers face a number of constraints, including limited resources and knowledge, poor-quality or missing data, and competing demands from multiple stakeholders. The options for addressing these challenges are often unclear (Carpenter et al. 2006, 2009). With so little guidance, it can be difficult to predict the consequences of alternative options and design a planning process that will meet scientific, policy, or business objectives. Based on our experience using ESAs to inform decisions around the world (Ruckelshaus et al. 2013), we propose a framework and approach for conducting an integrated science-policy process that has the potential to improve the quality of ES information and affect decisions.

Our work with the Natural Capital Project (www.naturalcapitalproject.org) typically uses the InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) suite of models to map and value ES (Sharp et al. 2014). In this article, we do not focus on a particular model; rather, we draw lessons from the broader process we have developed to conduct ESAs. We present a framework relevant to researchers and practitioners regardless of the particular models or analytical approaches they employ.

Throughout this study, we build on the lesson from Ruckelshaus and colleagues (2013) that applied ESAs are ‘most effective in leading to policy change as part of an iterative science-policy process.’ We ask: What are the elements of a successful science-policy process? What factors enable success and what common challenges need to be overcome to embed ESAs in decisions? To answer these questions and propose a coherent framework, we conduct a comparative case study analysis of six applied ESAs undertaken between 2010 and 2014 to inform a variety of ‘decision contexts,’ including spatial planning, payment for ES, green economy design, impact assessments for permitting and mitigation, marine and coastal planning, and natural capital accounting. Throughout this article, we use the term applied ESA, and sometimes just ESA, to refer to ES assessments that are conducted with intention to inform specific policies or decisions. ESAs provide information about ES and their value; applied ESAs provide actionable information for achieving preferred ES outcomes, typically related to decisions about natural resource use and management.

2. Enabling conditions: The 5 Ps

Our experiences applying ES approaches and tools in over 20 cases indicate a number of factors that affect the likelihood that an ESA will be successful in influencing a decision (Ruckelshaus et al. 2013). Based on these experiences, we now frequently evaluate potential ESAs with the ‘5 Ps’ – enabling conditions that affect the likelihood of success: Policy question, Policy window, People, Pertinent data, and an iterative science-policy Process. We equate success with robust scientific results that effectively generate action (i.e., reach at least as far as Pathway 3 of Ruckelshaus et al. 2013). A description of the ‘5 Ps’ follows:

Figure 1. Framework for ecosystem service assessment. Iteration is critical within each step of the process (inset) and between steps for the entire process to be effective. We find that revisiting steps 1–3 after ‘Synthesize’ (step 5) is the most common pathway for iteration; however, other trajectories do occur.

Policy question: We have found ESAs to be more successful when designed from the outset to address a clearly defined planning process, policy, or decision question. This increases the potential for the ESA to provide information that policymakers take up. Information needs can relate to any stage of the policy process, including: defining a problem, setting the policy agenda, choosing among alternatives, and evaluating policy effectiveness. For example, we framed an ESA in Sumatra, Indonesia, around two policy questions facing provincial and district governments: Do the potential benefits of sustainable spatial planning justify the costs of foregone development? How and where can sustainable spatial planning be implemented and financed? By targeting these questions, the ESA effectively informed the development of district and provincial land-use plans and investment decisions by the Millennium Challenge Corporation (MCC) and other international organizations (Ruckelshaus et al. 2013).

Policy window: We have found that well-timed ESAs targeted to seize policy opportunities or ‘policy windows’ (Kingdon 1995) maximize the likelihood that results will influence decisions. For an ESA to have impact, political actors must be willing and able to consider results as an input to new or changing policy or decisions. The time frame for decision making should match the time frame for the ESA. Although windows can change as political processes stall or scientific study takes longer than expected, good scoping can help scientists and practitioners adjust to these changes. Policy windows can open up for a single but highly influential piece of legislation (e.g., the first national Integrated Coastal Zone Management Plan in Belize) or for taking decisions to scale (e.g., the Latin American Water Funds Partnership to replicate and scale water funds in Latin America) (see Note 1).

At times the motivation and resources to conduct an ESA will emerge without a clear policy window. When this occurs, the applied ESA should consider what information would be most useful to inform potential policy opportunities. An ESA can itself facilitate a policy window by defining a problem or demonstrating a possible solution to a problem. For example, Arkema and colleagues (2013) recently published a study that drew attention to the role of coastal ecosystems in reducing risk from hazards. Elements of the science developed in this ESA are now being used to increase the effectiveness of investments in habitat restoration under the US RESTORE Act in response to the BP Deepwater Horizon oil spill.

People: In addition to scientific and technical experts with domain-specific knowledge of ES, leaders familiar with local policy dynamics and with a deep understanding of the decision context perform a critical role in interpreting, applying, and championing the results of an ESA. For example, the participation of local leaders in an ESA to support the design of water funds in Latin America proved invaluable to help researchers ask the right questions and integrate the ES approach into their diverse organizations. In other cases, a single influential leader can help overcome obstacles to communication and trust and engender broad stakeholder support for an ESA. In an ESA in Himachal Pradesh, India, for example, the director of the state Department of Environment, Science and Technology played this role by facilitating a process among state leaders and bringing stakeholders together to learn about natural capital accounting (Vogl et al. 2014).

Pertinent data: ESA data and results need to be pertinent to the decision at hand, applied at appropriate scale and resolution, and transparent to decision makers. In coastal zone planning in Belize, for example, we informed a national spatial plan by focusing on a key interest of planners: cumulative impacts of alternative coastal planning options for important economic activities (Arkema et al. Forthcoming). We aggregated data from local, trusted sources (e.g., a well-respected scientist from the government fisheries agency) with high-quality regional data to ensure appropriate resolution and quality control. Yet, in other areas, we had to limit our analysis (e.g., since data at the appropriate scale for conch fisheries did not exist, we had to leave this service out of our ESA).

Iterative science-policy process: Perhaps most critically, it is important to have ongoing coordination among scientists and decision makers in an iterative science-policy process, whereby steps in the process are repeatedly revisited to improve results (see Figure 1). We find that an ESA is more likely to generate action when embedded within a participatory decision process based on sound scientific analysis, with mechanisms for building trust and resolving conflicts (Beierle & Konisky 2001; Reid et al. 2009). An example of such a process comes from the ESAs used to prioritize investments in water funds in Latin America. Over 2 years, we engaged with the Latin American Water Funds Partnership, which supports 16 water funds across Latin America. The Partnership, along with stakeholders from several of the 16 water funds, contributed to the development of a new ESA tool through a series of three design and testing workshops that incorporated training and capacity-building (Goldman-Benner et al. 2013; Vogl et al. in prep.). In this and other cases, an ongoing process of iteration has helped build salience, credibility, and legitimacy (Cash et al. 2003) and encourage uptake of ES information for a variety of uses (McKenzie et al. 2014). We use six case studies to explore how an iterative science-policy process can help an ESA take full advantage of the 5 Ps of enabling conditions.

3. Case studies

We draw empirical lessons from six applied ESAs, conducted between 2010 and 2014 (see Appendix for details). These cases are similar in several important ways: they are conducted by interdisciplinary teams of scientists and practitioners; they were initiated with the 5 Ps in mind; and they use the same suite of ES models (InVEST). Yet they are dissimilar in other ways: they seek to inform different types of policies and management decisions; they occur across the globe; and they focus on different ecosystems, services, and stakeholders. Case 1 supports the creation of a national Integrated Coastal Zone Management Plan in Belize. Case 2 contributes to the design of a green economy on the island of Borneo. Case 3 informs the development of a national-scale ES impact and offset policy in Colombia. Case 4 provides capacity-building and technical support for a state government in India to establish natural capital accounting and improve ES management in forests. Case 5 provides input to land-use planning in an area of central Sumatra containing some of the last remaining habitat of the Sumatran tiger. Case 6 develops a standardized approach for prioritizing investments in water funds to scale up the approach across Latin America. Drawing on these six cases, we propose a framework for conducting applied ESAs.

4. A proposed framework

Through our experience and a comparative case analysis, we identify generic steps of a science-policy process that can take advantage of, and help create, the 5 Ps – the enabling conditions we describe above. These steps delineate a pathway to embed ESA results in diverse decisions. The steps are (1) scope or frame the ESA, (2) collect and compile data, (3) develop scenarios, (4) analyze ES, (5) synthesize results, and (6) communicate knowledge (Figure 1).

We explore each step in the framework, with an emphasis on the decisions that practitioners will need to take. Drawing on the six applied ESAs in our case analysis, we illustrate how these steps can be carried out and describe the lessons learned through practice. We then identify the common challenges faced by scientists and practitioners at each step and how the 5 Ps affect implementation.

4.1. Step 1 – Scope

In the first phase of an applied ESA, there are typically many questions to consider: Who are the relevant actors affected by changes in ES? What are the core values and main decisions faced by policymakers and stakeholders? Which ES and tradeoffs should be considered? What are the available tools and capacity for analyzing different ES? Scoping the ESA to link to the decisions and decision makers of interest is essential for framing appropriate scientific questions, improving the quality of analytical outputs, and increasing the likelihood that results are salient and accessible to stakeholders and policymakers. It can also create the conditions that enable success by clarifying the policy question and ES of interest, identifying local leaders and champions, and identifying whether a policy window has opened or could open as the result of the ESA. We find that scoping is most successful in driving action when it includes several essential tasks: defining partnerships and roles; setting a work plan; identifying key decision questions; selecting methods for analysis; engaging stakeholders, developing a rapport between partners, and exchanging knowledge; and, importantly, ensuring these plans are feasible given timelines, resources, and expertise.

Establishing clear roles and responsibilities can be a crucial part of scoping. To support the development of a national coastal zone management plan in Belize, we spent time and resources early on to establish a strong partnership between researchers and government policymakers through shared budgeting, detailed work plans with distinct roles and responsibilities, and regular progress calls and meetings. As a result, each partner had a clearly defined role to effectively tackle challenges arising in stakeholder engagement, data collection, modeling, and scenario development and directly inform Belize’s Integrated Coastal Zone Management Plan (Clarke et al. 2013; McKenzie et al. 2014; Arkema et al. Forthcoming). On the other hand, in our efforts to inform an offset scheme for ES in Colombia (Tallis et al. 2011), we did not clearly assign and differentiate roles and responsibilities of the various partners at the outset. Further, we failed to create effective communication mechanisms between stakeholders and project partners. Together these challenges have led to delays in the ESA process, divergence in goals across partners, and the use of different methods that are difficult to harmonize into a coherent offset scheme that can be applied uniformly across the country.

Among our six cases, the three most common challenges in scoping were (1) poor problem definition, where the ESA was poorly linked to decision questions; (2) ambiguity in roles and responsibilities or low accountability, whereby important steps were missed or significantly delayed because no entity or individual was clearly assigned the role, or the assigned party failed to carry it out; and (3) a mismatch between expectations and the information delivered, often due to the resolution of available data or the limitations of ES valuation (e.g., it is not always possible to demonstrate that an intact forest is more valuable than a given set of economic activities). These challenges often occur in the absence of one or more of the 5 Ps of enabling conditions for applied ESAs to inform decisions.

4.2. Step 2 – Collect and compile data

Given their multidisciplinary nature, ESAs usually have substantial spatial and nonspatial data requirements, including biophysical, socioeconomic, and political information. Data requirements vary by project geography, scale and resolution of analysis, and decision context. The ‘Data’ step helps clarify research methods, information sources, and critical data characteristics. It also provides an opportunity to institute quality control processes and data triangulation techniques, including the incorporation of local information from stakeholders.

We find that this step has been most effective for informing decisions when there is explicit and early identification of data needs, systematic data-gathering, gap analysis, and a clear strategy to address gaps (e.g., by selecting proxy variables, bringing in new data partners, or limiting the scope of the analysis). Often applied ESA data are gathered iteratively through a review process with local experts and stakeholders. Rather than embark on an effort to collect all available data for a study region, we recommend constraining this step by first scoping the problem, resources, and data challenges. Key questions include: What data already exist? How will we use these data during the process? What data do we need to address the services and metrics of interest? Answering these questions helps anticipate needs, gaps, and options for ES analysis, scenario development, and monitoring. Monitoring and evaluation, where part of the applied ESA, should also begin during this step through collection of baseline data and sketching out a plan of action.

We have found that most data can be acquired through scientific partnerships, field study, policy analysis, and literature review. Using creative, qualitative inquiry methods can make it possible to cull difficult-to-access information that is spread among multiple sources. We have gathered information, such as visitation rates and property value estimates, by ‘combing’ social media and other Internet sources and developing relationships with big data providers (e.g., Arkema et al. 2013; Wood et al. 2013). It also helps to have local contacts and partners for co-production of knowledge and access to local data sources. In an applied ESA to inform land-use planning in Sumatra, we initially used freely available, coarse global data sets and guidance provided by the InVEST User’s Guide (Sharp et al. 2014) to parameterize ES models; these inputs were refined with several key data sets developed by the World Wide Fund for Nature in Indonesia (WWF-Indonesia) or acquired from regional government agencies (Bhagabati et al. 2012; see also Bhagabati et al. 2014 for a description of data and their provenance).

Across our six cases, we have identified a number of data challenges. First, many relevant data are sensitive and hard to access. This includes, for example, data on catch and revenue from fisheries (sensitive) and the current and projected locations of human activities (hard to access). Data quality can also represent a significant challenge. For example, although we obtained precipitation data for Sumatra, the coordinates of some rain gauges were suspect. Similarly, for natural capital accounting in India, the survey coverage for precipitation was incomplete, and the data required substantial quality control. This led us to discard local precipitation data in both India and Sumatra, and instead use coarser-resolution global precipitation data sets. In our experience, some of the least reliable data are hydrologic model inputs and rates of resource use (e.g., fish catch). It is important to note that these data are often needed to make robust connections from ES supply to impacts on human well-being.
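
To make the kind of quality control described above concrete, the following is a minimal sketch (in Python, not the exact workflow we used) for screening a hypothetical rain-gauge table: it flags gauges whose coordinates fall outside the study area and falls back to a coarser global product when too few gauges remain. The file name, column names, and threshold are illustrative assumptions.

```python
import pandas as pd

# Hypothetical inputs: a CSV of rain gauges and the study-area bounding box.
GAUGE_CSV = "rain_gauges.csv"  # columns assumed: gauge_id, lat, lon, annual_precip_mm
STUDY_BBOX = {"min_lat": -2.5, "max_lat": 1.5, "min_lon": 100.0, "max_lon": 104.5}
MIN_USABLE_GAUGES = 10  # illustrative threshold, not a published rule


def screen_gauges(path: str, bbox: dict) -> pd.DataFrame:
    """Drop gauges with missing values or coordinates outside the study area."""
    gauges = pd.read_csv(path)
    gauges = gauges.dropna(subset=["lat", "lon", "annual_precip_mm"])
    inside = (
        gauges["lat"].between(bbox["min_lat"], bbox["max_lat"])
        & gauges["lon"].between(bbox["min_lon"], bbox["max_lon"])
    )
    suspect = gauges[~inside]
    if not suspect.empty:
        print(f"Flagged {len(suspect)} gauges with suspect coordinates:")
        print(suspect[["gauge_id", "lat", "lon"]].to_string(index=False))
    return gauges[inside]


usable = screen_gauges(GAUGE_CSV, STUDY_BBOX)
if len(usable) < MIN_USABLE_GAUGES:
    # Mirror the decision described in the text: fall back to a coarser global data set.
    print("Too few reliable local gauges; using a global precipitation data set instead.")
else:
    print(f"{len(usable)} gauges passed screening; proceeding with local data.")
```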

This step both creates enabling conditions, by ensuring data are pertinent to decisions, and depends on them (e.g., local partners who can provide hard-to-access data). We have found that the ESA is more likely to influence decisions if efforts are made to gather credible data, determine the appropriate resolution of analysis, match available data with model and user needs, and include local and traditional knowledge (McKenzie et al. 2014).

4.3. Step 3 – Develop scenarios

Scenarios are plausible representations of alternative futures, useful for examining how action taken today could play out in the future. Many ESA tools like InVEST provide a snapshot of ES at a given time. Yet, to inform the policy process, ESAs typically need to demonstrate how ES can change under various scenarios and indicate the tradeoffs among options.

Scenario development is an often overlooked, yet critical component of an applied ESA. Scenarios help to answer the questions: What are the primary drivers that will affect the ES of interest? What range of alternatives best demonstrates the tradeoffs of different decision pathways? There are several tools and methods for building scenarios, many of which are freely available and documented in the scientific literature (e.g., Myint & Wang 2006; Alcamo 2008; Clark Labs 2009). In order to select the appropriate method, it can be helpful to take into account the time horizon, ES of interest, and drivers of change. We find it useful to start the scenario development process with a current map or baseline, which provides a point of reference and can enable stakeholders to examine and appreciate the current state of the ecosystem(s). For benchmarking and stakeholder engagement, it is desirable to have information that is as current as possible, but in many cases the best available data are several years old at the time of analysis.

Scenario development methods and goals vary greatly, depending on the context and data available. In many cases, scenario development constitutes the most time-consuming component of the ESA process. To support green economy planning for Borneo, we compared two possible futures: (1) a ‘Business As Usual’ (BAU) scenario based on expected permits for forestry, palm oil plantations, and mining development and (2) a ‘green economy’ scenario, in which palm oil development occurs in degraded areas only, certification is enforced, and idle forest land is protected and restored. To visualize these contrasting options, our partners used IDRISI’s Land Change Modeler to generate scenario maps depicting land use and land cover (Cosslett & van Paddenburg, Eds., 2012; McKenzie et al. 2012). In Belize, the scenario development process was stakeholder-based and unfolded over a number of months. Planners conducted multiple stakeholder workshops and worked with researchers to use Geographic Information Systems (GIS) to generate the scenario maps (McKenzie et al. 2012, 2014). In Sumatra, scenarios were developed based on existing government spatial plans and an ecosystem-based spatial plan (or ‘green vision’) developed by government agencies and civil society partners, focusing on habitat restoration and protecting high conservation value forests. An ESA using the scenario maps derived from these government spatial plans and the alternative proposal demonstrated the implications of choosing these alternative pathways.
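
As a hypothetical illustration of how contrasting scenario maps can be summarized before ES modeling, the sketch below cross-tabulates land-use/land-cover classes between a baseline raster and a scenario raster to report how much area changes and between which classes. The file names, class codes, and use of rasterio are assumptions, not a description of the IDRISI or GIS workflows our partners used.

```python
import numpy as np
import rasterio

# Hypothetical LULC rasters on the same grid; class codes are illustrative.
BASELINE = "lulc_baseline.tif"
SCENARIO = "lulc_green_economy.tif"
CLASS_NAMES = {1: "forest", 2: "degraded land", 3: "oil palm", 4: "mining", 5: "settlement"}

with rasterio.open(BASELINE) as base, rasterio.open(SCENARIO) as scen:
    lulc_base = base.read(1)
    lulc_scen = scen.read(1)
    # Pixel area in hectares, assuming a projected CRS with metre units.
    pixel_ha = abs(base.transform.a * base.transform.e) / 10_000.0

changed = lulc_base != lulc_scen
print(f"{changed.sum() * pixel_ha:,.0f} ha change class between the two futures")

# Transition summary: baseline class -> scenario class, in hectares.
codes = sorted(CLASS_NAMES)
for from_code in codes:
    for to_code in codes:
        if from_code == to_code:
            continue
        area = np.sum((lulc_base == from_code) & (lulc_scen == to_code)) * pixel_ha
        if area > 0:
            print(f"{CLASS_NAMES[from_code]:>14} -> {CLASS_NAMES[to_code]:<14} {area:10,.0f} ha")
```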

We have had greatest success with stakeholder-driven approaches to scenario development, where stakeholders are involved throughout the process and their views are incorporated into scenarios. Scenario development then serves not only to respond to decision questions, but also to foster learning and trust, improve scientific outputs, and promote uptake of final results (McKenzie et al. 2012, 2014). Learning from our cases, we established a development process with the following elements:

  • Literature and data review to establish historical and current conditions for the area of interest;

  • Review of existing and proposed policies and strategies;

  • Key informant interviews with selected stakeholders who have local knowledge about resource use and extraction and governance conditions; and

  • Consultative stakeholder workshops to review scenarios and improve them.

This approach often involves iteration with the previous steps as new questions arise and new stakeholders are identified whose feedback is needed to develop plausible scenarios. Local partners and pertinent data also support relevant and credible scenario development.

4.4. Step 4 – Analyze ecosystem services

In an applied ESA process, analysis typically comes after initial stakeholder engagement, scenario development, and dialogue among scientists, policymakers, and other stakeholders to determine which services will be addressed and how (steps 1–3). In our framework (Figure 1), ‘Analyze’ provides key information about synergies and tradeoffs among multiple ES under alternative scenarios. This step involves (1) selecting and applying appropriate analyses and tools, (2) assessing ES and tradeoffs among services, (3) analyzing ES outcomes under each scenario to enable comparison, and (4) linking outcomes, in terms of supply and value, to beneficiaries. We have found that several rounds of iteration, both within the step (e.g., through local expert review) and among steps (e.g., gathering new data and refining scenarios based on synthesis), can enhance the outcomes of an applied ESA.

In our experience, the first phase of the ‘Analyze’ step typically involves a coarse analysis to understand the current distribution of priority ES. An initial, ‘quick and dirty’ analysis using simple decision-support tools – in our case, ‘Tier 0’ InVEST models – can help project participants learn to use new tools and identify data gaps or problems to fix through iteration. In some cases, this may occur before or concurrently with the ‘Scenario’ step. We have found this approach helpful for both characterizing baseline conditions and assessing likely ES impacts of alternative scenarios. A next phase may involve refining the coarse analysis for better accuracy and precision of results. Ideally, all ES analysis results would be supported by monitoring and evaluation; however, this does not always happen due to tight project budgets and timelines and a paucity of empirical data.

An integral part of analysis is testing and validating results. We have tested results in three main ways: comparing to observed data, comparing to results from other models and studies, and reviewing with local and disciplinary experts. During analysis, input and review from local experts and stakeholders can be used to refine results and improve tools. The iterative nature of this process, whereby stakeholders see their inputs reflected in results, also helps to garner additional interest and buy-in from stakeholders (McKenzie et al. 2014). In Sumatra, for example, we initially used InVEST for a coarse analysis of carbon storage and sequestration, sediment retention, and water yield for the baseline landscape. Through an iterative process of multiple workshops and consultations with in-country partners, we were able to improve and expand upon data collection and analysis. This process included obtaining data from government ministries for soil erosion modeling, mapping tiger habitat with expert input from a local field biologist, and incorporating carbon stock estimates from a previous WWF survey. We used these new data inputs to improve the model results and conduct a tradeoff analysis, which was reviewed by local experts and partners. In contrast, we did not have early review of a first-estimate analysis in our ESA in Borneo. As a result, we lost the opportunity to access any local data or comparative studies held by stakeholders and regional experts for our final outputs.

Access to pertinent data (one of the 5 Ps) is a key enabling condition for this step. Even with appropriate data, ES outputs can give a false impression of certainty or authority about the results. We have faced this challenge in each of our six cases, particularly in the expectations of information users. As a result, we explicitly communicate caveats, such as limitations of the tools, gaps in data, and assumptions. We often report results relative to the current scenario or other future scenarios, instead of absolute values. We also tend to avoid summing ES values and reporting total values of combined ES as a single metric; we find this approach less useful for informing management questions that address tradeoffs among services and other goods.
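
The following minimal sketch illustrates the reporting convention described above – expressing each service relative to the current scenario and keeping services separate rather than summing them into a single metric – using hypothetical output rasters. The file names and service list are assumptions, not InVEST’s actual output naming.

```python
import rasterio

# Hypothetical model output rasters (one per service per scenario).
SERVICES = ["carbon_storage", "sediment_retention", "water_yield"]
SCENARIOS = ["bau", "green_vision"]


def total(path: str) -> float:
    """Sum a service raster, ignoring nodata cells."""
    with rasterio.open(path) as src:
        band = src.read(1, masked=True)
    return float(band.sum())


baseline = {s: total(f"{s}_baseline.tif") for s in SERVICES}
for scenario in SCENARIOS:
    print(f"\nScenario: {scenario}")
    for service in SERVICES:
        value = total(f"{service}_{scenario}.tif")
        pct_change = 100.0 * (value - baseline[service]) / baseline[service]
        # Report change relative to the current landscape rather than absolute values,
        # and keep services separate so tradeoffs remain visible.
        print(f"  {service:>20}: {pct_change:+.1f}% vs. baseline")
```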

4.5. Step 5 – Synthesize results

Framing ESA results around key decision questions facilitates their incorporation into decision making. The ‘Synthesize’ step places ES and tradeoff results in a specific context, explicitly illustrating how different scenarios affect outcomes; in essence, this step takes the results of the ES analysis and valuation and uses them to answer the decision questions. Answering these questions may involve combining ES analysis with additional information (e.g., stakeholder preferences), streams of inquiry (e.g., market analysis, other ES studies), or human consequences (i.e., applying additional metrics or considering indirect effects on human well-being). Or, it could mean focusing on a single element or subset of ES from an integrated analysis.

In Sumatra, our analysis demonstrated that the ‘green vision’ land-use plan for the region offered better outcomes for water purification, erosion control, and carbon storage and sequestration than existing government spatial plans. In our synthesis, we addressed how to facilitate adoption of the scenario for better ES outcomes. This included identifying locations where particular ES incentives and policies prioritized by the government (e.g., water funds, habitat restoration, and forest carbon projects) could be implemented to support the ‘green vision’ (Bhagabati et al. 2012, 2014). In Belize, the National Emergency Management Organization wanted to understand the impacts of changes in ES on vulnerable populations and related infrastructure, including roads, schools, and hospitals, to design disaster risk reduction plans. Where data allowed, we explicitly illustrated the link between coastal protection services and infrastructure, separately from our integrated scenario analysis (Clarke et al. 2013).

Synthesis is sometimes the most difficult step to accomplish in the framework. Major challenges include obtaining and appropriately using data to complement results generated from scenario assessment (e.g., locations of infrastructure in Belize, values for emerging markets in a green economy in Borneo); integrating results from parallel studies that use different methods and metrics (e.g., multiple offset approaches in Colombia); selecting and quantifying metrics that resonate with decision makers (e.g., human health impacts or poverty alleviation in Sumatra); and formulating actionable recommendations (e.g., the most cost-effective allocation of budget for water funds). We find that synthesis is most successful when it translates information about ES outcomes into useful, specific recommendations for management and (often nonspatial) policies by drawing out clear conclusions and compelling stories. To do this, practitioners will need to rely on the 5 Ps, especially clear decision questions linked to a particular policy, pertinent data at appropriate scales, and local partners who know what will resonate with decision makers.

4.6. Step 6 – Communicate knowledge

Finally, contextualized, targeted, and clear communication of ES knowledge can magnify the impact of an ESA. ES information can be delivered in many ways, from peer-reviewed papers and reports to workshops and interactive maps. For this information to influence decisions, it should be presented in ways that resonate with the intended audience in forums and venues accessible to decision makers. In addition, it can be helpful to develop a communication plan with partners that dovetails with the research process. A communication plan can help to address key questions like: Who is the target audience (e.g., for a particular decision question)? How can this audience be reached? What information should be shared, and how? Who should share results (e.g., local partners vs. scientists)? What are the most useful products and channels for sharing information?

The visual display of ESA results can affect the success of communication efforts. Across cases, we find that policymakers and stakeholders prefer bold and colorful maps, simple figures, and limited text. Summary figures and presentations can be more effective at communicating key results, even if they are less comprehensive or nuanced than peer-reviewed papers or book-length reports. In Sumatra, we began by creating radar diagrams comparing ES outcomes across scenarios and wrote a painstaking 150-page report (Bhagabati et al. 2012). Eventually, through dialogue with our local partners and trial and error in presentations, we learned that policymakers were confused by the radar diagrams. They preferred a simple bar graph with contrasting colors depicting the different possible outcomes for key ES. Even after we included chapter summaries in the report and translated it into Bahasa Indonesia, it remained largely unread. In the end, we focused our efforts on well-designed PowerPoint® presentations to the US MCC and provincial governors in central Sumatra, and a four-page policy brief with clear, bold maps. As a result, the MCC required ES analyses to be considered in the plans for a US$600 million Compact with Indonesia to promote green prosperity, and Jambi province in Sumatra used ES information in a strategic environmental assessment to support spatial planning (Ruckelshaus et al. 2013).
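
For illustration, a minimal sketch of the kind of figure that worked for us: a grouped bar chart with contrasting colors comparing key ES outcomes across two scenarios. The service names echo the Sumatra analysis, but the values, scenario labels, and styling here are placeholders, not results.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder values: percent change relative to the current landscape (not real results).
services = ["Carbon storage", "Erosion control", "Water purification", "Tiger habitat"]
scenarios = {"Government plan": [-12, -18, -15, -22], "Green vision": [4, 9, 7, 11]}
colors = {"Government plan": "#d95f02", "Green vision": "#1b9e77"}  # contrasting colors

x = np.arange(len(services))
width = 0.35
fig, ax = plt.subplots(figsize=(7, 4))
for i, (name, values) in enumerate(scenarios.items()):
    ax.bar(x + (i - 0.5) * width, values, width, label=name, color=colors[name])

ax.axhline(0, color="black", linewidth=0.8)
ax.set_xticks(x)
ax.set_xticklabels(services)
ax.set_ylabel("Change vs. current landscape (%)")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("es_scenarios_bar.png", dpi=300)
```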

Common challenges we have encountered for communicating ESA results include: providing meaningful information targeted to decision questions; distilling complicated and uncertain results into simple, readily accessible language and figures; and facilitating an ongoing process of iteration and uptake of ES knowledge. Across our cases, we find that the most effective communication occurs through co-production of knowledge by scientists, policymakers, and stakeholders throughout an applied ESA (Ruckelshaus et al. 2013; McKenzie et al. 2014), which depends on establishing a strong partnership in step 1 and supporting open knowledge exchange through capacity-building. Critically, we also have discovered that contextualized information yields the most uptake by stakeholders and decision makers. Providing the different ES outcomes for alternative policies or management choices via scenarios and clear policy questions makes the information relevant to the decision (McKenzie et al. 2012). Last, communication is most effective at informing decisions when it occurs within an appropriate policy window, as was possible in the Belize, Sumatra, and water funds cases, but has proved challenging in Colombia, India, and Borneo.

4.7. Between steps: cross-cutting issues

The above steps lay out a coherent process for an effective ESA in practice. However, they do not ensure – on their own – that an ESA is embedded in a decision. In addition to these discrete steps, there are important cross-cutting activities that take place throughout the entire process and that we have found to be essential to inform decisions directly (Figure 1). These are iteration, capacity-building, and, we propose, monitoring and evaluation.

4.7.1. Iteration

Repeatedly revisiting steps in the science-policy process to improve results over time can help maintain or foster enabling conditions to inform decisions. An ongoing process of iteration can encourage uptake of ES information for a variety of uses (McKenzie et al. 2014); it can also encourage reflection throughout the process and ensure that the project is responsive to changing conditions, incorporating new information and stakeholders.

Iteration can occur within each step, between a set of steps, and over the entire process (Figure 1). For example, scoping (step 1) involves initial problem definition and identification of key partnerships and stakeholders, who often bring to the table additional insights, issues, and capacities that must be integrated into the collective definition of ESA scope and approaches. The ‘Data’ and ‘Analyze’ steps (2 and 4, respectively) often involve iteration through which results from initial, coarse analyses illustrate critical data gaps (e.g., Bhagabati et al. 2012; Verutes et al. in prep.). Iteration from ‘Synthesize’ back to ‘Scenarios’ and ‘Analyze’ (steps 5, 3 and 4, respectively) can help target and improve policy recommendations by including new options or drivers of change in scenarios and comparatively analyzing them to obtain better ES outcomes (e.g., McKenzie et al. 2012; Clarke et al. 2013). At times, communication of results to new audiences (step 6) will help identify new decision questions that require new data collection, scenarios, analysis, or synthesis (e.g., universities in Sumatra requested that we include access to non-timber forest products in our ESA).

Major challenges in the iterative process are knowing when the ESA can effectively continue on to the next step, determining how many iterative cycles to complete, and deciding whether to return to a particular step or to skip it. We find that, in aiming for perfection, it is easy to fall into the ‘iteration trap’ at almost any stage of the ESA process, where one or more partners are reluctant to move forward because the work could still be improved. For this reason, we recommend that applied ESAs: (1) prioritize good communication and alignment among partners, as well as common criteria for identifying when it is time to advance to the next step; (2) act on the adage that the perfect is the enemy of the good; and (3) design a work plan that allows for multiple work streams to occur in parallel. Policy windows are a good guide to limit iteration and create deadlines. In general, there is critical information that must be produced in each step, and when this critical information becomes available, the ESA can move forward.

The Sumatra, Belize, and water funds ESAs involved significant iteration through different steps of the process. In these cases, various iterations within and among steps 2–6 took place over a period of 2–3 years. Without this process, we believe that the policy impact (i.e., in Sumatra and Belize) and scaling up (i.e., of water funds) would have been significantly reduced. In contrast, little iteration took place in Borneo, and although the resulting report was high-profile and helped to raise awareness of green economy options in the region and internationally (Cosslett & van Paddenburg 2012), it did not result in a direct policy impact. We expect new policy windows to open in Borneo, which the iterative cycle of an applied ESA can take advantage of to revise analyses and communication and inform decisions.

4.7.2. Capacity-building

Conducting an applied ESA requires many different types of knowledge and experience, and bringing together the right set of expertise can be difficult and expensive. To address this challenge, we have included a formal thread of capacity-building in our ESAs, which emphasizes increasing knowledge within relevant disciplines, developing practical skills, and building strong understanding of ESA steps, tools, and results. Building the capacity of local experts to conduct ESAs has also proved important for building ownership of results and support for including them in decisions (Ruckelshaus et al. 2013; McKenzie et al. 2014).

The main focus of capacity-building for decision makers, stakeholders, and the project team is to establish shared skills in key areas of expertise, such as ecology, economics, geography, and policy; allow ESA practitioners to learn from stakeholders; and train users in the techniques developed and used throughout the process. The objectives are to foster good understanding of the ESA approach and broad support for results and to prepare users to replicate it as needed (e.g., to maintain up-to-date natural capital accounting in India). Capacity-building efforts can begin as early as step 1 or 2 and continue throughout the applied ESA (Figure 1). We recommend developing a capacity-building plan early in the process; appropriate information and resources can then be gathered to conduct a variety of capacity-building efforts throughout the steps of the framework.

We find that some types of capacity or skills are critical to address at key steps of an ESA (e.g., project planning for step 1; scenario design for step 3; valuation and modeling methods for steps 4 and 5). This means that different types of capacity-building efforts will be needed at different points. We have identified a number of options from our six cases, and we find that applied ESAs are most successful at building capacity when they employ a combination of these:

  • Kick-off meetings typically introduce the ESA, identify key ES and threats, and help scope the project. They provide an opportunity for co-design of the process and support from stakeholders but are rarely conducive for comprehensive training in ES analysis and valuation.

  • Stakeholder or public training workshops can introduce potential users to the approach, methods, and tools for the ESA. They can build knowledge and support for the process, but because substantial time must be devoted to introductory concepts that suit a diverse audience, they are not usually sufficient to comprehensively train partners to co-lead an ESA.

  • Scenario planning workshops help prepare partners to select and undertake scenario development methods. These can help accomplish the scenario step, but are difficult to facilitate effectively.

  • Technical or policy workshops delve deeply into one facet of the applied ESA, either tools and models or policy implications, with partners. We find these provide helpful background and training, but due to their narrow scope, rarely provide a good platform for exchange among all stakeholders (e.g., technical workshops can isolate less tech-savvy audiences).

  • One-on-one training or mentoring takes a personalized, iterative approach to building capacity in all aspects of the ESA. It is useful for making significant progress with core project participants, especially running models and technical troubleshooting; however, it is difficult to coordinate remotely and serves a much smaller audience than other options.

  • Learning by doing, in which novice project participants take the lead on a new approach or tool (e.g., running a single InVEST model). This approach helps build strong expertise for replication, but it also requires greater flexibility in time and resources.

For coastal planning in Belize, we employed a variety of these approaches, which reached a broad range of stakeholders while focusing the most time and resources on core project partners (i.e., Coastal Zone Management Authority and Institute). In particular, we found that a combination of one-on-one training and learning by doing can help build local capacity and investment. In Borneo, on the other hand, we held only a kick-off meeting and found that the ESA was less thoroughly integrated into green economy design and results have yet to be taken up explicitly in Indonesia’s national policy. From our experience, we conclude that incorporating capacity-building strategies and targets in the applied ESA process can enrich enabling conditions for influencing decisions. In multiple cases, we have seen that it can support local champions to showcase the ESA and contribute to successful iteration.

4.7.3. Monitoring and evaluation

Monitoring and evaluation are underrepresented components of the ESA process. The ESA process essentially outlines a theory of change whereby (1) the ESA facilitates a policy or behavior change and (2) the desired policy change drives desired natural resource management, which then catalyzes changes in ES and human well-being. Accordingly, evaluating the impact of ESAs on intended policies, and, eventually, on ES and human well-being, functions as a means to test and verify proposed theories of change. Data generated in monitoring and evaluation serve not only to verify theories of change, providing empirical evidence about targeted outcomes and potential co-benefits or unintended negative outcomes, but also to build knowledge and data relevant for further iterations of the ESA process.

Some monitoring and impact evaluation approaches require substantial financial, human, and technical resources. It can be useful to determine which monitoring and evaluation approach (e.g., trend monitoring vs. impact evaluation) the ESA requires to meet adaptive management needs or to garner support through demonstration of impacts. Questions to ask include: Who needs to know what, and how will the information be used? What is the burden of proof? Is it critical to demonstrate attribution? How uncertain are modeled links between natural resource use and ES provision? For example, with a high burden of proof, such as in a court of law, decision makers may require a robust impact evaluation with well-designed controls. In contrast, trend monitoring may provide sufficient information in the context of a low burden of proof, where decision makers do not require clear demonstration of attribution. Likewise, for adaptive management in contexts where links between land or coastal use and ES remain poorly understood, a robust evaluation program with appropriate controls is important; trend monitoring or no monitoring may be appropriate where such links have already been clearly established. Overall, we find monitoring and evaluation most successful when indicators and design are carefully linked to the decision context they will inform.

We are supporting the design and implementation of linked hydrological and socioeconomic monitoring in several water funds in Latin America. This effort is driven by calls from investors and stakeholders for empirical evidence of the impacts of watershed investments on ecosystems and people. To choose relevant indicators, we first elaborated theories of change associated with each water fund activity, including both potential positive and negative outcomes. We then created a multi-scale design, with control and intervention areas, to monitor the impacts of water fund activities on metrics such as sediment retention, base flow, and rural livelihoods. The information generated will contribute to adaptive management, especially by identifying the activities that are most effective at particular scales, and will provide empirical evidence to increase financial and political support for individual funds and for scaling water funds across Latin America and beyond.
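
For readers less familiar with this kind of design, the sketch below shows a simple before–after control–intervention comparison: a difference-in-differences estimate of a water fund’s effect on one monitored indicator. The data file, column names, and indicator are hypothetical; actual water fund monitoring designs are more elaborate than this.

```python
import pandas as pd

# Hypothetical monitoring records: one row per site visit.
# group: "intervention" (water fund activities) or "control"; period: "before" or "after".
df = pd.read_csv("monitoring_records.csv")  # columns assumed: site, group, period, sediment_mg_l

means = df.groupby(["group", "period"])["sediment_mg_l"].mean()

# Difference-in-differences: change in intervention sites minus change in control sites.
did = (means["intervention", "after"] - means["intervention", "before"]) - (
    means["control", "after"] - means["control", "before"]
)
print(f"Estimated water fund effect on sediment concentration: {did:+.1f} mg/L")
```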

In Belize, the coastal zone management implementation plan calls for ecological, social, and policy monitoring. A monitoring protocol is still being designed, but a local organization has spearheaded an effort to produce a report card every 2 years to evaluate the health of the Mesoamerican reef system (http://www.healthyreefs.org/cms/report-cards/). The ecological data were useful for validating our models of habitat risk and can serve as baseline and long-term monitoring data to assess potential changes in ecological integrity as a result of the Integrated Coastal Zone Management Plan.

Monitoring and impact evaluation have been almost entirely absent from our applied ESAs, with the notable exception of water funds. In both the Belize and water funds cases, monitoring and evaluation began after the ESA; ideally, it would have been integrated into the ESA process from the beginning. This remains a challenge for us and for others, even in cases where demonstration of impacts is clearly needed. If a goal is to determine the impact of an ESA and subsequent land and coastal changes, monitoring and evaluation should begin before the changes occur. Based on our experience, we recommend defining objectives and key questions for monitoring and evaluation as early as possible in the ESA process. In cases like Belize, universities, private companies, and governments may already collect some of the data necessary for monitoring and evaluation, but coordinating among different groups can be challenging. An early step is to identify what types of data exist, establish data sharing agreements, and locate gaps in available data to answer key monitoring and evaluation questions.

Another critical challenge for impact evaluation concerns attributing a desired policy change – and the subsequent changes in land or coastal use and management, and finally in ES and human well-being – to the ESA process (Ruckelshaus et al. 2013). Whether or not the targeted policy change occurs, and whether this results in expected changes in ES, can easily be tracked. However, attributing these changes to the ESA process requires a robust counterfactual group (e.g., cases with similar conditions but no ESA process, or where an ESA process failed), which is difficult to assemble given the small number of ESAs and the difficulty of finding true counterfactuals.

Many factors contribute to a change in policy and subsequent land and coastal management (Kingdon 1995; Sabatier 2007; McKenzie et al. 2014). Where large data sets exist, including baseline data, attribution analyses may usefully employ matching methods to create a more robust counterfactual (Andam et al. 2008, 2013; Ferraro et al. 2011; Ferraro & Hanauer 2014). Alternatively, given the relatively small number of ESA cases, drawing on synthetic control methods (Abadie et al. 2010) to construct a counterfactual (e.g., ESA vs. no ESA and associated policy changes) may provide a promising way to overcome this challenge. Attributing changes in ES and human well-being to changes in land and coastal management presents an additional challenge. Finally, as with other types of interdisciplinary research, monitoring changes in the full spectrum of human well-being associated with changes in ES requires the adoption of interdisciplinary methodologies and epistemologies, which can also contribute to an applied ESA process.
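
As a hedged illustration of the matching idea (a generic sketch, not a method we have applied to our cases), the code below pairs each ESA site with its nearest non-ESA site on standardized pre-intervention covariates and compares outcomes across the matched pairs. The data file, covariates, and outcome variable are assumptions.

```python
import numpy as np
import pandas as pd

# Hypothetical case data: one row per site, with pre-intervention covariates,
# a treatment flag (ESA conducted or not), and an observed policy/ES outcome.
df = pd.read_csv("esa_cases.csv")  # columns assumed: site, esa, gdp_pc, forest_cover, governance, outcome
covariates = ["gdp_pc", "forest_cover", "governance"]

# Standardize covariates so no single variable dominates the distance metric.
z = (df[covariates] - df[covariates].mean()) / df[covariates].std()
treated = df[df["esa"] == 1]
controls = df[df["esa"] == 0]

effects = []
for idx in treated.index:
    # Nearest-neighbor match on Euclidean distance in standardized covariate space.
    dists = np.linalg.norm(z.loc[controls.index].values - z.loc[idx].values, axis=1)
    match_idx = controls.index[np.argmin(dists)]
    effects.append(df.loc[idx, "outcome"] - df.loc[match_idx, "outcome"])

print(f"Matched-pair mean difference in outcome (ESA vs. no ESA): {np.mean(effects):+.2f}")
```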

5. Putting it all together

Our experience indicates that applying this framework can generate relevant scientific outputs and enhance uptake in policies and decisions. Table 1 summarizes the enabling conditions for each applied ESA we draw on to distil lessons and challenges in informing decisions (for more details on steps, see Appendix). There are common elements among these cases, but no two cases emphasize the same framework steps or follow the same iterative path. These applied ESAs have taught us that stakeholder engagement, iteration, and capacity-building are critical elements of the ESA process, which can help maintain and foster enabling conditions to inform decisions. They also demonstrate the value of including each step in the ESA process from scoping to communication. Similar efforts we undertook in the Democratic Republic of Congo and the Greater Virunga Landscape (in Rwanda, Uganda, and DRC), where we attempted to ‘hopscotch’ across the framework, have not explicitly influenced decisions in those regions.

Table 1. Enabling conditions of ESA cases, including the 5 Ps and cross-cutting elements of the applied ESA framework.

Although there are valuable general lessons to distil from the cumulative wealth of our applied ESA cases, it is also important to note that all steps in the framework should be tailored to the local decision context. We have found that the extent to which an applied ESA delves into and spends time on each step should be determined collaboratively with partners and in response to local needs. In other words, the framework is better used as a set of guidelines to inform practice than strict rules to follow rigidly.

In many cases, we find that the absence of enabling conditions (the 5 Ps) can increase the challenges faced in applied ESAs, although practitioner decisions at each step can help restore the quality of results and their uptake. In the Belize, Sumatra, and water funds cases, where the 5 Ps were in place, applied ESAs influenced the design of a policy, agreement, or mechanism. In contrast, in the other cases, where evidence of all five enabling conditions was less strong, we have encountered challenges that affected execution and completion of a number of the steps in the framework. In Borneo, for example, although policy questions were addressed in a broad way, we failed to target a specific policy window or opportunity to embed results directly from the ESA. As a consequence, its impact on specific resource decisions has remained limited, although its results have influenced public discussion and debate about development regionally and internationally. ESAs have been less successful in clearly informing policy and management where we were unable to identify or cultivate champions in local institutions to co-lead the applied ESA and advocate for its use in decision making.

6. Future directions

In this article, we aim to rigorously interrogate our practice of ESAs and encourage others to do so as well. The ES field can achieve more policy-relevant results and have greater impact on decisions by clearly characterizing the science-policy process, comparing cases, and drawing lessons from both successes and failures. This study has clear limitations: we did not compare to controls where no ESA occurred; we did not design a quasi-experimental approach to test alternative processes in the same decision context; and, we have chosen to self-evaluate instead of seeking the review of an objective third party. In addition, we developed this as a general framework to apply to ESAs for any type of natural resource management decision; yet, we suggest that each applied ESA will be conducted to inform a specific decision. The details will vary across decision context, and a richer set of lessons could be drawn from a deep investigation of a single ESA. Despite these constraints, we believe that this assessment helps to improve our practice and potential to embed ESAs in decisions that matter.

We would like to see more formative and summative evaluations of the process; the resulting lessons could be taken up by the ES community. In particular, there are a number of elements or steps in applied ESAs that are little studied, although they significantly affect the quality of the process and results, such as scoping or framing, stakeholder engagement, scenario development, and iteration. There is also a clear need to further identify and define enabling conditions for successful applied ESAs. We could benefit from better tools and approaches for visualizing and communicating results; future studies could address what kinds of language and products improve clarity and uptake of results in decisions. Our experience with monitoring and evaluation is limited; it would be helpful to understand better how and where monitoring should fit into the ESA framework. Likewise, we were able to link our enabling conditions only to influencing decisions; we still have significant questions about how and whether those decisions improve ES outcomes and human well-being. Finally, we suggest that more interdisciplinary studies, with a greater range of social science expertise, could help the ES science community to better understand complex human well-being outcomes and synthesize lessons from practice.

Acknowledgements

We wish to thank the many colleagues and partners who made this work possible. In particular, we drew lessons from applied ESAs co-developed by the following people: Chantalle Clarke, Maritza Canto, Samir Rosado, and Spencer Wood (Belize); Silvia Benitez, Alejandro Calvache, Juan Carlos Gonzales, Lisa Mandle, Heather Tallis, and Fernando Veiga (Colombia and water funds); Urvashi Narain, SS Negi, Anil Vaidya, Stacie Wolny, and Perrine Hamel (India); and Thomas Barano, Sulistyawan Andrew Dean, Oki Hadian, and Anna van Paddenburg (Borneo and Sumatra, Indonesia). More broadly, we thank all those at the Natural Capital Project who created the conditions for this research, including Rebecca Chaplin-Kramer, Gretchen Daily, Anne Guerry, Peter Kareiva, Stephen Polasky, Taylor Ricketts, and Mary Ruckelshaus. We thank Kirsten Howard for her insights and review. We are also grateful for the judicious suggestions of three anonymous reviewers.

Notes

1. Water funds are a rapidly growing approach to investment in watershed services, and we conducted an applied ESA to help prioritize investments and garner support in the Cauca Valley, Colombia, and other locations in partnership with The Nature Conservancy (TNC) and the Latin American Water Funds Partnership (Goldman-Benner et al. Citation2012; Vogl et al. Citationin prep).

References

  • Abadie A, Diamond A, Hainmueller J. 2010. Synthetic control methods for comparative case studies: estimating the effect of California’s tobacco control program. J Amer Stat Assoc. 105:493–505.
  • Alcamo J. 2008. Environmental futures: the practice of environmental scenario analysis. Chapter 6, The SAS approach: combining qualitative and quantitative knowledge in environmental scenarios. Amsterdam: Elsevier; p. 123–148.
  • Andam KS, Ferraro PJ, Hanauer MM. 2013. The effects of protected area systems on ecosystem restoration: a quasi-experimental design to estimate the impact of Costa Rica’s protected area system on forest regrowth. Cons Lett. 6:317–323.
  • Andam KS, Ferraro PJ, Pfaff A, Sanchez-Azofeifa GA, Robalino JA. 2008. Measuring the effectiveness of protected area networks in reducing deforestation. Proc Nat Acad Sci USA. 105:16089–16094.
  • Arkema KK, Guannel G, Verutes G, Wood SA, Guerry A, Ruckelshaus M, Kareiva P, Lacayo M, Silver JM. 2013. Coastal habitats shield people and property from sea-level rise and storms. Nat Clim Change. 3:913–918.
  • Arkema KK, Verutes G, Wood SA, Clarke C, Rosado S, Canto M, Rosenthal A, Ruckelshaus M, Guannel G, Toft J, et al. Forthcoming. Improving the margins: modeling ecosystem services leads to better coastal plans. Proc Nat Acad Sci USA.
  • Beierle TC, Konisky DM. 2001. What are we gaining from stakeholder involvement? Observations from environmental planning in the Great Lakes. Environ Plan C. 19:515–527.
  • Bhagabati N, Ricketts T, Sulistyawan TBS, Conte M, Ennaanay D, Hadian O, McKenzie E, Olwero N, Rosenthal A, Tallis H, Wolny S. 2014. Ecosystem services reinforce Sumatran tiger conservation in land use plans. Biol Cons. 169:147–156.
  • Bhagabati NK, Barano T, Conte M, Ennaanay D, Hadian O, McKenzie E, Olwero N, Rosenthal A, Suparmoko SA, Shapiro A, et al. 2012. A green vision for Sumatra: using ecosystem services information to make recommendations for sustainable land use planning at the province and district level. Washington (DC): The Natural Capital Project, WWF-US and WWF-International.
  • Braat LC, de Groot R. 2012. The ecosystem services agenda: bridging the worlds of natural science and economics, conservation and development, and public and private policy. Ecosys Serv. 1:4–15.
  • Cardinale BJ, Duffy JE, Gonzalez A, Hooper DU, Perrings C, Venail P, Narwani A, Mace GM, Tilman D, Wardle DA, et al. 2012. Biodiversity loss and its impact on humanity. Nature. 486:59–67.
  • Carpenter SR, DeFries R, Dietz T, Mooney HA, Polasky S, Reid WV, Scholes RJ. 2006. Millennium Ecosystem Assessment: research needs. Science. 314:257–258.
  • Carpenter SR, Mooney HA, Agard J, Capistrano D, DeFries RS, Díaz S, Dietz T, Duraiappah AK, Oteng-Yeboah A, Pereira HM, et al. 2009. Science for managing ecosystem services: beyond the Millennium Ecosystem Assessment. Proc Nat Acad Sci USA. 106:1305–1312.
  • Cash DW, Clark WC, Alcock F, Dickson NM, Eckley N, Guston DH, Jäger J, Mitchell RB. 2003. Knowledge systems for sustainable development. Proc Nat Acad Sci USA. 100:8086–8091.
  • Clark Labs. 2009. The Land Change Modeler for ecological sustainability [Internet]. IDRISI Focus Paper. Worcester (MA): Clark Labs. Available from: http://clarklabs.org/applications/upload/Land-Change-Modeler-IDRISI-Focus-Paper.pdf
  • Clarke C, Canto M, Rosado S. 2013. Belize Integrated Coastal Zone Management Plan. Belize City: Coastal Zone Management Authority and Institute; p. 423.
  • Cosslett CE, van Paddenburg A, editors. 2012. Heart of Borneo: investing in nature for a green economy. A synthesis report. Jakarta: WWF Heart of Borneo Global Initiative.
  • Daily GC, Polasky S, Goldstein J, Kareiva PM, Mooney HA, Pejchar L, Ricketts TH, Salzman J, Shallenberger R. 2009. Ecosystem services in decision making: time to deliver. Front Ecol Env. 7:21–28.
  • Ferraro PJ, Hanauer MM. 2014. Quantifying causal mechanisms to determine how protected areas affect poverty through changes in ecosystem services and infrastructure. Proc Nat Acad Sci USA. 111:4332–4337.
  • Ferraro PJ, Hanauer MM, Sims KRE. 2011. Conditions associated with protected area success in conservation and poverty reduction. Proc Nat Acad Sci USA. 108:13913–13918.
  • Foley JA, DeFries R, Asner GP, Barford C, Bonan G, Carpenter SR, Chapin FS, Coe MT, Daily GC, Gibbs HK, et al. 2005. Global consequences of land use. Science. 309:570–574.
  • Goldman-Benner RL, Benitez S, Boucher T, Calvache A, Daily G, Kareiva P, Kroeger T, Ramos A. 2012. Water funds and payments for ecosystem services: practice learns from theory and theory can learn from practice. Oryx. 46:55–63.
  • Goldman-Benner RL, Benitez S, Calvache A, Ramos A, Veiga F. 2013. Water funds: a new ecosystem service and biodiversity conservation strategy. In: Levin SA, editor. Encyclopedia of biodiversity. 2nd ed. Vol. 7. Cambridge: Elsevier; p. 352–366.
  • Goldstein JH, Caldarone G, Duarte TK, Ennaanay D, Hannahs N, Mendoza G, Polasky S, Wolny S, Daily GC. 2012. Integrating ecosystem-service tradeoffs into land-use decisions. Proc Nat Acad Sci USA. 109:7565–7570.
  • Guerry AD, Ruckelshaus MH, Arkema KK, Bernhardt JR, Guannel G, Kim CK, Marsik M, Papenfus M, Toft JE, Verutes G, et al. 2012. Modeling benefits from nature: using ecosystem services to inform coastal and marine spatial planning. Int J Biodiv Sci Ecosys Serv Manag. 8:107–121.
  • IPBES [Internet]. c2014. Bonn: Intergovernmental Platform on Biodiversity and Ecosystem Services. [cited 2014 Apr 27]. Available from: http://www.ipbes.net/
  • Kingdon JW. 1995. Agendas, alternatives and public policies. Boston (MA): Addison Wesley Educational.
  • Knight A, Cowling RM, Rouget M, Balmford A, Lombard AT, Campbell BM. 2008. Knowing but not doing: selecting priority conservation areas and the research–implementation gap. Cons Bio. 22:610–617.
  • Laurans Y, Rankovic A, Billé R, Pirard R, Mermet L. 2013. Use of ecosystem services economic valuation for decision making: questioning a literature blindspot. J Environ Manag. 119:208–219.
  • MacDonald DH, Bark RH, Coggan A. 2014. Is ecosystem service research used by decision-makers? A case study of the Murray-Darling Basin, Australia. Land Ecol. doi:10.1007/s10980-014-0021-3
  • McKenzie E, Posner S, Tillman P, Bernhardt J, Howard K, Rosenthal A. 2014. Understanding the use of ecosystem service knowledge in decision making: lessons from international experiences of spatial planning. Environ Plan C. 32. doi:10.1068/c12292j
  • McKenzie E, Rosenthal A, Bernhardt J, Girvetz E, Kovacs K, Olwero N, Toft J. 2012. Developing scenarios to assess ecosystem service tradeoffs: guidance and case studies for InVEST users. Washington (DC): World Wildlife Fund.
  • Millennium Ecosystem Assessment (MA). 2005. Ecosystems and human well-being: the assessment series. Washington (DC): Island Press.
  • Myint SW, Wang L. 2006. Multicriteria decision approach for land use land cover change using Markov chain analysis and a cellular automata approach. Can J Rem Sens. 32:390–404.
  • Neßhöver C, Timaeus J, Wittmer H, Krieg A, Geamana N, van den Hove S, Young J, Watt A. 2013. Improving the science-policy interface of biodiversity research projects. GAIA. 22:99–103.
  • Reid RS, Nkedianye D, Said MY, Kaelo D, Neselle M, Makui O, Onetu L, Kiruswa S, Ole Kamuaro N, Kristjanson P, et al. 2009. Evolution of models to support community and policy action with science: balancing pastoral livelihoods and wildlife conservation in savannas of East Africa. Proc Nat Acad Sci USA. doi:10.1073/pnas.0900313106
  • Ruckelshaus M, McKenzie E, Tallis H, Guerry A, Daily G, Kareiva P, Polasky S, Ricketts T, Bhagabati N, Wood S, et al. 2013. Notes from the field: lessons learned from using ecosystem service approaches to inform real-world decisions. Ecol Econ. doi:10.1016/j.ecolecon.2013.07.009
  • Sabatier PA. 2007. Theories of the policy process. Boulder (CO): Westview Press.
  • Seppelt R, Dormann C, Eppink F, Lautenbach S, Schmidt S. 2011. A quantitative review of ecosystem service studies: approaches, shortcomings and the road ahead. J Appl Ecol. 48:630–636.
  • Seppelt R, Fath B, Burkhard B, Fisher JL, Grêt-Regamey A, Lautenbach S, Pert P, Hotes S, Spangenberg J, Verburg PH, et al. 2012. Form follows function? Proposing a blueprint for ecosystem service assessments based on reviews and case studies. Ecol Indic. 21:145–154.
  • Sharp R, Tallis HT, Ricketts T, Guerry AD, Wood SA, Chaplin-Kramer R, Nelson E, Ennaanay D, Wolny S, Olwero N, et al. 2014. InVEST user’s guide. Stanford (CA): The Natural Capital Project [Internet]. Available from: http://ncp-dev.stanford.edu/~dataportal/invest-releases/documentation/current_release/
  • Tallis H, Polasky S. 2009. Mapping and valuing ecosystem services as an approach for conservation and natural-resource management. Ann N Y Acad Sci. 1162:265–283.
  • Tallis H, Wolny S, Lozano JS. 2011. Including ecosystem services in mitigation [Internet]. Report to the Colombian Ministry of Environment, Mines and Territorial Development. Stanford (CA): Natural Capital Project, Stanford University. Available from: http://naturalcapitalproject.org/pubs/Colombia_Mitigation_Report_Final_2011.pdf
  • TEEB. 2014. Geneva: The Economics of Ecosystems and Biodiversity [Internet]. [cited 2014 Apr 27]. Available from: http://www.teebweb.org/
  • Verutes G, Arkema KK, Rosenthal A, Clarke C, Wood S, Canto M, Rosado S, Ruckelshaus M, Guerry A. In prep. Applying ecosystem service science to policy for marine spatial planning.
  • Vihervaara P, Rönkä M, Walls M. 2010. Trends in ecosystem service research: early steps and current drivers. AMBIO. 39:314–324.
  • Vogl AL, Tallis H, Douglass J, Sharp R, Veiga F, Benitez S, León J, Game E, Petry E, Guimarães J, Lozano JS. 2013. Resource Investment Optimization System: introduction & theoretical documentation [Internet]. Stanford (CA): Natural Capital Project, Stanford University. Available from: http://ncp-dev.stanford.edu/~dataportal/rios_releases/RIOSGuide_Combined_8-22-13.pdf
  • Vogl AL, Tallis H, Wolny S, Sharp R, Veiga F, Benitez S, León J, Lozano JS, Petry P, Game E, et al. In prep. Investing in watershed services: combining biophysical, economic, and social data to identify cost-effective investment strategies.
  • Vogl AL, Wolny S, Hamel P, Johnson J, Rogers M, Dennedy-Frank PJ. 2014. Valuation of hydropower services. Report prepared for the World Bank. Stanford (CA): The Natural Capital Project, Stanford University.
  • Wood SA, Guerry AD, Silver JM, Lacayo M. 2013. Using social media to quantify nature-based tourism and recreation. Sci Rep. 3:2976.

Appendix. Summary of steps in the science-policy process for applied ecosystem service assessments.