
Assessing the impact of strategic environmental assessment

Pages 177-187 | Received 19 Apr 2014, Accepted 21 May 2014, Published online: 12 Jun 2014

Abstract

This paper examines the impact of strategic environmental assessment (SEA) – its direct impact on policies, plans and programs (PPPs) and its indirect and longer-term impacts. Criteria for assessing SEA's impact are developed and applied in the Canadian context based on a survey of SEA practitioners, and the perceived opportunities and challenges to realizing the full impact of SEA are explored. Results indicate that SEA does have a direct impact on PPPs, but its indirect impacts are either constrained or difficult to distinguish from an agency's normal policies, practices and innovations. Amongst the most significant challenges to realizing the indirect impacts of SEA are the lack of a shared vision for SEA by those responsible for implementation, and the incongruence between the need for rapid results by way of PPP approval and the long-term commitment required to realize many of the benefits of SEA. Indirect impacts require more explicit consideration at the outset of the SEA design process than is currently the case if the benefits of SEA are to be fully recognized.

Introduction

Effectiveness is a long-standing issue in environmental assessment (Chanchitpricha & Bond 2013) and has gained considerable traction in the context of strategic environmental assessment (SEA; Stoeglehner et al. 2009; Bina et al. 2011; SEPA 2011; EPA 2012; Tetlow & Hanusch 2012). There have been several evaluations of SEA effectiveness, focused primarily on process elements and, to a lesser extent, on the role of SEA within the policy, plan or program (PPP) decision-making context (Arbter 2003; Retief 2007; CESD 2008; Noble 2009; Wang et al. 2009; De Montis et al. 2014). Most reviews of SEA's effectiveness, however, have focused on the input and process requirements of SEA – that SEA be done, that it follows a set of defined steps, that it meets certain procedural or good practice standards, and that its results be reported in a certain way.

Evaluations of procedural effectiveness, that good practice elements and directives are followed, can be easily misunderstood as evidence of whether SEA is having an impact. There is a difference between the quality of an assessment process and its effectiveness (Lawrence 1997; Kauppinen et al. 2006; Theophilou et al. 2010), and evaluating SEA procedural or policy compliance is not the same as evaluating whether SEA is having an impact. An SEA application that is poorly constructed, and does not comply with specific policy or directive-based requirements, can still have a positive impact on PPP decisions and, indirectly, on subsequent PPPs, strategic initiatives, decisions or practices. The value of SEA is, in part, a function of the extent to which it influences and adds value to decision-making (Partidário 2000); however, as Tetlow and Hanusch (2012) explain, it is now recognized that SEA can have multiple and indirect, long-term benefits beyond the immediate, visible effects on PPPs and decisions.

Evaluating the effectiveness of SEA, beyond basic input or procedural characteristics, is a complex and uncertain task. Strategic initiatives are sometimes formulated in abstract terms; decisions made in PPP implementation are often difficult to link back to particular SEA results; and there are often few direct links between decisions made at the strategic level and long-term PPP outcomes (Cherp et al. 2008; Gachechiladze et al. 2009). Further, what makes an SEA effective will vary from one context and application to the next. This does not mean, however, that the research community should not focus attention on the evaluation of SEA effectiveness, and on the development of suitable criteria to do so. Differences in the context, design and purposes of SEA are insufficient reason for not assessing SEA's value added. If SEA is to gain traction amongst the professional practice and regulatory communities, the scholarly community must do a better job of evaluating its impact and providing the right tools to do so. As government departments and agencies are increasingly subjected to evaluations and audits of SEA, and its merits challenged (CESD 2008; Bregha 2011), it is important to examine the added value of SEA beyond the process itself and beyond the scope of the particular PPP at hand.

In this paper, we focus on the impact of SEA, both its direct impact on PPPs and decisions, and its indirect impact beyond the particular PPP for which the SEA was implemented. Our goal is not to present another evaluation of SEA performance per se; rather, this paper develops and applies a set of criteria for evaluating the direct and indirect impact of SEA and, based on empirical investigation in the Canadian context, attempts to explain the primary opportunities and challenges to realizing SEA's full impact (Note 1). Although our findings are rooted in the experiences of Canadian SEA practitioners, the underlying factors that explain these results are broadly applicable to advancing SEA's impact in other regions.

Conceptualizing SEA effectiveness as direct and indirect impacts

Chanchitpricha and Bond (2013, p. 66) describe effectiveness as ‘a troublesome term,’ characterized by multiple definitions and dimensions across the range of impact assessment processes, including health impact assessment (Wismar et al. 2007; Harris-Roxas & Harris 2013), social impact assessment (O'Faircheallaigh 2009), environmental impact assessment (EIA; Heinma & Põder 2010) and SEA (Noble 2009). At the most basic level, the effectiveness of SEA is a function of its inputs (i.e. institutional requirements or provisions for SEA), process (i.e. assessment methodology), outputs (i.e. results and changes in PPPs) and outcomes (i.e. longer-term change as a result of the SEA experience). Understanding the effectiveness of SEA inputs and processes is important, but the outputs and outcomes of SEA are the ultimate measures of its value added. Inputs and process do not directly and independently indicate the effectiveness of SEA; a quality SEA based solely on compliance with a directive or prescribed methodology might not mean an ‘effective’ SEA based on influence and outcome. There is usually a requirement, for example, that certain principles are reflected in SEA, including public participation and the consideration of environmental concerns related to a proposed PPP, but there is often no requirement that such considerations shape PPP decisions.

Stoeglehner et al. (2009) have argued for a shift in focus of SEA research from the development of legislation, guidelines and methodologies toward improving the effectiveness of SEA. Although some scholars have focused on the longer-term impacts of SEA under informal arrangements (e.g. community-based SEA, Sinclair et al. 2009), when effectiveness has been examined beyond input and procedural aspects, the focus has largely been on the extent to which SEA influences PPPs toward the achievement of specific environmental or sustainability goals (Song et al. 2011; Zhou & Sheate 2011), or its influence on PPP decisions (Partidário 2000; Noble 2009). Understanding the effectiveness of SEA requires focusing also on the indirect, often subtle, impacts of SEA that may occur beyond the scope of the specific application or PPP.

Several scholars have suggested approaching SEA effectiveness in terms of its immediate or direct impacts (i.e. outputs) and its indirect impacts (i.e. outcomes) (Table 1). The direct impacts of SEA refer to its impacts on PPPs or decisions, including, for example, the identification and management of impacts, representation of stakeholder values in the PPP or the realization of PPP objectives (Runhaar & Driessen 2007; Noble 2009; Stoeglehner et al. 2009; EPA 2012). These may be considered the more immediate effects of SEA and its influence on, or value added to, the development, scope or implementation of a PPP. Outputs are short term or immediate, possibly measurable, and realized directly through SEA implementation and PPP development, modification and improvement (van Buuren & Nooteboom 2009). Outputs are an important measure of SEA efficacy if SEA is to be a valuable tool for integrating environmental considerations into PPPs.

Table 1 Characteristics of SEA outputs and outcomes.

The indirect impacts of SEA refer to its impacts or influence beyond the PPP or decision context (Runhaar & Driessen 2007; Stoeglehner et al. 2009; EPA 2012). These include the broader or longer-term impacts of SEA on learning, on other processes or actions, influences on institutional and management practices, or improvements to environmental or socioeconomic conditions or standards due to increased awareness. Indirect impacts may materialize as new ideas or innovations in subsequent PPPs or rounds of decision-making, impacts on processes and situations other than those of which the SEA is part (Bina et al. 2011), changes or improvements in administrative structures, or the initiation of new research programs. Such impacts are not easily identified due to their implicit nature – they are often unplanned for and subtle as learning and longer-term transformations of individual, professional and organizational norms and practices unfold (Kornov & Thissen 2000; Owens & Cowell 2006; Sinclair et al. 2008; Jha-Thakur et al. 2009). Understanding the efficacy of SEA requires that both its direct and indirect impacts be considered.

Methods

The research consisted of two phases: (1) the development of SEA impact evaluation criteria and (2) a survey of the impact(s) of SEA based on recent practice experiences. Draft evaluation criteria were developed based on the international scholarly literature including, but not limited to, Thissen (2000), Fischer and Gazzola (2006), Noble (2009), Bina et al. (2011), Partidário (2012) and Zhang et al. (2013). Two types of criteria were identified – those focused on evaluating the direct impacts of SEA on a PPP, including its immediate outputs, and those focused on evaluating the indirect or longer-term impacts of SEA. Draft criteria were sent to 11 academics for peer review. Reviewers were selected based on their research expertise or publications in the area of SEA, and asked whether the criteria were comprehensive, whether any required revision and whether any should be removed. Based on reviewer feedback, several criteria were rephrased for clarity and others merged due to perceived duplication. Two new criteria were added, namely criteria A11 and A12. The final list of criteria, based on the literature and reviewer feedback, is presented in Table 2.

Table 2 Criteria for evaluation of SEA efficacy.

The criteria were sent to SEA practitioners from across Canada using an online survey platform, Fluid Surveys. An initial list of participants was identified based on recent SEA reports and then expanded using a snowball sampling design (Bryman 2012). A total of 33 practitioners participated in the study: 21 from federal and provincial governments and 12 private sector consultants. Participants were asked to identify the recent SEA(s) with which they were involved and, drawing upon their experience, evaluate the extent to which the SEAs satisfied each of the direct and indirect criteria (Table 2) using a nine-point scale, from 1 = strongly disagree that the SEA satisfied the criterion to 9 = strongly agree that the SEA satisfied the criterion. Participants were also asked to provide qualitative comments following each criterion, and to comment specifically on the following (a sketch of how such responses can be summarized appears after this list):

  • significant limitations of, or challenges to, the SEA and

  • the most important learning outcomes based on their experience with SEA.
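For illustration only: the paper does not report its analysis software or exactly how the nine-point responses were aggregated, but the medians and agreement percentages reported in the Results could be computed along the following lines. This is a minimal Python sketch with hypothetical ratings; the binning of 1–3 as disagreement, 4–6 as neutral and 7–9 as agreement is an assumption, not the authors' stated method.

```python
import statistics

# Hypothetical ratings: each criterion maps to the 1-9 scores given by participants.
responses = {
    "A1": [7, 8, 6, 9, 7, 5, 8],  # illustrative values only
    "A5": [2, 8, 5, 3, 9, 4, 7],
}

def summarize(ratings):
    """Return median plus percent agreeing (7-9), neutral (4-6) and disagreeing (1-3)."""
    n = len(ratings)
    agree = 100 * sum(1 for r in ratings if r >= 7) / n
    neutral = 100 * sum(1 for r in ratings if 4 <= r <= 6) / n
    disagree = 100 * sum(1 for r in ratings if r <= 3) / n
    return statistics.median(ratings), agree, neutral, disagree

for criterion, ratings in responses.items():
    med, agree, neutral, disagree = summarize(ratings)
    print(f"{criterion}: median={med}, agree={agree:.1f}%, "
          f"neutral={neutral:.1f}%, disagree={disagree:.1f}%")
```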

Results

Results are presented in four sections. First, participants' responses concerning the direct impacts of SEA are presented, followed by the indirect or longer-term impacts of SEA. These results are synthesized in tabular format and we focus on a select set of criteria for which there was consensus, significant disagreement, or considerable variability. This is followed by the reported challenges to SEA and most important learning outcomes. Results represent the collective experience of 33 individuals across 52 SEA initiatives.

Direct impacts

Participants' assessments of the direct impacts of SEA are summarized in Table 3. There were no significant differences between government and consultant participants, and responses were neutral on many criteria. Participants reported that SEA helped to identify the potential impacts of the PPP (A1), and identified strategies for avoidance or reduction of adverse impacts, or strategies for enhancement of positive impacts (A2). Several government participants commented that the SEA provided increased awareness and understanding of stakeholder concerns and interests, provided information to better align the PPP with those interests and priorities, and helped to identify both positive and adverse impacts and opportunities to manage them. Several consultants commented on the baseline data generated during the SEA, and the opportunity it provided to identify knowledge gaps for managing potential impacts and to inform monitoring. Participants also reported that SEA was effective in helping ensure compliance of the PPP with the agency's/organization's mandate or policy commitments (A6), and that the SEA did not cause undue delay to decision-making processes (A11). Six participants (18.2%), however, disagreed with A11, with one consultant commenting that the SEA was time-consuming and offered few benefits.

Table 3 Direct impacts of SEA on PPPs.

There was considerable variability in reported experiences based on A3, A5 and A7 (Table 3). For example, approximately 33% reported that SEA was integrated with PPP development, or provided information early enough to inform its development (A3), while 33% disagreed with this statement. One consultant reported that SEA input was limited because it was not formally linked to the decision-making process but rather served as an information provision tool. Similarly, 27.3% reported that the SEA gave sufficient consideration to viable PPP alternatives (A7), but 27.3% disagreed and 45% were neutral in their responses.

The greatest variability was in response to A5 – whether SEA ensured that stakeholder interests, including Aboriginal interests, were represented in the final PPP. Approximately 34% of participants suggested little or no integration of stakeholder concerns in the PPP as a result of the SEA; a near equivalent number (37%) reported positive impacts, such as incorporating priority issues and concerns in the PPP and ensuring that Aboriginal perspectives were addressed prior to project-level development applications. One government participant reported that the process, although not a directive-based SEA, ensured a focused discussion on community concerns and stakeholder priority interests – in this case emergency preparedness and the risks of offshore energy exploration. A second participant commented that SEA was able to effectively address stakeholder and Aboriginal interests, when compared to past EIAs, and bring the parties together with regulators in a non-adversarial environment prior to receiving project applications for development. The participant said that SEA led to greater transparency in the planning process (A10). At the same time, however, another participant commented that it was difficult to relate increased transparency and accountability to the SEA process, as such matters have more to do with the priorities and practices of the agency conducting the assessment.

Indirect impacts

Participants' assessments of the indirect impacts of SEA are summarized in Table 4. There were no significant differences between government and consultant participants, with the exception of B2 (p = 0.036; U = 73.5). Government participants reported that SEA had helped to realize broader organizational or institutional goals and objectives, beyond the scope of the PPP (7 ± 0.83); consultants tended to disagree with this statement (3 ± 0.31) (Table 4). Overall, there was strong agreement only with B4 – the SEA helped to identify or stimulate new research directions or needs (e.g. policy or program gaps). One government participant, for example, reported that SEA identified data gaps in baseline knowledge and resulted in new information gathering. A minority of participants (9.1%) disagreed that the SEA was effective in helping identify or stimulate new research directions or needs, with one government participant commenting that establishing new research directions is typically outside the mandate of the agency responsible for the SEA at hand.
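The paper does not name the test behind the reported U and p values, but a Mann–Whitney U test is the standard nonparametric comparison of two independent groups of ordinal ratings. A minimal sketch under that assumption, with placeholder data (the raw survey responses are not published):

```python
from scipy.stats import mannwhitneyu

# Placeholder 1-9 ratings on criterion B2; the actual responses are not public.
government = [7, 8, 6, 7, 8, 7, 9, 6]
consultants = [3, 2, 4, 3, 5, 3, 2]

# Two-sided test of whether the two groups' rating distributions differ.
u_stat, p_value = mannwhitneyu(government, consultants, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```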

Table 4 Indirect impacts of SEA for PPPs.

The median response was relatively neutral on all other criteria (Table 4). Some participants commented that insufficient time had elapsed since the SEA to comment on the indirect or longer-term benefits; others reported that there was insufficient information to make such a determination or that it was too difficult to attribute any observed changes to the SEA process. Although the median responses were neutral, there was still considerable variability. For example, 27.5% reported that SEA provided information for use in subsequent PPP processes, assessments or decisions (B1), but 24.3% disagreed. One participant commented that SEA facilitated subsequent regulatory review and project approvals by providing information and data to assist in impact assessments, serving as ‘a first step in decision making to be followed by more detailed project specific EIA.’ A second consultant reported that SEA was useful in generating high-level information, but that the information was not useful to support specific decisions or development applications.

Consideration must be given to the nature and intent of the SEA when evaluating the usefulness of the information it generates. One government participant explained that, based on his experience, the SEA was intended to be a reference document and provide a high-level overview of the potential effects of, and mitigation options for, typical development activities in the region. Given the intended purpose of the SEA, the information generated would not be useful for environmental monitoring or aid in the prediction of project-specific impacts. This does not mean that the information generated was of little value; rather, consistent with the ‘one concept, multiple forms’ view of SEA (Verheem & Tonk 2000), information was generated for a particular purpose and that purpose may not necessarily meet all needs or expectations.

Responses were similarly distributed on B8, with 58.7% neutral as to whether SEA led to improved efficiencies at the next level of decision-making. One consultant explained that it was too early to tell if the SEA informed next-level decisions, but went on to report that the intent of the SEA was to find efficiencies in the review process for subsequent PPPs, including the early identification of potential effects. A government participant reported that improved efficiency was an underlying objective, but that implementation was a problem: ‘… best management practices were identified, such that these issues would not have to be addressed in subsequent project assessments, but the plan was never fully implemented.’

Approximately 24% of participants reported that the SEA did lead to improved efficiencies at the next level of decision-making. For example, one government participant reported that SEA improved subsequent planning and development activities by ‘providing baseline information and determining best mitigation practices to assist in expected assessments for multiple projects.’ Approximately 17%, however, disagreed, with one participant reporting that the degree of actual success has been limited because there was resistance, from some project-specific EIA proponents, to having anything other than a large ‘stand-alone’ EIA document. A practitioner noted that part of the problem was the perception that the SEA was supposed to help ‘avoid doing anymore studies at a project level – rather than improve the next level of efficiencies.’

Significant limitations or challenges

Participants identified several challenges to SEA, which could be grouped into five related themes (Table 5). Many of the challenges were common to both government and consultant participants. However, only government participants identified limited baseline data as a major challenge or constraint, noting the lack of data available to support SEA application, particularly for baseline studies; only consultants reported the lack of a clear and common vision amongst government agencies responsible for SEA as a major impediment to success, with one commenting that the lack of a clear vision was the main reason for the lack of implementation of SEA recommendations. The consultant further commented that if SEA is to influence decisions, then the departments and agencies responsible for the SEA must share a similar vision with those other departments and agencies who are either involved in the SEA or in PPP implementation – currently this is not the case.

Table 5 Participants' comments on the realized challenges and limitations to effective SEA.

Both government and consultant participants raised the lack of knowledge about SEA's purpose and methodology. This result was not surprising, and is consistent with recent explorations of SEA in the Arctic (Noble et al. 2013). Of particular concern was the lack of understanding of the differences between SEA and EIA, and of the benefits of adding another layer of assessment to the current project regulatory environment. One consultant commented that part of the problem is that the same practitioners who carry out project EIAs typically carry out SEAs and, as a result, SEA tends to adopt an EIA approach.

The final issue identified was the limited time and resources for conducting SEA and for ensuring long-term commitment to, and implementation of, SEA results. The reasons, however, were many and varied. Several government participants identified lengthy consultation processes as delaying PPP implementation, suggesting the need for SEAs to be more efficient if they are to be more effective – a statement that helps explain the variability in responses to A5 (Table 3), whether the SEA ensured that stakeholder interests were represented in the final PPP. Limited professional expertise to conduct SEAs, and the lack of sufficient time and resources dedicated to conducting SEA and following through to monitor and assess its influence over the long term, were frequently reported challenges. One participant commented that it could take many years before the real benefits of SEA application are realized, referring to the indirect impacts of SEA (Table 4), but this long-term perspective is in conflict with demands for agencies to demonstrate that they are meeting certain policy or political goals and objectives. One consultant reported that, in his experience, the SEA was done simply to meet the requirement that it be done, not to influence decisions.

Learning outcomes

The final phase of the survey asked participants to identify what they considered to be the most important learning outcomes about the impact and influence of SEA based on their recent experiences. Participants identified several lessons, many of which were context- or case-specific; however, three overarching learning outcomes were reported across study participants.

Several participants reported that SEA was not as effective as it could have been because it was a stand-alone process and lacked integration with, or linkages to, other forms of assessment and decision-making. The single-window approach to SEA (Partidário 2012), in which SEA was applied to assess the impacts of a proposed PPP, was seen as having limited influence. As reported by one government participant, most SEAs are conducted too late to have an influence; they are ‘… usually undertaken too late in the approval process to have a real impact on decision making in the end.’

Participants emphasized the importance of a shared vision and expectations amongst those involved in SEA, but more importantly amongst those involved in the implementation of its recommendations. A consultant participant commented: ‘One thing I learned was the need for a shared vision at the outset; if there is not a shared vision then the SEA process may unfold good enough, but implementation will be challenging.’ A government participant shared a similar experience, reporting significant resistance to the SEA's recommendations – specifically those recommendations that bridged multiple government departments and agencies or were beyond the scope of the department or agency responsible for the SEA itself. The participant explained that ensuring SEA has an impact means ensuring that those responsible for implementing its recommendations share a common vision and set of expectations about what the SEA is to deliver – and that this commitment is more important than being involved in the SEA process itself.

The need for long-term commitment to fully realize the benefits of SEA was a third common theme. Several participants commented that SEA is still a relatively new concept, when compared to EIA, and many government departments and agencies are only now conducting SEA for the first time; as a result, its longer-term impacts have yet to materialize. One participant explained that governments need to give greater consideration to the time required to realize the impacts of SEA. A consultant, with experience in regional assessments, suggested that because of the time required to realize strategic outcomes, SEA might be more influential, and successful, when integrated with longer-term, regional land use planning processes that adopt SEA frameworks or principles, rather than approaching SEA as a stand-alone process.

Observations and lessons

Based on the reported SEA experiences of study participants, the proposed SEA evaluation criteria were not fully met, and participants perceived that significantly more direct impact, or output, criteria were met than indirect or outcome criteria. This does not mean that SEA is wholly ineffective or lacks impact or influence; rather, SEA's design and objectives vary considerably from one sector (e.g. national parks policy) to the next (e.g. offshore oil and gas planning), and from stand-alone PPP appraisals to those SEAs integrated with long-term regional planning and assessment frameworks. Further, many of the indirect effects of SEA often go unreported in formal audit and evaluation studies and perhaps, as a result, unnoticed.

It was clear from our results that SEA is perceived as effective in terms of identifying the potential PPP impacts and finding ways to manage them; in helping ensure PPP compliance with an agency's or organization's mandate or policy commitments; and in ensuring a degree of efficiency in the PPP process. These results are consistent with recent evaluations of SEA effectiveness in both Ireland (EPA 2012) and Scotland (SEPA 2011). It is less clear, however, whether SEA successfully influences or informs the development of PPPs, directs their implementation and ensures that stakeholder interests are represented in the final PPP or decisions taken. Whether SEA helps to realize broader organizational or institutional goals and objectives, beyond the scope of the PPP itself, was a polarizing issue between governments and consultants. And, aside from SEA identifying or stimulating new research or PPP needs, and improving overall awareness of PPP issues within an organization, there was little clarity on the indirect impacts or outcomes of SEA. Results allow for no clear conclusion, based on participant perspectives, as to whether SEA is leading to improved environmental or socioeconomic conditions.

Technical constraints were amongst the minority of challenges identified to realizing SEA's impact. Participants expressed concern that the practitioners who conduct SEAs are typically trained in EIA and, as a result, the impact of SEA is not fully realized. This is consistent with Verheem and Dusik's (2011) observation that, in many countries, SEA is still practiced as an EIA-based tool. The most common issues raised, however, identified by more than two-thirds of participants, concerned the lack of a shared vision for SEA by those responsible for implementation, the lack of knowledge and understanding of what SEA is to achieve, and constraints on the time, resources and political willingness needed to commit to SEA and its recommendations. The need for a shared or common vision for SEA at the outset of the process and the long-term commitment required to realize the benefits of an SEA were thus amongst the most significant learning outcomes identified by participants.

In the sections that follow, we venture a number of observations that attempt to explain the above results. Although our empirical findings are rooted in the experiences of SEA practitioners in Canada, we believe that the underlying factors that explain these results are not context-specific and are broadly applicable to advancing SEA's impact in other regions.

Understanding and advancing SEA's impact

First, although many of the SEAs on which participants based their assessments were recent or still in progress, participants still identified several immediate or direct impacts of SEA. That more direct impacts were identified than indirect ones is not surprising. The direct impacts or outputs of SEA typically need only a relatively short time to be realized (during or immediately following an SEA), and even agencies conducting SEA for the first time reported the direct impacts of SEA outputs. Indirect impacts, however, often require a longer time to manifest (EPA 2012); they are often subtle, difficult to measure and may easily go unnoticed (Thissen 2000; Owens & Cowell 2006; Runhaar & Driessen 2007; Stoeglehner et al. 2009).

It is often the case that more outcomes are not identified or realized because of the ad hoc or one-time application of many SEAs (Noble 2013), using SEA as a yardstick against which the acceptability of a PPP is measured rather than as a decision support tool for PPP development and long-term sustainability. It appears that the short-term objective to enhance the approval of an agency's PPP is the norm in SEA practice – at least under formal or directive-based approaches – after which SEA activities are discontinued because the objectives, conducting the SEA itself and making the PPP decision, have been achieved. Exceptions occur when SEA is part of a broader regional or land use planning process, where plans are subject to periodic reviews and, as a result, the SEA is revisited.

Second, the need for improved follow-through and follow-up on SEA to evaluate its impacts has been emphasized both in the scholarly literature (e.g. Cherp et al. 2008; Gachechiladze et al. 2009) and by the results of our study; but the focus of both research and practice (e.g. SEPA 2011; EPA 2012) has been primarily on following up and evaluating SEA recommendations and the direct impact(s) of SEA on the PPP at hand. There have been several prescriptions, ranging from law to policy options, on how to better ensure that something is done with SEA results to realize their direct impact on PPP decisions (Gibson et al. 2010). Although it is important to ensure the influence of SEA on a PPP, there is a need to focus more attention on following up and evaluating the indirect impacts or outcomes of SEA beyond the PPP if SEA's value added is to be fully realized. Even in those cases where none of the recommendations resulting from an SEA is implemented, there is still an opportunity for SEA to have a longer-term, indirect impact. Thérivel and Minas (2002), for example, report that even when a strategic action remains unchanged after SEA, the SEA may still provide a better understanding of environmental and sustainability issues. This is consistent with the responses of our study participants, who identified such long-term and indirect impacts of SEA as the realization of broader organizational or institutional goals and objectives, or the recognition of other PPP gaps and the need for new research or PPP development.

Third, unlike SEA outputs, which are often linked to the objectives of the SEA (e.g. the SEA terms of reference), or the SEA process (e.g. requirements for participation and engagement), most indirect impacts resulting from SEA, or outcomes, are not explicitly planned for. A significant obstacle to recognizing the often indirect impacts of SEA is the difficulty in distinguishing between an agency's or organization's normal policies, management practices and innovations and those benefits emerging due to SEA application – an issue raised by many of our government study participants. The challenge of linking SEA to outcomes is exacerbated when the SEA process is tightly coupled with policy or planning itself, such as when SEA is used to guide the development of a PPP (Partidário 2012). Jha-Thakur et al. (2009), for example, report that when SEA is part of a wider, well-developed planning context, it may not necessarily enhance certain levels of learning, which may already be well developed. That said, much of the literature indicates that SEA is most effective when integrated with the PPP development and decision-making process (Brown & Thérivel 2000; CCME 2009; Partidário 2012; Tetlow & Hanusch 2012). The Scottish EPA, for example, reports that SEA is more influential when approached as a ‘plan shaper’ versus as a ‘fine tuner’ of existing PPPs or decisions (SEPA 2011). The inability to separate SEA's indirect impacts or longer-term added value from an agency's or organization's management practices does not diminish its value or impact. Rather, the challenge is how to link these impacts and outcomes to SEA, versus the normal practices or influences of the agency implementing the SEA, or exogenous factors.

As reported by our study participants, increased pressures to meet specific departmental or agency objectives often translate into a focus on, first, process and, second, output-based goals and objectives. A focus on longer-term impacts, or outcomes, is rare, and when they are considered, the association is typically with broader regional environmental planning processes or frameworks that include SEA rather than with the SEA per se. Although typically outside any legal requirements or formal obligations for SEA, if the full range of SEA's potential impacts is to be realized, and appropriately credited to the SEA process, then indirect impacts and outcomes require much more explicit consideration at the outset of the SEA design process than is currently the case – when establishing the guidance, objectives or terms of reference for the assessment.

These indirect impacts and longer-term outcomes must be more specific than simply ‘sustainability’ or ‘inter- and intra-generational equity,’ as is often the case in practice (White & Noble 2013). Simple and pragmatic criteria and/or indicators are required to follow up on and evaluate the outcomes of SEA (Hacking & Guthrie 2006; White & Noble 2012); those identified by our panel of reviewers (Table 2) and by Chanchitpricha and Bond (2013) may serve as a foundation for the development of more context-specific metrics for SEA evaluation. We acknowledge that developing such criteria and/or indicators and attributing such matters as learning or increased efficiencies in ‘next level’ decision-making to SEA is not a simple task; however, it is increasingly important to demonstrate the added value of SEA beyond the application of the process itself (SEPA 2011) and beyond the scope of the particular PPP at hand.

Finally, achieving such direction and focus on advancing the impact of SEA requires change within those organizations responsible for SEA – including those responsible for SEA implementation and for following up on SEA applications. Tetlow and Hanusch (2012) report that a lengthy time period is often required to realize the influence of an assessment process on actual environmental outcomes; thus, organizations may not see the value of SEA practices in the short term. At the same time, a focus on long-term and indirect impacts is not necessarily consistent with the demand for rapid results in an era of increased streamlining of assessment processes. Government agencies often do not get the necessary support to effectively implement SEA and to follow up on its impact, yet they are constantly subjected to evaluations that indicate poor performance. Outputs need sufficient time to translate into outcomes, which requires both resources and a long-term commitment. Our study participants reported that SEA helped to identify the potential impacts of PPPs and resulted in strategies for managing impacts, but were neutral as to whether SEA led to improved environmental or socioeconomic conditions. Either there had been insufficient time to realize the benefits, consistent with the results of a recent review of SEA effectiveness in Ireland (EPA 2012), or there was no means to assess these longer-term outcomes due to a lack of commitment to ensuring implementation and following through on SEA post-PPP decisions.

SEPA (2011) found that those authorities that have built SEA into their corporate culture have tended to benefit the most from SEA. Institutional capacity and supportive learning environments play a major role in ensuring the immediate and longer-term impact of SEA. By institutional capacity, we mean social capital and an enabling environment, as well as the culture, values and power relations within an agency or organization (Segnestam et al. 2003). Capacity building, training of SEA experts and increased resource investment, as identified by our study participants, are important but insufficient solutions to ensure that the impacts of SEA are fully recognized. Notwithstanding trained practitioners, achieving long-term and indirect impacts from SEA, including changes in agency or organizational practices, requires a long-term commitment to SEA and an institutional environment that supports innovation and learning.

Conclusion

A fundamental challenge to ensuring the impact of SEA is that it is still often perceived, and approached, as EIA applied to PPPs. As a result, there is an expectation that SEA's impacts or value added will be immediate when, in reality, its impacts are often subtle, indirect and unfold over the long term. Evaluating the effectiveness of SEA, beyond basic input or procedural characteristics, is thus a complex task. It is conceivable that an SEA implemented in full compliance with a policy or directive, and adopting a sound methodological process, can have little to no impact. At the same time, it is conceivable that an SEA that does not adhere to ‘best practice’ can have a significant impact. Further, the expectations of what SEA is and can and should deliver vary considerably. That said, these are insufficient reasons for not focusing more attention on the impacts of SEA on PPPs and, more importantly, on the longer-term, often indirect, impacts and influences of SEA.

The notion of effectiveness as an absolute measure may be untenable (Cashmore et al. 2008). A challenge to effectiveness studies, and a limitation to this research, is that one of the few ways to measure indirect effectiveness is based on the experience of those involved in the impact assessment process. As a result, evaluations of indirect effectiveness are shaped not only by the experiences of those involved, but also by the different interpretations and understandings of what constitutes effectiveness. Not all criteria identified in this study may be equally applicable to all SEA applications, and there is a diversity of effectiveness criteria for impact assessment (see, e.g., Chanchitpricha & Bond 2013).

We agree with Stoeglehner et al. (2009) and Tetlow and Hanusch (2012) that it is necessary to reconceptualize how we approach effectiveness in SEA, and suggest the need for effectiveness studies across different SEA contexts in order to more fully understand the factors that enable or constrain effective SEA. Advancing SEA requires focusing not only on improving its procedural aspects, such as participation or alternatives assessment, but also on developing the necessary institutional and learning environments to ensure that SEA can, and does, result not only in direct impacts on the PPP in question but also in longer-term value added.

Acknowledgments

We wish to acknowledge the contributions of our study participants and the constructive feedback of three anonymous peer reviewers.

Additional information

Funding

This research was funded in part by the Social Sciences and Humanities Research Council of Canada.

Notes

1. This research was completed following the introduction of the new Canadian Environmental Assessment Act 2012. Most of the SEA cases referred to by study participants were completed prior to the new Act. However, SEA in Canada is conducted federally under a Cabinet Directive, separate from the federal environmental assessment act, and provincially under various ministerial orders or commissioned studies. As such, recent legislative changes to the federal environmental assessment act do not affect the results of this research.

References

  • Arbter K. 2003. SEA and SIA – two participative assessment tools for sustainability. Conference proceedings of the EASY ECO 2 Conference; May 15–17; Vienna. p. 175–81.
  • Bina O, Jing W, Brown L, Partidário MR. 2011. An inquiry into the concept of SEA effectiveness: towards criteria for Chinese practice. Environ Impact Assess Rev. 31:572–81.
  • Bregha F. 2011. How Ottawa spends 2011–2012: trimming fat or slick pork? Montreal, QC: McGill-Queen's University Press. Chapter 7, Time to get serious about strategic environmental assessment of federal government policies and plans; p. 144–62.
  • Brown AL, Thérivel R. 2000. Effective methodologies: principles to guide the development of strategic environmental assessment methodology. Impact Assess Proj Appraisal. 18(3):183–9.
  • Bryman A. 2012. Social research methods. 4th ed. New York, NY: Oxford University Press.
  • Cashmore M, Bond A, Sadler B. 2008. Introduction: the effectiveness of impact assessment instruments. Impact Assess Proj Appraisal. 27(2):91–3.
  • [CCME] Canadian Council of Ministers of the Environment. 2009. Regional strategic environmental assessment in Canada: principles and guidance. Winnipeg, MB: Canadian Council of Ministers of the Environment.
  • [CESD] Commissioner of the Environment and Sustainable Development. 2008. March status report. Ottawa: Office of the Auditor General. Available from: http://www.oag-bvg.gc.ca/.
  • Chanchitpricha C, Bond A. 2013. Conceptualizing the effectiveness of impact assessment processes. Environ Impact Assess Rev. 43:65–72.
  • Cherp A, Partidário MR, Arts J. 2008. Handbook of strategic environmental assessment. London: Earthscan. Chapter 32, From formulation to implementation: strengthening SEA through follow-up; p. 515–34.
  • De Montis A, Ledda A, Caschili S, Ganciu A, Barra M. 2014. SEA effectiveness for landscape and master planning: an investigation in Sardinia. Environ Impact Assess Rev. 47:1–13.
  • [EPA] Environmental Protection Agency. 2012. Review of effectiveness of SEA in Ireland: key findings and recommendations. Wexford: Environmental Protection Agency.
  • Fischer TB. 1999. Benefits arising from SEA application: a comparative review of North West England, Noord-Holland, and Brandenburg-Berlin. Environ Impact Assess Rev. 19:143–73.
  • Fischer TB, Gazzola P. 2006. SEA effectiveness criteria – equally valid in all countries? The case of Italy. Environ Impact Assess Rev. 23:155–70.
  • Gachechiladze M, Noble BF, Bitter BW. 2009. Following-up in strategic environmental assessment: a case study of 20-year forest management planning in Saskatchewan, Canada. Impact Assess Proj Appraisal. 27(1):45–56.
  • Gibson R, Benevides H, Meinhard D, Kirchhoff D. 2010. Strengthening strategic environmental assessment in Canada: an evaluation of three basic options. J Environ Law Pract. 20(3):175–211.
  • Hacking T, Guthrie P. 2006. Sustainable development objectives in impact assessment: why are they needed and where do they come from? J Environ Assess Policy Manage. 8:341–71.
  • Harris-Roxas B, Harris E. 2013. The impact and effectiveness of health impact assessment: a conceptual framework. Environ Impact Assess Rev. 42:51–9.
  • Heinma K, Põder T. 2010. Effectiveness of environmental impact assessment system in Estonia. Environ Impact Assess Rev. 30(4):272–7.
  • Jha-Thakur U, Gazzola P, Peel D, Fischer T, Kidd S. 2009. Effectiveness of strategic environmental assessment – the significance of learning. Impact Assess Proj Appraisal. 27(2):133–44.
  • Kauppinen T, Nelimarkka K, Perttilä K. 2006. The effectiveness of human health impact assessment in the Finnish Health Cities Network. Public Health. 120:1033–41.
  • Kornov L, Thissen WAH. 2000. Rationality in decision- and policy-making: implications for strategic environmental assessment. Impact Assess Proj Appraisal. 18(3):191–200.
  • Lawrence DP. 1997. Quality and effectiveness of environmental impact assessments: lessons and insights from ten assessments in Canada. Impact Assess Proj Appraisal. 12(4):219–32.
  • Noble BF. 2009. Promise and dismay: the state of strategic environmental assessment systems and practices in Canada. Environ Impact Assess Rev. 29(1):66–75.
  • Noble BF. 2013. Review of strategic environmental assessment practices: including lessons learned and opportunities for advancement. Hull, QC: Environment Canada.
  • Noble BF, Ketilson S, Aitken A, Poelzer G. 2013. Strategic environmental assessment opportunities and risks for Arctic offshore energy planning and development. Mar Policy. 39:296–302.
  • O'Faircheallaigh C. 2009. Effectiveness in social impact assessment: Aboriginal peoples and resource development in Australia. Impact Assess Proj Appraisal. 27:95–110.
  • Owens S, Cowell R. 2006. Governing space: planning reform and the politics of sustainability. Environ Plann C Gov Policy. 24:403–21.
  • Partidário MR. 2000. Elements of an SEA framework – improving the added-value of SEA. Environ Impact Assess Rev. 20:647–63.
  • Partidário MR. 2012. Strategic environmental assessment better practice guide: methodological guidance for strategic thinking in SEA. Lisbon: Portuguese Environment Agency and Redes Energéticas Nacionais.
  • Retief F. 2007. A performance evaluation of strategic environmental assessment (SEA) processes within the South African context. Environ Impact Assess Rev. 27:84–100.
  • Runhaar H, Driessen JP. 2007. What makes strategic environmental assessment successful environmental assessment? The role of context in the contribution of SEA to decision-making. Impact Assess Proj Appraisal. 25(1):2–14.
  • Segnestam L, Persson A, Nilsson M, Arvidsson A, Ijjasz E. 2003. Country-level environmental analysis: a review of international experience. Environment Strategy Papers No. 8. Washington, DC: The International Bank for Reconstruction and Development, The World Bank.
  • [SEPA] Scottish Environment Protection Agency. 2011. The Scottish strategic environmental assessment review. Stirling: The Scottish Environment Protection Agency.
  • Sinclair AJ, Diduck AP, Fitzpatrick PJ. 2008. Conceptualizing learning for sustainability through environmental assessment: critical reflections on 15 years of research. Environ Impact Assess Rev. 28(7):415–28.
  • Sinclair AJ, Simms L, Spaling H. 2009. Community-based approaches to strategic environmental assessment: lessons from Costa Rica. Environ Impact Assess Rev. 29(3):147–56.
  • Song G, Zhou L, Zhang L. 2011. Institutional design for environmental assessment on urban economic and social development planning in China. Environ Impact Assess Rev. 31:582–6.
  • Stoeglehner G, Brown AL, Kornov LB. 2009. SEA and planning: ‘ownership’ of strategic environmental assessment by the planners is the key to its effectiveness. Impact Assess Proj Appraisal. 27(2):111–20.
  • Tetlow MF, Hanusch M. 2012. Strategic environmental assessment: the state of the art. Impact Assess Proj Appraisal. 30(1):15–24.
  • Theophilou V, Bond A, Cashmore M. 2010. Application of SEA directive to EU structural funds: perspectives on effectiveness. Environ Impact Assess Rev. 30:136–44.
  • Thérivel R, Minas P. 2002. Measuring SEA effectiveness: ensuring effective sustainability appraisal. Impact Assess Proj Appraisal. 20(2):81–91.
  • Thissen R. 2000. Perspectives on strategic environmental assessment. New York, NY: Lewis. Chapter 8, Criteria for evaluation of SEA; p. 113–29.
  • Tukey JW. 1977. Exploratory data analysis. Reading, MA: Addison Wesley.
  • van Buuren VA, Nooteboom S. 2009. Evaluating strategic environmental assessment in the Netherlands: content, process and procedure as indissoluble criteria for effectiveness. Impact Assess Proj Appraisal. 27(2):145–54.
  • Verheem R, Dusik J. 2011. A hitchhiker's guide to SEA: are we on the same planet? Opening plenary. IAIA Special Conference on SEA; September 21–23; Prague.
  • Verheem R, Tonk J. 2000. Strategic environmental assessment: one concept, multiple forms. Impact Assess Proj Appraisal. 19(3):177–82.
  • Wang S, Liu J, Ren L, Zhang K, Wang R. 2009. The development and practices of strategic environmental assessment in Shandong Province, in China. Environ Impact Assess Rev. 29:408–20.
  • White L, Noble BF. 2012. Strategic environmental assessment in the electricity sector: an application to electricity supply planning, Saskatchewan, Canada. Impact Assess Proj Appraisal. 30(4):284–95.
  • White L, Noble BF. 2013. Strategic environmental assessment for sustainability: a review of a decade of academic research. Environ Impact Assess Rev. 42:60–6.
  • Wismar M, Blau J, Ernst K, Figueras J. 2007. The effectiveness of health impact assessment: scope and limitations of supporting decision-making in Europe. Copenhagen: World Health Organization.
  • Zhang J, Christensen P, Kornov L. 2013. Review of critical factors for SEA implementation. Environ Impact Assess Rev. 38:88–98.
  • Zhou K, Sheate WR. 2011. Case studies: application of SEA in provincial expressway infrastructure network planning in China – current existing problems. Environ Impact Assess Rev. 31:521–37.
