
Survey of current methods and guidance for strategic environmental assessment

Pages 139-147 | Received 02 Nov 2011, Accepted 11 Dec 2011, Published online: 16 Jul 2012

Abstract

This paper examines methods promoted and used in strategic environmental assessment (SEA) practice, practitioner choices about methodology and the nature of SEA guidance. Results show that SEA is not challenged by a lack of methods, but that the range of methods promoted and used is restrictive. A major challenge to practice is making the ‘right choices’ about methods and methodology. Much SEA guidance focuses on flexibility in SEA, providing high-level principles, and is too generic to facilitate such choices. It is assumed that there is sufficient expertise amongst SEA practitioners, and that practitioners will simply know what methods and methodologies are best. Our results indicate that more detailed operational guidance is needed at the practitioner level on how to make sound methodological choices and how to select the best available methods for the SEA tier and context at hand.

Introduction

It has been more than 20 years since Bridgewater's (1989) observation that strategic environmental assessment (SEA) methodology had not been adequately developed, and his argument for an appropriate SEA framework and a set of guiding principles and tested methods. The academic community responded, and throughout the 1990s and early 2000s a considerable body of SEA literature dedicated to methodology emerged. Some suggested that SEA simply required the extension of project environmental impact assessment (EIA) ‘upstream’ to policies, plans and programs (PPPs; e.g. CEARC 1990, UNECE 1992, Wood 1995); others argued that SEA needed to break away from EIA (e.g. Thérivel et al. 1992, Bailey and Renton 1997) and required its own methodological framework, since ‘grafting SEA onto existing PPP formulation procedures will not be achieved by attempting to translate existing project-based EIA … upstream’ (Brown and Thérivel 2000, p. 186). Some consensus emerged that SEA worked best when integrated with PPP formulation (e.g. Fischer 1999, Devuyst et al. 2000, Noble 2000), but it was also pointed out that, first, ‘SEA must learn how policy making works’ (Nitz and Brown 2001, p. 329). Underlying these discussions was considerable debate on how much structure vs how much flexibility there should be in SEA (see Partidário 2000, Wiseman 2000, Nilsson and Dalkman 2001, Nitz and Brown 2001, Noble and Storey 2001, Fischer 2003), with some lobbying for a qualitative and flexible process and others promoting structured methodological frameworks and supporting quantitative methods.

In recent years, aside from country reports on developing SEA systems (e.g. Liou et al. 2006, Zhu et al. 2011), sector-specific applications (e.g. Finnveden et al. 2003, Björklund 2012) and studies of individual SEA components, such as follow-up (e.g. Nilsson et al. 2009), participation (e.g. Sinclair et al. 2009) and cumulative effects (e.g. Harriman Gunn and Noble 2009), discussions about SEA methodology have been less prominent in the academic literature. It could be argued that the methodology debate is settled; that the methods to support SEA, and practitioner knowledge of how to use them, have matured. It could be that the SEA community is content with the characterization of SEA as ‘one concept, multiple forms’ (Verheem and Tonk 2000), and that practitioners are skilled enough to use what works best from project-EIA and to borrow or adapt other methods and frameworks as needed from planning or policy appraisal (Thérivel et al. 2004).

Interestingly, Liou et al. (2006) identify unfamiliarity with SEA procedures and methodologies and the lack of clear guidelines as amongst the main challenges encountered during SEA implementation. They suggest that ‘the concept of SEA techniques and methodologies … is rather fuzzy’. In the South African context, Retief (2007, p. 86) cautions that ‘some have perceived being flexible and adaptable as synonymous with being vague and confusing’. In the Canadian context, Noble (2009) reports that SEA has experienced considerable growth but that practice remains ad hoc, methodologically diverse and overall inconsistent. Wallgren et al. (2011, p. 230) report that, although frameworks and legislations exist for SEA, implementation ‘has been difficult and slow in many countries’.

We agree with Thérivel and Partidário's (1996) and Partidário's (2000) early observations that there is no single best SEA methodology for all PPPs, and that specific design for specific application is necessary (Verheem and Tonk 2000). Because of the multivariate nature of SEA, it is important that practitioners have sufficient and sound guidance to make important methodological choices about the best SEA design and set of supporting methods. We hypothesize that the guidance needed at the practitioner level is beyond academic ‘best-practice principles’ and generic SEA frameworks. However, notwithstanding the growing volume of SEA guidance documents, there has been limited attention to the efficacy of current guidance in helping practitioners determine the best SEA design and methods for their specific application.

In this paper we survey the current state of SEA guidance and supporting methods. Specifically, we examine the nature and range of methods commonly promoted and used in SEA practice; practitioner choices about SEA methodology and supporting methods; and the state of methodological guidance and support for making such choices. In the sections that follow we report on the results of a review of select SEA guidance documents, a survey of methods used in practice, and the results of semi-structured interviews with SEA practitioners about available methods, guidance and challenges to implementation. We then venture a number of observations concerning SEA guidance and methods support, and the implications for ensuring the efficacy of SEA practice.

Study methods

Our study consisted of three parts: a review of SEA guidance; a survey of practitioners and completed SEA statements; and semi-structured interviews with practitioners. The sample of guidance, practitioners and SEA statements was not meant to be representative of all SEA systems; the intention was to provide a ‘snapshot’ of available guidance, methods and what practitioners identify as the key challenges to application. Our review of SEA guidance focused on international and Canadian guidance, including: the UNECE (2007) resource manual on SEA; guidance documents and toolkits developed to support the European Directive 2001/42/EC (e.g. ODPM 2005, Sommer 2005, Scottish Executive 2006); OECD DAC (2006) guidance; methodological guidance for SEA application to spatial planning in Portugal (Partidário 2003); South African guidance (DEAT 2004); the World Bank (2009) SEA Toolkit; and Canadian guidance on SEA (Croal et al. 2010), including guidance on implementing the federal SEA directive (e.g. Industry Canada 2007, Transport Canada 2000, INAC 2008, Privy Council Office and CEAA 2010).

The survey and interviews were administered to a sample of 34 Canadian and 11 international practitioners. ‘Practitioners’ was broadly defined to include consultants who carry out SEAs and government administrators who either carry out or manage SEA processes. Participants were sent the survey, which consisted of a matrix of the stages of an SEA process, in advance of the interview, and were asked to identify: (a) methods used in practice in their jurisdiction, department or agency; and (b) the stages of the SEA process at which those methods are most commonly used. ‘Methods’ was broadly defined to include approaches to data/information collection, analysis, organization and reporting. For consistency, we separated the SEA process into phases, from ‘conducting a preliminary scan’ to ‘follow-up and monitoring’, based on the Canadian SEA directive (Privy Council Office and CEAA 2010). Existing SEA guidance documents, together with methods identified in the academic literature, were used to develop an initial categorization of methods. Semi-structured interviews focused on the efficacy of the methods identified; the availability of methods to support SEA; the quality and availability of SEA guidance; and major challenges to practice. There was no attempt to evaluate participants' level of expertise in SEA, or their overall attitude toward SEA. The results of our survey are thus dependent, in part, on the experience and expertise of the participants.

To complement the survey and interviews, a total of 14 impact statements or SEA reports from Canadian and international SEA and SEA-type applications were also reviewed (Table 1). A method was identified as being used in the SEA if it was explicitly identified or demonstrated in the assessment report. A select few SEA cases were recommended by participants, but the majority were identified from the grey literature, on government websites and based on what was accessible. The cases provide a snapshot of practice and capture a range of PPPs and sectors, from national environmental appraisals to regional energy system planning. Not all SEAs were carried out under a directive; some were informal SEAs conducted using SEA-like methodologies or guiding principles.

Table 1 SEA and SEA-like reports reviewed.

Results

Guidance on SEA methods

Our review of SEA guidance revealed four key findings. First, we found that some SEA guidance documents focused almost exclusively on compliance – providing standardized, questionnaire-type approaches to ensure alignment with an overarching directive or mandate. This was the case with Canadian federal guidance under the SEA directive (see Privy Council Office and CEAA 2010). For example, the stated purposes of the guidance document are to provide the decision-making context for SEA and its link to federal sustainable development goals; to outline the obligations of federal departments in applying SEA; and to provide advice on implementing the directive. The guidance indicates the need to conduct a preliminary scan, to address the scope and nature of potential effects and to determine mitigation needs and any follow-up requirements, and presents the expectations of federal departments and agencies in terms of SEA reporting. There is no methodological guidance for the most appropriate SEA design or for selecting the best methods; rather, ‘departments and agencies are encouraged to develop their own sources of information and analytical tools’ (Privy Council Office and CEAA 2010, p. 9). Guidance developed by Canadian federal departments and agencies was also largely focused on how to ensure compliance with the directive, with much less attention to methodology and supporting methods. INAC (2008) guidance, for example, includes flow charts that identify the stages of the federal SEA process and agency roles and responsibilities, but provides no direction as to what methods can or should be used. Several guiding principles are suggested, such as the need to ‘provide a high level description of the potential for the proposal to interact with the environment and cause environmental effects’ (p. 13), and to ask ‘how does the outcome interact with the environment?’ (p. 16), but the guidance is focused largely on protocol and SEA report documentation.

Second, in those cases where guidance did focus on methodology and ‘good practice’, the majority provided superficial treatment of methods. The Scottish Executive's (2006) SEA toolkit provides a guide to methodology, with sections dedicated to, among other things, assessing cumulative effects. Illustrative checklists are provided for comparing alternatives and evaluating impacts, and methods are listed for assessing cumulative effects; however, guidance on the most appropriate methodological design and methods selection for the specific tier of SEA application is limited. Section 6.3.17 of the guidance simply notes that ‘the method used to undertake the assessment will vary …, depending upon its scale, nature, content and place in the PPS [plans, programmes, strategies] hierarchy’ (Scottish Executive 2006, p. 16). The OECD DAC (2006, pp. 32–33) guidance similarly describes SEA as ‘a family of approaches using a variety of tools’, emphasizing that ‘the availability of data, level of definition of the PPP, knowledge of direct and indirect impacts, and available time frame for the SEA’ will help determine the methods used. Examples of available methods are provided, and the guidance reports that ‘where SEA is applied to plans and programmes, a more structured approach’ can be used, characteristic of EIA-type applications, but ‘in policy making, usually this will not be possible because of the complex, non-linear character of this process’ (OECD DAC 2006, p. 50). Identification of the most suitable methods for each type of application is left to the knowledge of the practitioner. We found Croal et al.'s (2010) decision-maker's tool for sustainability-centred SEA to be amongst the most comprehensive in terms of how to incorporate sustainability principles in SEA, providing a series of questions to guide a practitioner through a ‘typical’ SEA. The guidance indicates that ‘SEA should lead to a recommendation to the decision maker, formulated on the best available information and methods of analysis’ (Croal et al. 2010, p. 9), but guidance on how to identify the best available methods was outside the scope of the tool.

Third, we found limited practical guidance as to how a single process is to be adapted to the diverse range of applications to which SEA can and should be applied, and how to make informed choices about the best methods. Notwithstanding the recognized difference between SEA applied to lower-level programmes vs higher-level policies, guidance documents that identified SEA methods rarely differentiated between tiers or provided guidance on methods that may be more or less suitable at each tier. Guidance predominantly focused on methods for organizing or presenting information, borrowed largely from project-based EIA, including checklists, matrices and expert judgment. These methods were typically presented as being sufficiently flexible across all types of SEA. ODPM (2005) guidance does make reference to what may be considered ‘higher-level’ evaluation and appraisal methods, including scenario analysis, multi-criteria evaluation and compatibility assessment; however, there is little guidance on when such methods are most appropriate. It simply notes that ‘the methods outlined … can be regarded as tools and techniques to be used to meet the requirements of the Directive. In practice, the Responsible Authority may find it appropriate to vary its approach, for instance, in combining qualitative and quantitative assessment’ (ODPM 2005, p. 23).

Fourth, we found significantly more attention and weight given to qualitative methods, particularly expert judgment, than to quantitative ones. Industry Canada (2007, p. 2) guidance, for example, provides a questionnaire-based approach to SEA and notes that the assessor should ‘focus on identifying strategic considerations at a high level, rather than detailed quantitative analysis’. Sommer's (2005) guidance for SEA under Directive 2001/42/EC similarly proposes, for each step of the SEA process, a ‘basic set of tools … which essentially are check-lists and … rules designed to facilitate the use of the check-lists’ (p. 27). Expert judgment is identified as ‘an essential instrument for assessing environmental effects in strategic environmental assessments’ (p. 60). The guidance explicitly dismisses quantitative methods, reporting that ‘mathematical methods hardly make sense in the assessment of environmental effects … Exact numeric specifications involving detailed mathematical models … are, therefore, impossible for this application owing to the PPs’ fuzziness' (Sommer 2005, p. 60). It further notes that quantitative methods ‘would lead to “fictitious precision” reducing such approaches to absurdity.’

Methods used in SEA practice

Our survey of SEA participants identified 18 different methods used in practice at various phases of the process (Table 2). There were no differences in the methods identified, or in their reported use, between Canadian and international study participants. Various forms of expert judgment, public consultation and ad hoc methods or lessons from elsewhere were identified as the most commonly used throughout the SEA process. For conducting a preliminary scan, ad hoc approaches, followed by consultation methods, expert judgment and case or literature reviews, were identified as the most commonly used. Expert judgment was identified as the most common method for all tasks involved in analysing the environmental effects of PPPs, for identifying alternatives and for evaluating and comparing alternatives.

Table 2 Methods identified by expert respondents as currently used in the practice of SEA.a

What survey participants told us (Table 2) was confirmed by our review of SEA documentation (Table 3). There were some minor deviations. For example, ad hoc methods or drawing on lessons from elsewhere were reported by practitioners to be used more for ‘identifying alternatives’, ‘evaluating alternatives’, ‘identifying public and stakeholder concerns’ and ‘follow-up and monitoring’ than we found documented in SEA reports. Also, trends analysis and extrapolation were reported by practitioners to be used more frequently in ‘characterizing effects’ and for ‘predicting futures or outcomes’ than we found documented in practice. However, additional methods may have been used during the SEA that were not reported in documentation, and any minor variations could be explained by the knowledge of the participants or the range of cases reviewed.

Table 3 Methods identified in SEA and SEA-type assessment documents.a

We also found that some methods were more versatile than others – they were used at many different stages of the assessment process. These were mostly qualitative methods, such as expert judgment, case reviews or consultation processes. Expert judgment and similar expert-based methods were evident across all cases examined and were the most common methods identified, regardless of the SEA tier or whether the SEA was appraisal-based or more technical in its approach. Relatively data-intensive methods, such as systems modelling, scenario analysis and spatial analytical methods, were limited in application and, when used, were typically found in SEAs tiered toward regional planning-type assessments. Such methods require data that are often either unavailable or not applicable at the broader policy level. These methods are also time- and resource-intensive in comparison to expert judgment.

Finally, there was variation in the range of methods used at different stages of the assessment process. For example, ‘conducting a preliminary scan’, ‘identifying public and stakeholder concerns’ and ‘follow-up and monitoring’ were the most restrictive phases of the SEA process in terms of the diversity of methods used. ‘Identifying potential effects’, ‘characterizing effects’ and ‘evaluating/comparing alternatives’ drew on the most diverse range of methods.

Practitioner perspectives on SEA guidance and methods support

Our interviews provided little evidence to suggest that SEA was constrained by a lack of methods. One participant noted that ‘most tools [already] exist to complete SEAs efficiently and with value added’ and that ‘the problem rests with lack of expertise as opposed to lack of methods or tools’. Another participant responded by saying that SEA is ‘not really constrained by a lack of tools – there are many to choose from appropriate to the proposal being considered’. However, participants did raise a number of concerns about the state of methodological guidance and support for SEA.

Knowing what methods to use and when

More than half of our participants noted that practitioners and regulators often struggle to know what methods can and should be applied in a strategic context, for what purposes they are best used, and at what tiers they are most appropriate. One participant explained that ‘because SEA is so vaguely defined … the concept of tools is not emerging in any systematic or organized way’. Another pointed to the diversity of SEA approaches currently being promoted, from highly structured quantitative frameworks for evaluating alternative options to loosely defined methods for the qualitative appraisal of a single pre-defined policy option. This participant went on to explain that, as a result, SEA is ‘constrained’ by a lack of understanding of which methods to apply, when to apply them, and what the overall objectives or contributions of applying those methods are to the SEA and decision process.

Other participants echoed this claim, noting that methods ‘are well known among some, though … most consulting firms and governmental departments are not even aware of what SEA is intended to accomplish’. Specifically, participants noted a lack of knowledge and understanding about ‘more strategic based methods and tools that will encourage dialogues, long-term analysis, working with uncertainty, communication, institutional analysis, learning processes, and strategic thinking’. A number of participants expressed the need for more guidance on how and when to use the methods that are available, and for ‘an overarching guidance book that has case studies incorporated into the document’ to demonstrate for practitioners what methods are most appropriate under different SEA contexts and scenarios. One interviewee from The Netherlands, however, said that methods guidance for SEA was not needed, since there was ‘a sufficiently large, highly skilled pool of SEA expertise within the country’.

Expected outcomes and applied methodological guidance

A second theme that emerged concerned the expected outcomes of SEA and guidance on how best to use SEA to achieve different intended objectives. Some, for example, viewed SEA as generating a report that is then used to inform decisions about a PPP – similar to EIA. Others described SEA as a process to ensure continual improvement of, and closely integrated with, PPP development and implementation. Participants agreed that SEA should offer some structure and consistency, yet considerable flexibility is needed to ensure that SEA is applicable to the PPP context at hand. The problem, explained one participant, is that notwithstanding directives and guidebooks for meeting the intention of those directives, SEA training and guidance materials tend to treat SEA simply ‘as a somewhat less detailed form of project assessment’.

This concern was echoed by another participant, who added that ‘education and awareness is poor’ and applied SEA methodology is ‘poorly explained or rationalized by current materials’. A third participant expressed a similar view, and said that the lack of good-practice guidance ‘is most evident in the emphasis on the SEA report as a stand-alone document’. The participant went on to explain that ‘the ideal SEA document reports on what was done to affect or change the policy, plan or program proposal’, but there exists limited practical guidance on how to ensure and achieve such a level of integration. Other participants described SEA methodology ‘as poorly understood or not understood at all’. One academic said that many of the difficulties can be tied to a lack of understanding of what SEA can and should achieve, explaining that ‘the responsibility or obligation to take a strategic approach to decision making is being delegated to the SEA process, when it should be part of the planning, program and policy development process to begin with’. In other words, SEA is being used too often ‘as a substitute for good strategic planning practice’.

Institutional requirements vs institutional support

The institutional context for SEA was frequently raised as having discernible effects on SEA methodology. Many participants felt that the institutional environment in which SEA operates creates either conducive or constraining conditions for its methodological development, guidance and support. Some participants indicated that the lack of a requirement that SEA be implemented was ‘constraining SEA’. On the other hand, an equal number of participants noted that legislation can be overly prescriptive and incompatible with the need for flexibility, and can lead to stifled creativity and marginal standards of performance. There was consensus that a constraint to ‘good’ SEA was the lack of institutional support or ‘the attitude of people involved, the absence of a strategic culture’. Well-developed legal and policy instruments, combined with the support of experienced and educated SEA consultants and administrators, were thought to be a key catalyst for good SEA methodological design. Participants from three different countries identified the lack of a supportive ‘strategic culture’ and ‘resistance toward SEA’ as major challenges to practice. Even in jurisdictions with directive-based SEA, participants identified a ‘lack of government support to apply and implement SEA’ and a ‘lack of clear guidance and government or ministry policy’.

One participant emphasized that the public and government want ‘to make a decision quickly and they become frustrated over what they perceive as a delay caused by SEA’. The participant went on to explain that, because of this, timely application and efficient methodologies to support SEA are necessary but, at the same time, there is also a lack of clear understanding amongst practitioners about newer, progressive and effective methods that reduce the fear that ‘conducting SEA takes too long … is expensive and results are uncertain’. It was further expressed that, once SEA is perceived by decision makers as valuable and necessary, ‘it becomes important to have a common understanding of the best methodology/ies to apply in various circumstances; only then one can benefit from guidance on the best tools to support those methodologies’.

Discussion

There is no shortage of methods that could be used in SEA and ‘there are no really specific SEA methods or techniques’ (Partidário n.d., p. 42). Bao et al. (2004) describe a combination of expert knowledge and practical experience from multiple disciplines as the current trend in SEA application. In our study, we found the range of methods used in practice and recommended in SEA guidance to be relatively restrictive, if not lacking in innovation, and often limited to a number of common, qualitative-based methods. More analytical ‘impact assessment’ methods, including quantitative approaches, were generally limited both in their use and in their promotion in SEA guidance. We agree that qualitative approaches are sometimes the most appropriate and that, particularly when decisions need to be made quickly, SEA cannot always be as ‘scientific’ as may be desired or even necessary (see Thérivel 2004). However, qualitative and EIA-based methods are ‘over-promoted’ in SEA guidance, and perhaps over-used in practice. There are instances where more quantitative methods are more fitting or where more rigorous methods are warranted. This was confirmed by our interviewees, who reported criticisms of SEA by decision-makers for its uncertain results owing, in part, to the unverifiable nature of the methods used. We disagree with Sommer's (2005) view that quantitative methods are largely inappropriate owing to the fuzziness of PPPs. This is a restrictive view of SEA methods, and it dismisses the utility of emerging methods in SEA (e.g. fuzzy sets and scenario-modelling) that are designed specifically to address such fuzziness.

Second, we suggest that SEA guidance has been too passive in providing instruction on how to select the best SEA design and supporting methods at different tiers of assessment and at different stages of the process. The focus is often on procedure, protocol and meeting the requirements of a directive. This is not surprising, particularly in the Canadian context, given recent Auditor General reports on federal non-compliance with the SEA directive (see Auditor General of Canada 2004). Although compliance with SEA directives is essential, it is equally important for practitioners to understand what SEA design is most appropriate to meet their objectives and what methods are best to use, when and why. Most guidance indicates that SEA is a diverse tool and that the specific approach and methods depend on the tier of application and objectives at hand. This may seem intuitive to some – it has been a fairly consistent message in the academic literature. According to our interviewees, however, SEA design and, in particular, the choice of methods rest largely on the knowledge of the practitioner. This is worrisome in that the methods used have a significant bearing on the nature and quality of information made available to support decision-making, and practitioners reported that they are often not aware of what methods are best used in different SEA contexts. Knowing when and how to use what methods is of considerable importance in ensuring the reliability and efficacy of SEA (see Finnveden et al. 2003). Our findings mirror those of Liou et al. (2006), who identified unfamiliarity with SEA procedures and methodologies and a lack of clear guidelines as major impediments to SEA. The SEA community has done a relatively good job of emphasizing that practitioners must design SEA that is fit for purpose and select the methods most appropriate to the context at hand, but it has done a poor job of providing practical guidance on how to do so.

Third, it is well argued that SEA must be flexible, such that specific design can be applied to specific applications. Devuyst et al. (2000), for example, noted that the most important feature of a proposed SEA system in Flanders was its flexibility, and that SEA could ‘be adapted to either the subject that is being examined or the experience of those who will have to do the SEA’. The problem, however, is that the aim to ensure flexibility is resulting in guidance that is simply too generic, leading to criticisms of SEA as ad hoc, vague or inconsistent. Perhaps it is too complex a task to develop detailed guidance with fit-for-purpose tools given the flexible nature of SEA, and the solution is for practitioners to simply ‘select an SEA approach and associated tools to suit the particular decision-making context’ (OECD DAC 2006, p. 33). Croal et al. (2010, p. 5), for example, indicate that ‘even in the best of circumstances the process described here [the SEA decision makers tool] will need to be adjusted and elaborated in various ways to be suitably iterative and sufficiently flexible to accommodate the messy complexities of actual SEA’. However, our interviewees indicated that there is too often limited methodological guidance, and that much of what is available is too generic for the specific applications needed, treating SEA simply as a less detailed and less structured form of assessment. There has been some, but insufficient, progress since Audouin's (1999) observation that one of the major difficulties in the development of SEA guidelines is the facilitation of context-specific methodologies. Fit-for-purpose guidance, at the level of and context in which SEA is operationalized, will be much more useful than relying solely on high-level ‘guiding principles’ with no concrete direction for the practitioner. Fischer and Gazzola (2006, p. 407) note that, although guiding principles ‘can successfully help practitioners, they need to be tailored to the specific system of application’. There is a need for more systematic methodologies with guidance on methods selection at different SEA tiers and in different contexts, perhaps even sector-based guidance, along with practical tools, models and examples demonstrating how to incorporate context-specific issues in SEA and how SEA is to be integrated with PPP development – as opposed to simply lobbying that it be done. As stated by Fischer (2003, p. 156), ‘a reminder is needed that SEA is an applied instrument’.

Finally, we suggest that the institutional environment and community of decision-makers must bear some of the responsibility for the current state of SEA guidance and performance. The Auditor General's fourth review of SEA practice in Canada reported that the SEA directive has yet to be consistently applied across federal departments and agencies, and that SEA has not been undertaken for some proposals where significant environmental effects could result (Auditor General of Canada 2008). The Auditor's report also identified a general lack of leadership, support and accountability in SEA, and serious gaps regarding the determination of who is responsible for implementation. Even the most effective SEA methodology and set of methods is of little value if the institutional support for implementation is lacking. As suggested by Gibson et al. (2010), what is required is a core process and substantive requirements for SEA that are set in legislation, combined with more flexible policy requirements and expectations that are detailed in supporting guidance.

Conclusion

This paper set out to examine the current state of guidance for SEA and supporting methods used in SEA application. SEA is not challenged by a lack of methods, but the range of methods promoted in SEA guidance and used in practice is more restrictive and less creative than the range of methods available. This may be a reflection of the skill sets of practitioners, or of the time and resource constraints under which SEA operates; regardless, making the ‘right choices’ about methods and methodological design is a significant challenge to practice. Nearly two decades ago Thérivel (1993) noted that one of the main difficulties experienced in relation to the adoption and operationalization of SEA was the lack of methodologies that specifically addressed SEA requirements. Notwithstanding considerable advances in SEA methodology, the practitioner community still struggles with the ‘many forms’ of SEA when making choices about SEA design and methods selection. There are several guidance documents available that provide ‘best practice principles’. However, it is guidance on methodology that is lacking – how to make ‘good’ methodological choices in SEA, and how to identify methods best suited to the nature of the strategic initiative being assessed. Methodology construction is more than assembling a handful of ‘common methods’; it requires awareness of and connection to strategic purpose and the desired outcomes of a strategic assessment process. Much of the SEA guidance that does exist is too generic to facilitate the ‘right’ SEA design choices.

In conclusion, flexibility in SEA has become one of its defining principles, if not strengths, as a higher-tiered PPP assessment and decision-support process, but flexibility also poses significant challenges to the practitioner. More attention needs to be given to operationalizing SEA principles, developing decision rules for methodology construction, and providing operational guidance for selecting the best available methods most suited to the PPP context at hand. This will not be achieved through high-level, principles-based guidance documents (important though they are) that focus on generic processes and compliance with directives and that leave the selection of appropriate SEA design solely to the discretion of the practitioner. There is an implicit assumption that practitioners will know what methodologies and methods are best to use in SEA, for different types of SEA application. This assumption is at odds with the results of our study. Practitioners must have sufficient and sound guidance to make important SEA methodological choices – choices about the best SEA design and the best set of supporting methods, given the range of options to choose from.

Acknowledgements

This paper is based in part on work completed for the Canadian Environmental Assessment Agency. We wish to thank the Agency for supporting this work, the study participants for their time and insight, and three anonymous reviewers for their feedback.

References

  • Atkins Ltd, 2011. Local transport plan 3 strategic environmental assessment. Prepared for the Derby City Council, UK [online]. Available from: http://www.derby.gov.uk/TransportStreets/TransportPlanning/
  • Atomic Energy of Canada Limited, 1994. Environmental impact statement on the concept for disposal of Canada's nuclear fuel waste. Report AECL-10711, COG-93-1.
  • Auditor General of Canada, 2004. The environmental impact of policies, plans, and programs (Chapter 4). Report of the Commissioner of the Environment and Sustainable Development. Ottawa: Auditor General of Canada.
  • Auditor General of Canada, 2008. 2008 March status report of the Commissioner of the Environment and Sustainable Development, Chapter 9: Management tools and government commitments – strategic environmental assessment. Ottawa: Office of the Auditor General of Canada [online]. Available from: http://www.oag-bvg.gc.ca/internet/English/parl_cesd_200803_09_e_30135.html
  • Audouin, M., 1999. Strategic environmental assessment in South Africa: developing a common understanding. Paper presented at the Annual Meeting of the International Association for Impact Assessment, Glasgow, Scotland.
  • Bailey, J. and Renton, S., 1997. Redesigning EIA to fit the future: SEA and the policy process. Impact Assessment and Project Appraisal, 15(3), 319–334.
  • Bao, C.-K., Lu, Y.-S. and Shang, J.-C., 2004. Framework and operations procedure for implementing strategic environmental assessment in China. Environmental Impact Assessment Review, 24, 27–46.
  • Björklund, A., 2012. Life cycle assessment as an analytical tool in strategic environmental assessment: lessons learned from a case study on municipal energy planning in Sweden. Environmental Impact Assessment Review, 32(1), 82–87.
  • BMT Cordah Limited, 2003. Offshore wind energy generation: phase 1 proposals and environmental report for consideration by the Department of Trade and Industry [online]. Available from: http://www.offshore-sea.org.uk/site/
  • Bridgewater, G., 1989. EIA of policies in Canada: a beginning. Ottawa: CEARC.
  • Brown, A.L. and Thérivel, R., 2000. Principles to guide the development of strategic environmental assessment. Impact Assessment and Project Appraisal, 18, 183–189.
  • Canada–Newfoundland Offshore Petroleum Board, 2007. Strategic environmental assessment: Sydney Basin offshore area. St John's: CNLOPB [online]. Available from: http://www.cnlopb.nl.ca/env_strategic.shtml
  • Canada–Newfoundland Offshore Petroleum Board, 2008. Strategic environmental assessment: Labrador Shelf offshore area. St John's: CNLOPB [online]. Available from: http://www.cnlopb.nl.ca/env_strategic.shtml
  • CEARC (Canadian Environmental Assessment Research Council), 1990. The environmental assessment process for policy and program proposals. Ottawa: Minister of Supply and Services Canada.
  • Croal, P., et al., 2010. A decision maker's tool for sustainability-centred strategic environmental assessment. Journal of Environmental Assessment Policy and Management, 12(1), 1–27.
  • DEAT, 2004. Strategic environmental assessment, integrating environmental management. Information Series 10. Pretoria, South Africa: Department of Environmental Affairs and Tourism.
  • Department of Trade and Industry, 2005. Strategic environmental assessment of draft plan for a 24th seaward round of offshore oil and gas licensing. London: Department of Energy and Climate Change [online]. Available from: http://www.offshore-sea.org.uk/site/index.php
  • Devuyst, D., van Wijngaarden, T. and Hens, L., 2000. Implementation of SEA in Flanders: attitudes of key stakeholders and a user-friendly methodology. Environmental Impact Assessment Review, 20, 65–83.
  • Finnveden, G., et al., 2003. Strategic environmental assessment methodologies: applications within the energy sector. Environmental Impact Assessment Review, 23, 91–123.
  • Fischer, T.B., 1999. The consideration of sustainability aspects in transport infrastructure related policies, plans and programmes: a comparative analysis of North West England, Noord-Holland and Brandenburg-Berlin. Journal of Environmental Planning and Management, 42(2), 189–219.
  • Fischer, T.B., 2003. Strategic environmental assessment in post-modern times. Environmental Impact Assessment Review, 23, 155–170.
  • Fischer, T.B. and Gazzola, P., 2006. SEA effectiveness criteria – equally valid in all countries? The case of Italy. Environmental Impact Assessment Review, 26, 396–409.
  • Gibson, R.B., et al., 2010. Strengthening strategic environmental assessment in Canada: an evaluation of three basic options. Journal of Environmental Law and Practice, 20(3), 175–211.
  • Harriman Gunn, J. and Noble, B.F., 2009. Integrating cumulative effects in regional strategic environmental assessment frameworks: lessons from practice. Journal of Environmental Assessment Policy and Management, 11(3), 267–290.
  • INAC, 2008. Strategic environmental assessment: Indian and Northern Affairs Canada's response to the cabinet directive on the environmental assessment of policy, plan and program proposals. DG Strategic Environmental Assessment Working Group, Policy and Strategic Direction. Hull, Quebec: Indian and Northern Affairs Canada.
  • Industry Canada, 2007. Industry Canada guidance document for conducting strategic environmental assessments (SEAs). Ottawa: Industry Canada [online]. Available from: http://www.ic.gc.ca/eic/site/sea-ees.nsf/eng/h_ey00008.html
  • Land Use Consultants, Collingwood Environmental Planning, and Levett-Therivel Sustainability Consultants, 2006. The draft regional spatial strategy for the southwest 2006–2026: strategic sustainability assessment. Prepared for the Southwest Regional Assembly [online]. Available from: http://www.southwest-ra.gov.uk/
  • Liou, M.-L., Yeh, S.-C. and Yu, Y.-H., 2006. Reconstruction and systemization of the methodologies for strategic environmental assessment in Taiwan. Environmental Impact Assessment Review, 26, 170–184.
  • Nilsson, M. and Dalkman, H., 2001. Decision making and strategic environmental assessment. Journal of Environmental Assessment Policy and Management, 3(3), 305–327.
  • Nilsson, M., et al., 2009. Analytical framework and tool kit for SEA follow-up. Environmental Impact Assessment Review, 29, 186–199.
  • Nitz, T. and Brown, A.L., 2001. SEA must learn how policy making works. Journal of Environmental Assessment Policy and Management, 3(3), 329–342.
  • Noble, B.F., 2000. Strategic environmental assessment: what is it, and what makes it strategic? Journal of Environmental Assessment Policy and Management, 2(2), 203–224.
  • Noble, B.F., 2009. Promise and dismay: the state of strategic environmental assessment systems and practices in Canada. Environmental Impact Assessment Review, 29, 66–75.
  • Noble, B. and Storey, K., 2001. Towards a structured approach to strategic environmental assessment. Journal of Environmental Assessment Policy and Management, 3(4), 483–508.
  • ODPM, 2005. A draft practical guide to the Strategic Environmental Assessment Directive: proposals by ODPM, the Scottish Executive, the Welsh Assembly Government and the Northern Ireland Department of the Environment for practical guidance on applying European Directive 2001/42/EC. London: Office of the Deputy Prime Minister.
  • OECD DAC, 2006. Applying strategic environmental assessment: good practice guidance for development cooperation. Paris: Organization for Economic Co-operation and Development, Development Assistance Committee.
  • Offshore Energy Environmental Research Association, 2008. Fundy tidal energy strategic environmental assessment. Report prepared for the Nova Scotia Department of Energy, Halifax, Nova Scotia [online]. Available from: http://www.offshoreenergyresearch.ca/OEER/StrategicEnvironmentalAssessment/tabid/117/Default.aspx
  • Ontario Power Authority, 2007. Integrated power systems plan. Toronto: Ontario Power Authority [online]. Available from: http://www.powerauthority.on.ca/integrated-power-system-plan
  • Packman and Associates, 2005. Strategic environmental assessment: core area public programming and activities vision. Report prepared for the National Capital Commission [online]. Available from: http://www.capcan.ca/data/2/rec_docs/1467_sea_e.pdf
  • Partidário, M.R., n.d. Strategic environmental assessment (SEA): current practices, future demands, and capacity building needs. Course manual, IAIA Training Course [online]. Available from: http://www.iaia.org/publicdocuments/EIA/SEA/SEAManual.pdf
  • Partidário, M.R., 2000. Elements of an SEA framework – improving the added value of SEA. Environmental Impact Assessment Review, 20, 647–663.
  • Partidário, M.R., 2003. Strategic impact assessment for spatial planning: methodological guidance for application in Portugal. Portugal: Direction-General for Spatial Planning and Urban Development, Ministry of the Cities, Spatial Planning and the Environment.
  • Privy Council Office and CEAA, 2010. The Cabinet Directive on the environmental assessment of policy, plan and program proposals: guidelines for implementing the directive. Ottawa: Minister of Public Works and Government Services Canada.
  • Retief, F., 2007. A performance evaluation of strategic environmental assessment processes within the South African context. Environmental Impact Assessment Review, 27, 84–100.
  • Saskfor–MacMillan Limited Partnership, 1997. Twenty-year forest management plan and environmental impact statement for the Pasquia-Porcupine Forest Management Area: main document. Saskatchewan: Saskfor–MacMillan Limited Partnership.
  • Scientific Advisory Committee, 2007. The Great Sand Hills regional environmental study. Regina, Saskatchewan: Canada Plains Research Centre [online]. Available from: http://www.environment.gov.sk.ca/2007-104GreatSandHillsEnvironmentalStudy
  • Scottish Executive, 2006. Strategic environmental assessment toolkit [online]. Natural Scotland, Scottish Executive. Available from: http://www.scotland.gov.uk/Resource/Doc/148434/0039453.pdf
  • Sinclair, J.A., Sims, L. and Spaling, H., 2009. Community-based approaches to strategic environmental assessment: lessons from Costa Rica. Environmental Impact Assessment Review, 29(3), 147–156.
  • Sommer, A., 2005. Strategic environmental assessment: from scoping to monitoring. Content requirements and proposals for practical work, transl. from German to English by Euro Text Services [online]. Available from: http://www.sea-info.net/files/general/From_scoping_to_monitoring.pdf
  • Thérivel, R., 1993. Systems of strategic environmental assessment. Environmental Impact Assessment Review, 13(3), 145–168.
  • Thérivel, R., 2004. Strategic environmental assessment in action. London: Earthscan.
  • Thérivel, R. and Partidário, M., 1996. The practice of strategic environmental assessment. London: Earthscan.
  • Thérivel, R., et al., 1992. Strategic environmental assessment. London: Earthscan.
  • Thérivel, R., et al., 2004. Writing strategic environmental assessment guidance. Impact Assessment and Project Appraisal, 22(4), 259–270.
  • Transport Canada, 2000. Strategic environmental assessment at Transport Canada – policy statement. Ottawa: Transport Canada, Environmental Affairs Branch [updated 2003].
  • UNDP and World Bank, 2002. Bulgaria energy environment review. ESM 260, Energy Sector Management Assistance Program [online]. Available from: www.worldbank.org
  • UNECE (United Nations Economic Commission for Europe), 1992. Application of environmental impact assessment principles to policies, plans and programs. New York: United Nations.
  • UNECE (United Nations Economic Commission for Europe), 2007. Resource manual to support application of the protocol on SEA. Draft final. New York: UNECE/REC.
  • Verheem, R. and Tonk, J., 2000. Strategic environmental assessment: one concept, multiple forms. Impact Assessment and Project Appraisal, 18(3), 177–182.
  • Wallgren, O., et al., 2011. Confronting SEA with real planning: the case of follow-up in regional plans and programs in Sweden. Journal of Environmental Assessment Policy and Management, 13(2), 229–250.
  • Wiseman, K., 2000. Environmental assessment and planning in South Africa: the SEA connection. In: M.R. Partidário and R. Clark, eds. Perspectives on strategic environmental assessment. New York: Lewis, 155–166.
  • Wood, C., 1995. Environmental impact assessment: a comparative review. London: Longman Scientific and Technical.
  • World Bank, 2009. SEA toolkit [online]. Available from: http://web.worldbank.org/
  • World Bank and Ministry of Environment and Natural Resources, 2006. Republic of El Salvador country environmental analysis: improving environmental management to address trade liberalization and infrastructure expansion. Report no. 35226-SV [online]. Available from: www.worldbank.org
  • Zhu, Z., et al., 2011. An inquiry into the potential of scenario analysis for dealing with uncertainty in strategic environmental assessment in China. Environmental Impact Assessment Review, 31(6), 538–548.
