
Evaluating the effectiveness of strategic environmental assessment to facilitate renewable energy planning and improved decision-making: a South African case study

Pages 28-38 | Received 21 Nov 2018, Accepted 09 May 2019, Published online: 23 May 2019

ABSTRACT

Criteria for the evaluation of the effectiveness of Strategic Environmental Assessment (SEA) have evolved internationally from a requirement for mere procedural effectiveness to include criteria such as substantive and incremental effectiveness. This paper provides an overview of the international evolution of effectiveness evaluation for SEA and its parallel progression in South Africa. Within this context, two SEA case studies conducted in support of national-scale renewable energy planning in South Africa were reviewed against internationally recognised effectiveness criteria to test their ability to influence decision-making for renewable energy projects in the country.

The evaluation was informed by the first author's role as project director in the case studies. The review found the SEAs to be partially effective, mainly through the introduction of new legislation, processes and systems at the national level that operationalise the outcomes of the SEAs. A fundamental limitation emerged in determining whether incremental effectiveness had been achieved, as it is too soon to evaluate the effectiveness of the recently promulgated legislation and instruments. The paper builds on previous work on effectiveness evaluation and provides the perspective of an administrative implementer.

Introduction and background

Strategic Environmental Assessment (SEA) practice has been expanding internationally, including in developing countries (Dalal-Clayton and Sadler 2005; Retief et al. 2008). Within the African context, South Africa is a recognised leader in SEA theory and practice, with three guidelines published on the topic between 2000 and 2007 and over 50 SEAs prepared between 1996 and 2005 (DEAT 2000, 2004, 2007; Retief et al. 2008; Davidovic 2014). Despite the expansion in their use, reviews of both legislated and voluntary SEA practice have identified several shortcomings in their implementation and thus their effectiveness (UNEP 2004; Desmond 2007; Retief et al. 2008; Tetlow and Hanusch 2012).

While South Africa has a history of SEA practice dating back to 1996 (Retief et al. 2007), and despite its publication of three SEA guidelines (DEAT 2000, 2004, 2007), the national Department of Environmental Affairs (DEA) has limited experience in their development and implementation. The launch of the Renewable Energy Independent Power Producers Procurement Programme (REI4P) in 2011 (Eberhard and Naude 2016) provided an opportunity for DEA to apply SEAs to guide the implementation of this national-scale programme in a responsible manner. The REI4P responds to the Government's commitment to a low carbon economy and aims to introduce 17.8 GW of renewable energy generating capacity to the national energy mix by 2030 (Presidential Infrastructure Coordinating Commission 2012).

The development of facilities generating more than 20 MW of renewable energy is an activity that requires environmental authorisation (EA) (RSA 2014), and holding an EA is a prerequisite for submitting a bid into the REI4P. This bid requirement resulted in over 900 renewable energy applications (note 1) being received for authorisation within the first phase of the programme, which ran from 2011 to 2015. Of these projects, only 92 were selected for construction (DOE 2015), representing a disproportionate use of resources and highlighting the need to improve the efficiency of the environmental regulatory approach applied to large infrastructure projects. The National Planning Commission identified regulatory inefficiency as a risk to achieving the national development objectives and explicitly identified the environmental authorisation framework as an area in need of streamlining (National Planning Commission 2011). The demand for administrative efficiency is not a new or a specifically South African phenomenon. Governments in Australia, Canada, the Netherlands and South Africa have recently updated their environmental impact assessment (EIA) legislation to reduce bureaucracy and improve the efficiency of the assessment and review process (Sandham and Pretorius 2008; Bond et al. 2014; Veronez and Montaño 2015).

To reduce the complexity and thereby improve the efficiency of the impact assessment process for REI4P projects and the associated grid expansion, the DEA commissioned two energy-focused SEAs (note 2). These were the first in a programme of SEAs to support the government’s infrastructure priorities and to respond to the calls for improved efficiency within the environmental legislative framework.

The purpose of this paper is, firstly, to provide an overview of the evolution of the expectations of effectiveness for SEA both internationally and in South Africa, including identifying recognised shortcomings of SEA in South Africa; and secondly, to evaluate the effectiveness of the two energy-focused SEAs by testing them against internationally recognised effectiveness criteria to determine whether they were able to influence decision-making for renewable energy projects in the country and improve the efficiency of the impact assessment process of the REI4P. The analysis is based on the first author’s professional judgement, which is inherently subjective but is supported by first-hand involvement in the SEA implementation. The paper conveys how the energy-focused SEAs attempted to address shortcomings in SEA implementation in South Africa, identifies limitations of the analysis, and suggests aspects for future review.

The promise of SEA

Internationally, a wealth of literature on the performance of SEA exists, with many scholars developing and applying methodologies for determining the effectiveness of SEA in a specific country or sector context (Simpson 2001; Retief 2007; Chanchitpricha and Bond 2013; Acharibasam and Noble 2014; Dalal-Clayton and Sadler 2017). The early literature motivated SEA use based on its ability to complement and simplify project-level assessments through tiering and the trickle-down of sustainability principles from plans and policies to individual development proposals (Sadler 1996; Thérivel and Partidário 1996; Noble 2002; Gachechiladze-Bozhesku and Fischer 2012). Recent literature similarly contextualises SEA as a tool to improve decision-making by facilitating an understanding of how environmental, social and economic considerations interact, thereby contributing to sustainable development (Thérivel and Minas 2002; Desmond 2007; World Bank 2012; Partidário 2015; Suzaul-Islam and Yanrong 2016; Noble and Nwanekezie 2017). The theory points to SEA being able to influence decision-making and sustainability at both a strategic and a project level. SEA should improve the quality of EIA reports and provide the policy context for decision-making, which should reduce the complexity of project-level EIA and enhance its ability to promote sustainable development.

In practice, the results seem less conclusive. Some researchers hold the view that SEA has a high potential to contribute to better decision-making, while others feel the effectiveness of SEA is often incidental (Desmond 2007; Tetlow and Hanusch 2012; Gazzola and Rinaldi 2016).

In South Africa the situation is similar. Although the literature reveals that SEA practice in the country is mature (CSIR 1996; Retief 2007; Retief et al. 2008; World Bank 2012; Davidovic 2014), two separate research-based evaluations of SEAs found them to be ineffective, primarily because of their inability to influence government planning or decision-making (Retief 2005; Retief et al. 2008; Davidovic 2014).

The evaluations concluded that the SEAs reviewed were not linked to plan, policy or programme formulation; they did not precede EIAs and could therefore not lead to tiering. Although SEAs provided information before decisions were made, it was uncertain whether the information was used in any decision-making process. Retief (2007) concluded that SEA in South Africa ‘could be considered as merely a glorified information gathering exercise’. His analysis identified that the SEAs were mainly consultant- and donor-driven, which affected how SEAs were conducted and their potential to be effective. The externally driven approach lacked strategic focus and demonstrated little understanding of the decision-making processes the SEAs were aiming to inform or of what was needed for implementation. The result was SEAs emphasising situational analysis and data generation, and providing uncertain outcomes far removed from real-life decision-making (Retief et al. 2008).

Another more recent evaluation, which considered one of the energy-focused SEAs discussed in this paper, scoped the opinions of various stakeholder groupings to determine specific value expectations of SEAs (Cape et al. 2018). Based on interviews with government officials, industry representatives, conservation groups and interested and affected parties, the authors found that only three of the 21 value expectations were met by all interviewees and four of the 21 value expectations were not met by any interviewee. The three expectations met related to capacity development, providing more focus to project-level assessment and providing strategic guidance on the location of projects. The objectives not met related to the facilitation of integration between decision-makers, the alignment of policy between government departments, and the reduction of general and specialist assessment costs.

Considering the insignificant contribution that South African SEA practice has made to decision-making, streamlining project-level EIA and meeting value expectations to date, it is unclear whether the government’s SEA programme to streamline the EA process for strategic infrastructure would fare any better. The evaluation presented in this paper tests the first two SEAs commissioned by the South African DEA against recognised international criteria to identify whether these energy-focused SEAs can influence decision-making and promote sustainability.

The evolution of the conceptual model for SEA and effectiveness

A scan of the international literature on SEA between 1992 and 2011 identifies over 106 different definitions of SEA (Silva et al. 2014). An analysis of these definitions assists in tracing the ‘conceptual evolution’ of the tool and of the criteria used to determine its effectiveness over the past two decades. The following sub-sections provide an overview of this evolution from an international and national perspective and show how SEA practice has moved from a formal role and application towards a design of SEA that is fit for purpose. In line with the changes to the interpretation and application of SEA, evaluation criteria for SEA have also expanded from requiring mere procedural effectiveness to include criteria such as substantive and incremental effectiveness. This analysis provides the context for the evaluation of the two SEA case studies considered in this paper.

SEA roles and applications

The international experience – various roles and applications of SEA emerge from the literature, as exemplified by the following sample of definitions.

The formalized, systematic and comprehensive procedure for evaluating the environmental effects of a policy, plan or programme and its alternatives, including the preparation of written reports on the findings and using the findings in decision-making. (Thérivel and Partidário 1996).

an environmental report should be prepared … identifying, describing and evaluating the likely significant environmental effects of implementing the plan or programme, and reasonable alternatives taking into account the objectives and the geographical scope of the plan or programme. Section 8 of the EU Directive on SEA (2001/42/EC SEA)

SEA is a method that attempts to assess systematically the environmental impacts of decisions made at policy, planning and programmatic levels (Silva et al. 2014).

The role of SEA in the above definitions is passive, reactive and informative, with the SEA assessing the environmental consequences of decisions made at the policy, planning and programmatic levels and identifying reasonable alternatives. The SEA is meant to change behaviour, not through the moulding of policies, plans or programmes, but by predicting the impacts of actions, reporting on them and proposing alternatives (DEAT 2007; Bina 2008; Lobos and Partidário 2014). The theory is that if environmental aspects have been considered at the very earliest stage of the development of the plan, policy or programme, the sustainable development agenda will be advanced (OECD 2006; Lobos and Partidário 2014). This SEA type follows procedural stages aligned with a traditional impact assessment process, encompassing tasks related to information collection and processing and the consideration of alternatives (Fischer 2003), and is referred to as the EIA-based SEA (UNEP 2004).

The South African experience – SEA practice in South Africa expanded rapidly between 1996 and 2005, with fifty SEAs being conducted (Retief et al. 2007). In the South African context, SEAs have been mostly consultant-driven and donor or private sector funded. This is a divergence from internationally applied SEA, which is usually undertaken as a legislative requirement and publicly funded (Retief et al. 2007, 2008). The application of SEA in South Africa does not focus on the reactive evaluation of the impacts of policies, plans and programmes on the environment, as is usual in the EU SEA approach. An assessment of the fifty South African SEAs indicates that South Africa has developed a ‘homegrown’ approach to SEA practice that allows for flexibility and adaptability and which fits into the ‘plan type’ SEA, with ‘plan’ being defined broadly as ‘a spatial and land use plan’ (CSIR 2007; Retief et al. 2007, 2008). These planning-oriented SEAs conform to a ‘pro-active’ SEA practice, intended to inform project-level environmental impact assessment authorisations and fill a plan or programme void. This type of SEA has, however, been criticised for not reflecting a true ‘assessment’, as they answer a ‘what’ question: ‘what proposals can be considered given the environmental opportunities and constraints’ (Retief et al. 2008, p. 511).

Internationally, Partidário recognises a broader application and form of SEA, which includes site-suitability studies and the appraisal of optimal locations for developments. These applications, although including aspects similar to SEA requirements, are not necessarily labelled as SEA but, in the opinion of Partidário, contribute an equivalent or even improved value to sound environmental decision-making in some cases. In South Africa, this type of SEA has been referred to as a ‘para-SEA’, which comprises ‘processes that do not meet the formal specifications of SEA but have some SEA characteristics or elements’ (Retief et al. 2007, p. 45). Bina (2003) similarly promotes SEA as a tool to be applied to industrial sectors or broader geographical areas before individual projects are defined, allowing long-term planning and regional environmental concerns to be considered proactively. In this context, an SEA would be an ideal means of pre-identifying appropriate sites or siting criteria for future projects by identifying ecologically sensitive areas and discouraging or prohibiting future development in such areas (Stinchcombe and Gibson 2011).

SEA – effectiveness evaluation

As the form of SEA has been evolving, so too have the evaluation criteria used to measure its effectiveness. In the EIA-based SEA, the effectiveness criteria applied adhere to the definition proposed by Bina et al. (2011) as ‘procedural effectiveness’. This refers to something being ‘adequate at accomplishing a purpose; producing the intended or expected result’ and assesses ‘the procedural nature of the impact assessment’. Essential components of procedural effectiveness are the consideration of the quality of the report, the adherence to procedures, the participatory methods used and the benefit-cost ratio (UNEP 2004; Fischer 2007; Sandham and Pretorius 2008; Stoeglehner et al. 2009; Bina et al. 2011; Runhaar et al. 2013; Gazzola and Rinaldi 2016).

This reactive evaluation of likely significant environmental effects of implementing a policy, plan or programme in the EIA-based SEA has led to perceptions that SEA is ineffective, as well as being theoretically, politically and practically inadequate (Lobos and Partidário 2014; Morrison-Saunders et al. 2014; Rozema and Bond 2015). The literature reflects a realisation that for SEA to remain relevant, the largely reactive ‘EIA-based mechanism’ must advance to a ‘proactive’ process of developing sustainable solutions as an integral part of strategic planning, rather than merely evaluating the effects of decisions taken (Fischer 2003; Bina 2008; Silva et al. 2014). It is no longer enough to merely present the SEA as an output to decision-makers and expect its implementation (Bina 2008). Instead, for the outcomes of SEAs to be taken up and owned by decision-makers, SEAs must be developed in line with their planning and policy formulation structures. In this context, SEA would focus on the needs of decision-makers, thereby becoming a catalyst for learning and allowing SEA application to become useful and relevant (Lobos and Partidário 2014).

In line with the changing form of SEA, the performance measures against which SEAs are assessed have also progressed. The criteria for appraising effectiveness in the context of emerging SEA practice prioritise ‘outcomes’ and ‘effects’ that make a decisional difference leading to improved sustainability, referred to as ‘substantive effectiveness’. While advocates for substantive effectiveness acknowledge that it is essential to check procedural effectiveness to understand if the tool achieved its objectives, this is in itself insufficient (Retief et al. 2007; Van Doren et al. 2013). The test of effectiveness must identify the extent to which the tool fulfils its purpose, produces results, meets stakeholders’ value expectations and has ‘buy-in’ from the public sector to ensure implementation (Retief et al. 2008; Van Doren et al. 2013; Cape et al. 2018). The question ‘What has SEA achieved?’ should be answered (Retief et al. 2007).

Essential aspects in the evaluation of substantive effectiveness include: participation going beyond consultation to generate productive discussion between parties; the significance of learning, leading to continuous improvement in policy development and decision-making; changing processes, policies or legislation; strengthening institutional and governmental capacity; capacity development; changing values within organisations; the provision and collection of new data; and the integration of engineering and social aspects (Stoeglehner et al. 2009; Van Doren et al. 2013; Pope et al. 2018).

Bina et al. (2011) advance the debate by calling for ‘incremental effectiveness’. Incremental effectiveness tests the long-term contribution of the SEA to the environmental governance of institutions and systems, improving their environmental effectiveness and institutional arrangements and strengthening their ability to assimilate the outputs of SEAs. This type of effectiveness has also been termed ‘indirect effectiveness’, with the emphasis being on the achievement of outcomes which go beyond the expected (Pope et al. 2018).

A three-dimensional interpretation of effectiveness, which encompasses procedural, substantive and incremental effectiveness, as proposed by Bina et al. (2011), is used when evaluating the effectiveness of the two South African energy-focused SEAs. The evaluation criteria are grouped into the dimensions of procedural, substantive and incremental effectiveness and broadly include other aspects of effectiveness that have been identified, such as knowledge and learning, which can be considered part of incremental effectiveness. There are strong interdependencies between these dimensions, and these concepts are evolving rapidly, as reflected in recent reviews such as Pope et al. (2018) and Geissler et al. (2019).

DEA’s energy-focused SEAs – brief description

The overall objective of the DEA’s SEA programme is to support environmental decision-making around government’s infrastructure priority projects through the proactive evaluation and avoidance of environmental sensitivity. In line with this objective, the energy-focused SEAs endeavour to integrate environmental aspects into decision-making; strengthen and streamline the project-level EIA by early identification of potential impacts and cumulative effects; address strategic issues related to the justification and location of proposed projects; and reduce the time and effort necessary to assess individual projects.

To achieve these aims, the purpose of the ‘renewable energy SEA’, as set out in the terms of reference (note 3) (DEA 2012a, 2012b), was to identify Renewable Energy Development Zones (REDZs) suitable for the efficient and effective roll-out of wind and solar photovoltaic (PV) energy in South Africa. The ‘electricity grid infrastructure SEA’ was to determine the most suitable routing for five new ‘Strategic Transmission Corridors’ identified in Eskom’s (note 4) ‘2040 Transmission Network Study’ (Eskom 2012) as being necessary to meet the forecast electricity generation and load growth requirements. The routing was to be based on the optimal technical, socio-economic and environmental sensitivity options to simplify the environmental authorisation requirements. The SEAs were to complement each other, with the REDZs acting as anchor points to facilitate proactive investment into transmission grid expansion.

Both SEAs provided protocols for specialist studies to be applied when undertaking the project-level EIA, to ensure a level of assessment commensurate with the environmental sensitivity of the site. They also both provided sufficient pre-assessment to enable projects located within the REDZs and Strategic Transmission Corridors (STCs) to dispense with the scoping step when undertaking the project-level EIA, which reduces the time from application to decision to a maximum of 147 days.

These energy-focused SEAs differ from previous SEAs undertaken in South Africa in that their objectives relate to identifying optimal siting locations for renewable energy developments. They therefore align with Partidário’s ‘site-suitability studies’ and ‘appraisal of optimal locations’ studies, or the para-SEA of Retief (2007). The SEAs further broke from the traditional South African SEA model in that they were designed and implemented by the public rather than the private sector.

This type of SEA was not considered in the seminal work on SEA effectiveness in South Africa undertaken by Retief in 2005 (Retief 2005; Retief et al. 2007). Noting these differences, it is essential to subject these SEAs to evaluation in order to improve SEA practice in South Africa.

Methodology and evaluation criteria

It was decided to evaluate the two SEAs against the three-dimensional interpretation of effectiveness proposed by Bina et al. (2011), which encompasses aspects of procedural, substantive and incremental effectiveness and broadly includes aspects of knowledge and learning. To determine the appropriate evaluation criteria, four SEA performance evaluation methods were considered in detail.

The first was adapted (Retief 2007) from the SEA guideline developed by the Department of Environmental Affairs and Tourism (DEAT 2000). The evaluation focused on the proactive role of SEA in influencing plans, policies and programmes and on the sustainability function. The two energy SEAs commissioned by the Department were not designed to respond to these priorities. This methodology, although comprehensive and set within the South African context, was therefore not adopted.

The second methodology considered (Fischer 2002) was adapted from the International Association for Impact Assessment (IAIA) ‘performance criteria for a good-quality Strategic Environmental Assessment process’. Ten of the seventeen evaluation criteria used were focused on comparisons of SEA practices. The intention of the SEA case study was not to compare the two energy SEAs, but rather to identify their effectiveness. This set of criteria was therefore not adopted.

The third methodology set out to test the effectiveness of sustainability assessments (Bond et al. 2013). This methodology expanded on the three types of effectiveness discussed in this paper by adding ‘normative effectiveness’, ‘knowledge and learning’ and ‘pluralism’. Pope et al. (2018) used the methodology for the assessment of an SEA case study and found three overlaps in criteria. They found similarities between ‘substantive effectiveness’ and ‘knowledge and learning’; between normative effectiveness and pluralism; and they found it difficult to distinguish between procedural effectiveness and pluralism (Pope et al. 2018). This methodology was therefore not considered optimal for this paper.

The fourth methodology was provided by Partidário’s ‘14 priority needs for good practice SEA’ (Partidário 2000). These criteria prioritise integration and outcomes. Partidário speaks to ‘the success of the SEA being measured in relation to the quality of the final decision, and the extent to which the decision was improved as a result of the SEA approach’. As the two South African energy SEAs were designed as a tool to guide the development of strategic initiatives, it was decided to use Partidário’s methodology.

Partidário’s fourteen priority needs were grouped into categories reflecting content, process and outcomes. These categories loosely represent procedural, substantive and incremental effectiveness (Table 1). The SEAs were evaluated using Partidário’s fourteen priority needs and given a category ranking as proposed by Fischer (2002): ‘fully met’ (score 2); ‘partially fulfilled’ (score 1); or ‘not met at all’ (score 0). A median score was determined for each SEA. The results were then analysed and interpreted based on the first author’s experience and first-hand knowledge of the progress made towards implementation of the SEA outputs. The documents and systems used for the assessment include the SEA terms of reference (DEA 2012a, 2012b, 2013), the final SEA documents (DEA 2015, 2016a), the specialist studies prepared as part of the SEA, Government Gazettes, the technical specifications for the national online screening tool (note 5) (DEA 2016b), as well as newspaper and media articles.

Table 1. Effectiveness evaluation grouping, category and criteria; Results of the effectiveness evaluation of the two energy-focused SEAs*.

Findings of the SEA effectiveness evaluation

The SEAs were evaluated to determine their procedural, substantive and incremental effectiveness, including aspects of knowledge and learning, using the effectiveness criteria provided by Partidário’s ‘14 priority needs of a good SEA’. The category scores for each criterion (see Table 1) show a median value of 1 for the overall effectiveness of the two SEAs, indicating that the criteria were partially fulfilled.
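
To make the aggregation explicit, the sketch below reproduces the scoring logic described in the methodology: each of Partidário’s fourteen criteria receives a category ranking of 0, 1 or 2, and a median is taken per effectiveness dimension and overall. The criterion scores and the grouping into dimensions are transcribed from the Discussion section below; the Python representation itself is purely illustrative and not part of the original study.

```python
# Illustrative reconstruction of the scoring reported in the Discussion
# (2 = fully met, 1 = partially fulfilled, 0 = not met at all).
from statistics import median

scores = {
    "procedural": {1: 2, 2: 2, 3: 2},                                 # criteria #1-#3
    "substantive": {4: 2, 5: 1, 6: 1, 7: 2},                          # criteria #4-#7
    "incremental": {8: 1, 9: 1, 10: 1, 11: 1, 12: 1, 13: 1, 14: 2},   # criteria #8-#14
}

# Median score per effectiveness dimension.
for dimension, criteria in scores.items():
    print(f"{dimension}: median score {median(criteria.values())}")

# Overall median across all fourteen criteria (reported in the paper as 1,
# i.e. the criteria were partially fulfilled overall).
all_scores = [s for criteria in scores.values() for s in criteria.values()]
print(f"overall: median score {median(all_scores)}")
```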

The evaluation identified that progress had been made in formalising the SEA outcomes through the Cabinet process, as well as legislating the outputs of the SEAs. In April 2017, amendments to the 2014 Environmental Impact Assessment Regulations were published which provided for protocols and generic Environmental Management Programmes (EMPr) (RSA 2017). In February 2018, five STCs and eight REDZs were published for implementation (RSA 2018a, 2018b). In March 2019, generic EMPrs for the development of overhead powerlines and for electricity sub-stations were gazetted for implementation (RSA 2019). These actions demonstrate a willingness on the part of the decision-maker to accept and implement the findings of the SEAs.

Discussion

What follows is an analysis of the evaluation findings for each of the fourteen Partidário criteria structured to respond to the three effectiveness categories.

Procedural effectiveness

Criterion #1 ‘referring to a policy framework’ – fully met. The terms of reference (DEA 2012a, 2012b, 2013) identified that the primary motivation for the development of the energy SEAs was to assist with implementing the aspirations of the renewable energy policy for the country and to support both the Strategic Integrated Projects initiative and the REI4P. A statement released by Cabinet in February 2016 indicated that the implementation of these REDZs and the Strategic Electricity Corridors would ensure the transition to a low carbon economy, accelerate infrastructure development and contribute to a more coherent and predictable regulatory framework (Department of Planning, Monitoring and Evaluation 2016). This was evidence of the SEAs being part of the overall government energy policy framework, and confirmation of the SEAs contributing to achieving the policy framework.

Criterion #2 ‘integrated and well-coordinated with policymaking’ – fully met. The REDZs and STCs are coordinated with policymaking and are influencing energy and infrastructure planning. Eskom has commissioned a study to plan the micro-grid structure within the REDZs, which demonstrates the integration of the REDZs into its long-term planning processes (note 6). At a project level, developers are submitting applications for environmental authorisation for wind and solar projects utilising the basic assessment process, and DEA is adhering to the shortened review timeframes.

Criterion #3 ‘resource allocation’ – fully met. The project proposals submitted by the consultant team provided evidence of the number of specialists that would be involved with the project, as well as their skills level. This information indicated that an adequate number of appropriately skilled professionals were working on the project. Within the Department of Environmental Affairs, a Chief Director and a Director were assigned to manage these SEAs; both positions are at senior management level. Due to the first-hand experience of the first author in the development of the SEAs, it can be confirmed that the work was undertaken within the allocated budget and within the contract time-frames.

All three criteria linked to determining procedural effectiveness were rated as fully met, meaning that the process and procedures followed were adequate, and the SEAs produced the expected outcomes.

Substantive effectiveness

Criterion #4 ‘focus on the process and not the site’ – fully met. The work was undertaken at a strategic level and has influenced national planning and decision-making. The outcomes apply to the EA process to be followed for all wind, solar PV and grid expansion projects proposed within a REDZ and the STCs, with amendments made to the EIA legislation to incorporate the SEA outcomes, as identified in the published gazette notices.

Criterion #5 ‘Integrate approaches regarding scope and cross-interaction of relevant factors, ensuring interdisciplinary approach’ – partially fulfilled. The response to this criterion is considered at two levels. As a procedural effectiveness criterion, during the development of the SEAs ‘push’ and ‘pull’ factors were considered, including the development plans of local government, industry, other government departments and state-owned entities. The methodology used to develop the SEAs was provided in the SEA report. This demonstrates a commitment to ensuring the cross-interaction of objectives concerning energy. As a substantive effectiveness criterion, the President’s Public-Private Growth Initiative has acknowledged the REDZs, as has industry’s plan to construct 5 000 MW of solar PV and wind energy within these areas (note 7). This is a further demonstration of cross-sectoral interaction and of ensuring interdisciplinary approaches. Although there have been several demonstrated successes, many initiatives are still under development.

Criterion #6 ‘Ensure simple, interactive and flexible approaches to decision-making’ – partially fulfilled. The ability to pre-screen environmental sensitivity provided by the national environmental screening tool allows for internet-based interactive decision-making on the part of the developer and government. The pre-identification of preferred development areas through the REDZs similarly contributes to informed decision-making for wind and solar PV projects. Decision-making is therefore supported at both a policy and a project level. Decisions must, however, be consistent, and no flexibility can be legally exercised.

Criterion #7 ‘facilitate a participatory process’ – fully met. The development of the two SEAs followed an inclusive process. Dedicated websites were set up for each SEA, the national news media were utilised, and there were focused and broad stakeholder consultations at the provincial and local government levels. The consultation process has been documented in the SEA reports. The outputs of the SEAs have been gazetted and posted on various websites, including the DEA website, which provides easy access to the SEA documentation as well as the implementation tools.

Two of the substantive effectiveness criteria scored a 2, reflecting that the criteria were fully met, and two scored a 1, reflecting that the criteria were partially fulfilled. Noting that substantive effectiveness is concerned with the usefulness of the assessment in informing decision-making and the willingness of the decision-maker to implement the results, this is indicative of some outputs having been implemented while others are still in progress.

Incremental effectiveness

Criterion #8 ‘establish guidance and minimum regulatory context’ – partially fulfilled. The tools developed through the SEA process (assessment protocols, environmental screening tool, the standardisation of environmental data and generic EMPr) provide guidance when undertaking the site-specific EIA process. Similarly, the identification of required specialist reports for various developments set minimum regulatory requirements for the environmental authorisation application. The findings of the SEAs contributed directly to the amendment of the EIA regulations to improve the efficiency of the process (RSA 2014). Unnecessary administrative steps were removed and additional tools were provided and enabled.

Criteria #9 and #12 ‘accountable decision-making’ and ‘adaptive decision-making’ – partially fulfilled. The terms of reference for the SEAs identified that they were designed to support evidence-based decision-making by the authorities, as well as to inform planning decisions by state utilities and private developers. At a regional and planning level, the REDZs and the STCs are guiding energy planning decisions. As indicated under criterion #2, Eskom has continued energy planning within the REDZs by commissioning the micro-siting of collector sub-stations to facilitate the connection of multiple independent power producers to the grid, demonstrating accountability and adaptive decision-making. At a site-specific level, the availability of DEA’s online environmental screening tool allows developers to manipulate their development footprint on the preferred site to avoid areas of high sensitivity. In this way, developers are provided with an adaptive decision-making tool.

Criterion #10 ‘New routines’ – partially fulfilled. The following new routines have been identified for EIA decision-making: (a) a new routine for applications for environmental authorisation for new grid infrastructure based on pre-negotiated routes has been legislated (RSA 2018a), and Eskom is implementing the new procedures; (b) the generic EMPr has been gazetted for implementation and is being used by Eskom (RSA 2019). Progress has also been made towards gazetting the date for the compulsory inclusion, in applications for environmental authorisation, of a screening report generated from the national environmental screening tool.

Criterion #11 ‘Establish objectives, criteria and quality standards framework’ – partially fulfilled. Through the development of the SEAs, eleven environmental assessment protocols have been prepared for various environmental criteria and two generic EMPr reports for power lines and sub-stations were developed. These outputs represent the setting of environmental criteria and the development of quality standards. The environmental screening tool provides standardised and current environmental data for environmental sensitivity screening. Not all of these outputs have been fully implemented, although substantial progress has been made. More time is required to finalise the work and to gain experience in implementation.

Criterion #13 ‘contribution to changing attitudes and overcoming prejudices’ – partially fulfilled. The SEA report identifies that the wind industry was not initially supportive of the concept of REDZs. Over the development period this view changed and although industry never supported the REDZs, they did not oppose them during the comment period. The environmental NGOs were initially supportive of the SEA but were unhappy with the outcome as they expected the SEA to identify exclusion areas for wind facilities.

Criterion #14 ‘accessibility of information’ – fully met. The SEAs generated new biodiversity data, and the location and status of renewable energy projects were mapped. As a parallel process, the DEA environmental screening tool was developed which uses the data layers prepared through the SEA and houses the assessment protocols. The screening tool went live in June 2018 and is being used voluntarily.

Only one of seven criteria related to incremental effectiveness was rated as fully met. Considering that incremental effectiveness focuses on the contribution that the SEA makes to institutional governance and systems in the long-term, the rating confirms the finding that so far there has been limited experience in implementation. It is too soon for long-term learning and governance issues to be assessed.

Conclusion

The literature has identified that for decision-makers to own and assimilate the results of SEAs and to advance sustainability objectives, the SEA results need to be useful and support institutions and systems in the long term. This paper evaluated two South African energy-focused SEAs against Partidário’s ‘14 priority needs of a good SEA’ to determine if they achieve procedural, substantive and incremental effectiveness, meaning that they can have an impact on planning, decision-making and governance.

The evaluation identified that the requirements for procedural effectiveness were fully met, in that the SEAs were well integrated with policy frameworks and adequately resourced. Regarding substantive effectiveness, it was evident that the decision-maker, the DEA in this case, was willing to implement the outcomes and was in the process of doing so, although the implementation and regulatory processes were still underway. At the time of the review, the SEAs therefore only partially fulfilled the criteria for substantive effectiveness.

Regarding incremental effectiveness, only one of the seven criteria considered was fully met. However, it emerged through the review that it was too soon for long-term learning and governance issues to be assessed.

As a practical outcome of the SEAs’ effectiveness, the evaluation showed that the SEA approach, as applied to the environmental authorisation process for renewable energy projects, resulted in greater efficiency in resource use, shortened the duration of the project-level assessment process and proactively contributed to achieving improved environmental practices. The impact of the SEAs on the location of projects and the quality of specialist reports is still to be tested once full implementation is achieved. These results demonstrate that SEA can play a prominent role in the ongoing search for instruments that assist governments and other organisations in pursuing the multiple goals of sustainable development.

Three observations were made through the evaluation. The first is that it would be difficult to review the SEAs for incremental effectiveness without having a broader knowledge of areas where the outcomes of the SEAs are adding value. It would therefore be necessary when rating effectiveness criteria for incremental effectiveness to scope a broad stakeholder group and consider effects outside the ambit of environmental impact assessment. A further area of work would be to identify indicators to measure incremental or indirect effectiveness to ensure that the evaluation covers the areas of influence that extend past the boundary of environmental impact assessment.

The second observation is that to evaluate substantial and incremental effectiveness, a review ought to consider not the SEA document itself, but rather the outputs of the SEA. In the South African case studies, the SEAs were designed to produce decision support tools. The success or failure of the SEA did not lie in the quality of the actual SEA document produced, but rather the nature and usefulness of the outputs.

The third observation is that a fundamental limitation of the study emerged in determining incremental effectiveness, as it is too early to evaluate the effectiveness of recently promulgated legislation and instruments that codified the outcomes of the renewable energy SEAs. To overcome this limitation, it is proposed that the SEAs be re-assessed once experience in implementation is gained.

The use of the three-dimensional interpretation of effectiveness proposed by Bina et al. (2011) and Partidário’s fourteen assessment criteria provided a framework that enabled a meaningful assessment of the two SEA case studies. No modifications to the categories or criteria are proposed, and our experience suggests that it would be appropriate to use this methodology in any follow-up review or any other SEA evaluation, with the caveat to include a panel rather than one or two authors in order to avoid potential bias.

Declaration of interest

D Fischer works in the Department of Environmental Affairs as a Chief Director of Integrated Environmental Management and is responsible for the Strategic Environmental Assessment programme.

Additional information

Funding

This research has not received any grant from funding agencies in the public, commercial, or not-for-profit sectors.

Notes

1. Figures provided by the Department of Environmental Affairs, Pretoria.

2. The first SEA focused on the development of wind and solar photovoltaic energy in support of the REI4P and the second SEA addressed the expansion of the national electricity transmission infrastructure to meet future energy generation and demand.

3. Initially two SEAs were envisaged, one focusing on wind and the other on solar energy. However, due to the similar stakeholder groupings involved, it was decided at the inception meeting to combine the SEAs and produce one SEA document for renewable energy.

4. Eskom is South Africa’s national electricity utility.

5. The national online screening tool is a GIS-based application that collates verified data from across different government agencies on a common platform in order to inform the location of projects, with an emphasis on avoiding sensitive environmental features. It includes aspects such as agricultural land capability, biodiversity, wetlands, heritage resources, birds, protected areas etc., as well as buffers required around strategic facilities for telecommunications, defence and radio-astronomy. The tool generates a standardised report required in terms of the EIA Regulations. The tool can be accessed at https://screening.environment.gov.za. (A minimal illustrative sketch of this screening concept is provided after these notes.)

6. Personal communication between author 1 and Mr R Marais, Strategic Grid Planning senior manager.

7. Personal communication between author 1 and Mr Mike Levington, Chair of the Green Economy Subcommittee of the South African Photovoltaic Industry Association (SAPVIA).
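
As an illustration of the screening concept described in note 5, the sketch below checks a proposed development footprint against buffered sensitivity layers and reports which layers are triggered. All layer names, geometries and buffer distances are hypothetical and chosen purely for illustration; this is not the DEA tool itself, which operates on national data layers and generates the prescribed report.

```python
# Minimal illustrative sketch (not the DEA screening tool): checking a
# hypothetical development footprint against hypothetical buffered
# environmental sensitivity layers.
from shapely.geometry import Point, Polygon

# Hypothetical sensitivity layers: feature geometry plus the buffer distance
# required around it (all coordinates and distances are in the same planar units).
layers = {
    "wetland": (Point(2.0, 2.0).buffer(1.0), 0.5),
    "heritage_site": (Point(8.0, 8.0).buffer(0.5), 1.0),
}

# Hypothetical development footprint proposed by a developer.
footprint = Polygon([(0, 0), (3, 0), (3, 3), (0, 3)])

def screen(footprint, layers):
    """Return, per layer, whether the footprint intersects the buffered feature."""
    report = {}
    for name, (geom, buffer_dist) in layers.items():
        constrained_area = geom.buffer(buffer_dist)
        report[name] = footprint.intersects(constrained_area)
    return report

if __name__ == "__main__":
    for layer, triggered in screen(footprint, layers).items():
        print(f"{layer}: {'sensitivity triggered' if triggered else 'clear'}")
```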

References

  • Acharibasam JB, Noble BF. 2014. Assessing the impact of strategic environmental assessment. Impact Assess Project Appraisal. 32(3):177–187.
  • Bina O. 2003. Strategic environmental assessment of potential exploration rights issuance for Eastern Sable Island bank, Western Banquereau bank, the gully trough and the Eastern Scotian Slope. Halifax N.S.: Nova Scotia Offshore Petroleum Board. https://www.cnsopb.ns.ca/pdfs/GullySEAJune03.pdf
  • Bina O. 2008. Strategic environmental assessment. In: Jordan A, Lenschow A, editors. Innovation in environmental policy? Integrating environment for sustainability. Cheltenham UK: Edward Elgar Publishing; p. 34–156.
  • Bina O, Jing W, Brown L, Partidário MR. 2011. An enquiry into the concept of SEA effectiveness: towards criteria for Chinese practice. Environ Impact Assess Rev. 31:572–581.
  • Bond A, Morrison-Saunders A, Stoeglehner G. 2013. Designing an effective sustainability assessment process. In: Bond A, Morrison-Saunders A, Howitt R, editors. Sustainability assessment, pluralism, practice and progress. Oxon (UK): Routledge, Taylor & Francis; p. 231–244.
  • Bond A, Pope J, Morrison-Saunders A, Retief F, Gunn JAE. 2014. Impact assessment: eroding benefits through streamlining? Environ Impact Assess Rev. 45:46–53.
  • Cape L, Retief F, Lochner P, Fischer T, Bond A. 2018. Exploring pluralism – different stakeholder views of the expected and realized value of strategic environmental assessment (SEA). Environ Impact Assess Rev. 69:32–41.
  • Chanchitpricha C, Bond AJ. 2013. Conceptualising the effectiveness of impact assessment processes. Environ Impact Assess Rev. 43:65–72.
  • CSIR. 1996. Strategic Environmental Assessment (SEA) - A primer. Stellenbosch: Council for Scientific and Industrial Research. csir.co.za/www/sea/primer/primerf.html.
  • CSIR. 2007. Strategic Environmental Assessment (SEA) resource document – introduction to the process, principles and application of SEA. Version 4. Stellenbosch: Council for Scientific and Industrial Research. CSIR Report ENV-S-C 2002-073.
  • Dalal-Clayton DB, Sadler B. 2005. Strategic environmental assessment: a sourcebook and reference guide to international experience. London: Earthscan.
  • Dalal-Clayton DB, Sadler B. 2017. A methodology for reviewing the quality of strategic environmental assessments in development cooperation. Impact Assess Project Appraisal. 35(3):257–267. doi:10.1080/14615517.2017.1322811
  • Davidovic D. 2014. Review: experiences of strategic environmental assessment in developing countries and emerging economies – effectiveness, impacts and benefits. Gothenburg: Student project report, Centre for Environment and Sustainability, GMV, University of Gothenburg; p. 45.
  • DEA. 2012a. Terms of reference: for a consultancy to assist the department with the development of a strategic environmental assessment to facilitate the efficient and effective roll out of solar energy in South Africa. Pretoria: Department of Environmental Affairs.
  • DEA. 2012b. Terms of reference: for a consultancy to assist the department with the development of a strategic environmental assessment to facilitate the efficient and effective roll out of wind energy in South Africa. Pretoria: Department of Environmental Affairs.
  • DEA. 2013. Terms of reference: for a strategic environmental assessment and pre-construction site-specific environmental assessment criteria to ensure the efficient and effective expansion of the electricity grid infrastructure for South Africa. Pretoria: Department of Environmental Affairs.
  • DEA. 2015. Strategic environmental assessment for wind and solar photovoltaic energy in South Africa. Pretoria: Department of Environmental Affairs. http://egis.environment.gov.za/redz.
  • DEA. 2016a. Strategic environmental assessment for the electricity grid infrastructure. Pretoria: Department of Environmental Affairs. http://egis.environment.gov.za/egi.
  • DEA. 2016b. Technical system design specification; implementation of a GIS-based screening application V1.1. Prepared by ESRI South Africa. Pretoria: Department of Environmental Affairs.
  • DEAT. 2000. Strategic environmental assessment in South Africa: guideline document. Pretoria: Department of Environmental Affairs and Tourism. fred.csir.co.za/project/csir_course_material/Final_Resource%20Document.pdf.
  • DEAT. 2004. Strategic environmental assessment, integrated environmental management, information series 10. Pretoria: Department of Environmental Affairs and Tourism. https://www.environment.gov.za/sites/default/files/docs/series10_strategic_environmental_assessment.pdf.
  • DEAT. 2007. Strategic environmental assessment, guideline. Integrated environmental guideline series 4. Pretoria: Department of Environmental Affairs and Tourism.
  • Department of Planning, Monitoring and Evaluation. 2016. Statement on the cabinet meeting of 17 February 2016. Pretoria: Department of Planning, Monitoring and Evaluation. https://www.gov.za/issues/national-devleopment-plan-2030.
  • Desmond M. 2007. Strategic Environmental Assessment (SEA): a tool for environmental decision-making. Irish Geogr. 40(1):63–78.
  • DOE. 2015. Bid window 4 - preferred bidders’ announcement - presentation. Pretoria: Department of Energy. https://www.esi-africa.com/reipppp-energy-minister-announces-winners-of-renewable-energy-contracts/.
  • Eberhard A, Naude R. 2016. The South African renewable energy independent power producer procurement programme: a review and lessons learned. J Energy South Afr [Online]. 27(4):1–14. ISSN 2413-3051.
  • Eskom. 2012. 2040 transmission network study – network supply and demand balance assessment, part 2. Johannesburg: Eskom. www.eskom.co.za/whatweredoing/TransmissionDevelopmentPlan/Document/2015-2014TDP_SGP_presentation.
  • Fischer TB. 2002. Strategic environmental assessment performance criteria — the same requirements for every assessment? J Environ Assess Policy Manage. 4(1):83–99.
  • Fischer TB. 2003. Strategic environmental assessment in post-modern times. Environ Impact Assess Rev. 23:155–170.
  • Fischer TB. 2007. The theory and practice of strategic environmental assessment: towards a more systematic approach. London: Earthscan.
  • Gachechiladze-Bozhesku M, Fischer TB. 2012. Benefits of and barriers to SEA follow-up – theory and practice. Environ Impact Assess Rev. 34:22–30.
  • Gazzola P, Rinaldi A. 2016. Reflecting on SEA’s usefulness: A case study on Italy. J Environ Assess Policy Manage. 18(4):34–40.
  • Geissler G, Rehhausen A, Fischer T, Hanusch M. 2019. Effectiveness of strategic environmental assessment in Germany? – meta-review of SEA research in the light of effectiveness dimensions. Impact Assess Project Appraisal. doi:10.1080/14615517.2019.1587944
  • Lobos V, Partidário M. 2014. Theory versus practice in Strategic Environmental Assessment (SEA). Environ Impact Assess Rev. 48:34–46.
  • Morrison-Saunders A, Pope J, Gunn JAE, Bond A, Retief F. 2014. Strengthening impact assessment: a call for integration and focus. Impact Assess Project Appraisal. 32(1):2–8.
  • National Planning Commission. 2011. National development plan 2030 - our future. Make it work. Pretoria: The Presidency, Republic of South Africa. https://www.gov.za/issues/national-devleopment-plan-2030.
  • Noble B, Nwanekezie K. 2017. Conceptualizing strategic environmental assessment: principles, approaches and research directions. Environ Impact Assess Rev. 62:165–173.
  • Noble BF. 2002. The Canadian experience with SEA and sustainability. Environ Impact Assess Rev. 22:3–16.
  • OECD. 2006. Applying strategic environmental assessment: good practice guidance for development co-operation. Paris: OECD Publishing.
  • Partidário MR. 2000. Elements of an SEA framework—improving the added-value of SEA. Environ Impact Assess Rev. 20(6):647–663.
  • Partidário MR. 2015. A strategic advocacy role in SEA for sustainability. J Environ Assess Policy Manage. 17(1):1–8.
  • Pope J, Bond A, Cameron C, Retief F, Morrison-Saunders A. 2018. Are current effectiveness criteria fit for purpose? Using a controversial strategic assessment as a test case. Environ Impact Assess Rev. 70(2018):34–44.
  • Presidential Infrastructure Coordinating Commission. 2012. Summary of the South African national infrastructure plan. http://www.gov.za/sites/www.gov.za/files/PICC_Final.pdf. [Accessed 2015 Apr 22].
  • Retief F. 2007. Effectiveness of strategic environmental assessment (SEA) in South Africa. J Environ Assess Policy Manage. 9(1):83–101.
  • Retief F, Jones C, Jay S. 2007. The status and extent of strategic environmental assessment (SEA) practice in South Africa, 1996–2003. South Afr Geog J. 89(1):44–54.
  • Retief F, Jones C, Jay S. 2008. The emperor’s new clothes – reflections on strategic environmental assessment (SEA) practice in South Africa. Environ Impact Assess Rev. 28:504–514.
  • Retief FP. 2005. Quality and effectiveness of Strategic Environmental Assessment (SEA) in South Africa [PhD thesis]. Manchester (UK): School of Environment and Development, University of Manchester.
  • Rozema JG, Bond AJ. 2015. Framing effectiveness in impact assessment: discourse accommodation in controversial infrastructure development. Environ Impact Assess Rev. 50:66–73.
  • RSA. 2014. National environmental management act, 1998 (act 107 of 1998), environmental impact regulations 2014. Pretoria: Government Printer. Government gazette no. 38282, notice no. R.594.
  • RSA. 2017. Amendments to the environmental impact assessment regulations, 2014. Pretoria: Government Printer. Government gazette no. 40772, notice no. R326.
  • RSA. 2018a. Notice of identification in terms of section 24(5)(a) and (b) of the National Environmental Management Act, 1998, of the procedures to be followed in applying for environmental authorisation for large scale electricity transmission and distribution development activities identified in terms of section 24(2)(a) of the National Environmental Management Act, 1998, when occurring in geographical areas of strategic importance. Pretoria: Government Printer. Government gazette no. 41445, notice no. R.113.
  • RSA. 2018b. Notice of identification in terms of section 24(5)(a) and (b) of the National Environmental Management Act, 1998, of the procedures to be followed in applying for environmental authorisation for large scale wind and solar photovoltaic energy development activities identified in terms of section 24(2)(a) of the National Environmental Management Act, 1998, when occurring in geographical areas of strategic importance. Pretoria: Government Printer. Government gazette no. 41445, notice no. R.114.
  • RSA. 2019. Notice of identification, in terms of Section 24(5) of the National Environmental Management Act, 1998, of a Generic Environmental Management Programme relevant to an application for substation and overhead electricity transmission and distribution infrastructure which requires environmental authorisation as identified in terms of Section 24(2) of the Act. Pretoria: Government Printer. Government gazette no. 42323, notice no. R.435.
  • Runhaar H, Van Laerhoven F, Driessen P, Arts J. 2013. Environmental assessment in The Netherlands: effectively governing environmental protection? A discourse analysis. Environ Impact Assess Rev. 39:13–25.
  • Sadler B 1996. Environmental assessment in a changing world: evaluating practice to improve performance. Final report of the international study of the effectiveness of environmental assessment. Canadian Environmental Assessment Agency. https://www.ceaa-acee.gc.ca/Content/2/B/7/2B7834CA-7D9A-410B-A4ED-FF78AB625BDB/iaia8_e.pdf. [Accessed 2015 Oct 28].
  • Sandham LA, Pretorius HM. 2008. A review of EIA report quality in the North-West Province of South Africa. Environ Impact Assess Rev. 28(4):229–240.
  • Silva A, Selig PM, Leripio ADA, Viegas C. 2014. Strategic environmental assessment: one concept, multiple definitions. Int J Innovation Sustainable Dev. 8(1):53–76.
  • Simpson J. 2001. Developing a review package to assess the quality of EA reports of local authority structure and local plans in the UK. Environ Impact Assess Rev. 38:83–95.
  • Stinchcombe K, Gibson RB. 2011. Strategic environmental assessment as a means of pursuing sustainability: ten advantages and ten challenges. J Environ Assess Policy Manage. 3(3):343–372.
  • Stoeglehner G, Brown AL, Kørnøv LB. 2009. SEA and planning: ‘ownership’ of strategic environmental assessment by the planners is the key to its effectiveness. Impact Assess Project Appraisal. 27(2):111–120.
  • Suzaul-Islam MD, Yanrong Z. 2016. Strategic environmental assessment and sustainable development: climate change perspective. J Earth Sci Clim Change. 7(12):1000379.
  • Tetlow MF, Hanusch M. 2012. Strategic environmental assessment: the state of the art. Impact Assess Project Appraisal. 30(1):15–24.
  • Thérivel R, Minas P. 2002. Ensuring effective sustainability appraisal. Impact Assess Project Appraisal. 20(2):81–91.
  • Thérivel R, Partidário MR. 1996. The practice of strategic environmental assessment. London: Earthscan Publications.
  • UNEP. 2004. Environmental impact assessment and strategic environmental assessment: towards an integrated approach. Geneva: UNEP.
  • Van Doren D, Driessen PPJ, Schijf B, Runhaar HAC. 2013. Evaluating the substantive effectiveness of SEA: towards a better understanding. Environ Impact Assess Rev. 38:120–130.
  • Veronez FA, Montaño M. 2015. EIA effectiveness: conceptual basis for an integrative approach. In: Proceedings of the 35th Annual Conference of the International Association for Impact Assessment; Florence, Italy; IAIA. http://conferences.iaia.org/2015/Final-Papers/. [accessed 2015 Oct 28].
  • World Bank. 2012. Strategic environmental assessment in the world bank: learning from recent experience and challenges. Washington D.C.: World Bank.
