
Exploring the strategic environmental assessment (SEA) process behaviour using sensitivity analysis

Pages 160-175 | Received 15 Jun 2015, Accepted 17 Dec 2015, Published online: 17 Feb 2016

Abstract

This paper reflects on the results from a study that set out to model the behaviour of the strategic environmental assessment (SEA) process as it delivers environmental integration. It applied sensitivity analysis to the English SEA system, viewing the SEA process as a complex interaction of individual elements, linked through inputs and outputs and underpinned by negative and positive feedback loops that self-regulate the achievement of environmental integration. The study revealed at least five behavioural characteristics at the level of individual elements based on their relative dominance and consequent influence in process activities. At the systemic process level, at least three key behavioural tendencies were identified. Overall, the process was poor in self-regulation, hampered by a lack of adequate feedback mechanisms. No single element or group of elements was critical in shaping process response and self-regulation towards delivering a consistent level of environmental integration.

Introduction

This paper reflects on the dynamics of the strategic environmental assessment (SEA) process by trying to establish its key behavioural characteristics. It applies sensitivity analysis to model and simulate the SEA process under various conditions. This views the SEA process as composed of a self-regulating complex interaction of elements via inputs and outputs underpinned by feedback loops that determine environmental integration. This self-regulatory perspective is informed by one of the main understandings of the term ‘systematic SEA process’, which is acknowledged in the SEA literature but remains empirically unexplored. The scientific importance and relevance of this research stems from the fact that there has been no investigation of this subject to date. Insight from such studies could increase our understanding of the SEA process, offering a more objective basis from which to intervene in it and, ostensibly, improve the delivered process output(s).

The study addresses questions pertinent to the dynamics and behaviour of the process in terms of:

What key behavioural characteristics do SEA elements exhibit within the process?

How does the process respond to change?

What are the limits and constraints in the process behaviour?

The first question addresses behaviour at the level of constituent elements of the SEA process, whilst the next two address behaviour at the systems or holistic process level. This in no way assumes that the experience with SEA can be relatively uniform; or claims to perfectly describe how a specific SEA exercise will automatically ‘behave’ or develop. Rather, it suggests how the process may evolve over time, expressed in terms of response patterns of the constituent elements and output(s) of the process. This is a convenient approach, for lack of a better way to express the idea of process dynamics in SEA, which ‘smoothens over’ the complex and dynamic interactions within the otherwise unobservable SEA process. Such a study involving modelling is necessarily conducted under clearly defined terminologies, assumptions and process boundaries as will be subsequently explained, to allow for clear interpretation of the results. The terminologies used to describe process behaviour in this paper are taken from the sensitivity analysis software and effort will be made to translate them within an SEA context.

SEA has been described as a systematic process for integrating environmental considerations into policies, plans and programmes (PPPs) (Therivel & Partidario Citation1996; Fischer Citation2007) with a view to promoting sustainable development (Gibson Citation2006; Joao & McLauchlan Citation2011; Partidario Citation2015). Environmental integration has various interpretations and is among the most cited motivations for SEA (Eggenberger & Partidário Citation2000; CEC Citation2003; Sheate et al. Citation2003; Kessler & Abaza Citation2006; Jiliberto Citation2007). Because environmental integration is one of several products of SEA (Fischer Citation2007), there is a rising need for adequate organisation of process structures and functions to enhance its effective delivery, however defined. Yet, this necessarily depends on the degree to which the SEA process is reliably understood. For example, what are its dynamics and how do changes in internal and external factors influence process activities and the delivery of process output(s)?

SEA process dynamicsFootnote1 is herein pursued in terms of changes within the process; i.e. exhibited behavioural characteristics and tendencies of process elements, including the process output of environmental integration. A more technical definition of process dynamics refers to a set of relationships among two or more measurable quantities in which a fixed rule describes how the quantities evolve over time in response to their own values (Katok & Hasselblatt Citation1995; Strogatz Citation2001). In this study, instead of measurable quantities, descriptive relationships between SEA elements, based on mathematical functions in the sensitivity analysis software, are applied.

Scholars have stated that SEA delivers its objectives via various rationales, ranging from positivist and rationalist (Bina Citation2008; Elling Citation2009; Tetlow & Hanusch Citation2012; Meuleman Citation2015) to postmodernist approaches (Stoeglehner Citation2010; Bina et al. Citation2011). Various mechanisms have been described across the spectrum, from analytical perspectives holding that SEA helps to establish strategic objectives, evaluate PPP alternatives and enhance PPP implementation (Chaker et al. Citation2006; Bidstrup & Hansen Citation2014; Geneletti Citation2014); to more participatory, consultative and deliberative mechanisms of enhanced public and stakeholder involvement (Bonifazi et al. Citation2011; Gauthier et al. Citation2011; Jiliberto Citation2011; Rega & Baldizzone Citation2015). Other postmodern approaches see the SEA process as a forum for the identification, analysis and consideration of various issues, perspectives and types of knowledge (Tetlow & Hanusch Citation2012; Partidario & Sheate Citation2013; Lobos & Partidario Citation2014); and as a forum of communication, learning and negotiation among its participants (Jha-Thakur et al. Citation2009; Stoeglehner Citation2010; Illsley et al. Citation2014). It has also been argued that such distinctions are unnecessary (Bina et al. Citation2011; Gao et al. Citation2013) and that multiple rationales and mechanisms are at play (Richardson Citation2005; Cashmore et al. Citation2010).

Despite the discourse on SEA delivering mechanisms, empirical insight into the process dynamics remains scant. This is a significant gap in knowledge given that the SEA process is defined as intrinsic to the delivery mechanisms for environmental assessment purposes (Jackson & Illsley Citation2007; White & Noble Citation2013). A recent compendium by Da Silva et al. (Citation2014) listing over 106 definitions of SEA supports the conclusion that the ‘systematic process’ is at the core of SEA’s prescriptive characteristic, conceptual and purposeful nature. Fischer (Citation2002) and Bojo et al. (Citation2004), looking at the transport sector and poverty reduction strategies, respectively, demonstrated a positive and significant correlation between SEA proceduresFootnote2 and achieved environmental integration. Latterly, Bonifazi et al. (Citation2011) and Jiliberto (Citation2011) explain how the procedural quality of the SEA process is critical to SEA effectiveness. This emphasises that the process is itself a significant aspect of current SEA theory and practice, hence worthy of further empirical attention, as herein attempted.

The paper is structured as follows. Following the introduction, the research context is provided covering an overview of the theoretical and methodical fundamentals used in the study. This expounds on the concept of an SEA process that is both systematic and complex, explaining why establishing the behaviour of such a process can be challenging. The application of sensitivity analysis method is presented followed by the results. Finally, the concluding section draws on the findings, lessons learnt and recommendations for further research.

Research context and rationale

Whilst SEA has to date received increased attention in its theory and practice (see Bina et al. Citation2011; Fischer & Onyango Citation2012; Pope et al. Citation2013; Caschili et al. Citation2014; Lobos & Partidario Citation2014), the oft-repeated definitional claim that SEA is a systematic process (CEC Citation2003; ODPM Citation2005) remains relatively unresearched. This is exacerbated by observations that theory in SEA is ahead of practice (Kagston & Richardson Citation2015), highlighting a mismatch between assumptions and reality in SEA (Axelsson et al. Citation2012; Lobos & Partidario Citation2014). Some scholars have argued that a lack of solid SEA theories and scientific rigour stems from the weak theoretical foundations of EIA out of which SEA emerged (Bartlett & Kurian Citation1999; Lawrence Citation2003). This is complicated by the multifaceted application of professional contexts, disciplinary roots and theoretical assumptions borrowed from other fields of studies (Lawrence Citation2003; Noble Citation2009).

The limited use of post hoc assessments (Cashmore Citation2004) and over-reliance on qualitative research in SEA have also been identified as presenting a major hindrance to SEA theory development (Noble et al. Citation2012; Geneletti Citation2014). For example, Thérivel and Wood (Citation2005) and Noble (Citation2003) warned that expert judgement commonly applied in SEA can be prone to bias and often be unsatisfactory in explaining cause-effect. The corollary is that if cause and effect pathways in SEA (Perdicoúlis et al. Citation2007) are better established, then SEA application can be better configured to more effectively deliver its objectives (Doyle & Sadler Citation1996; Cashmore Citation2004). This is aptly summarised in Storey and Noble (Citation2005), who state that empirical results are needed in order to reconsider and gauge the validity of initial theory claims, thus contributing to establishing which SEA theories require further building. This is pertinent to this study as the definitional claim of SEA being a systematic process will be tested empirically. This echoes Kornov’s (Citation2015) call for more empirical studies aimed at evaluating assumptions within SEA in order to address the perceived lack of theory use and development.

SEA as a systematic process

The term ‘process’ can be understood following Hall and Fagan (Citation1956) as a series of ongoing collection of activities, often part of a system, with the energy to transform inputs to outputs, directed towards a specific aim. Therivel and Walsh (Citation2006) and Noble (Citation2009) described an SEA ‘system’ as the entire set of SEA procedures, including the contextual and substantive requirements of the regulatory frameworks within which SEA is applied. A review of SEA literature unearthed four main understandings of the term ‘systematic SEA process’. The most common understanding reflects a predictable, organised and procedural process, with defined stages and tasks that should not be changed arbitrarily (Sadler Citation1996; Fischer Citation2007). This depicts a methodical, progressive and planned process (CEC Citation2003; ODPM Citation2005; Bina Citation2008) applied following a flexible and purpose-led approach, adapted to different contexts in response to different needs (Therivel Citation2004; Fischer & Gazzola Citation2006). Each step is defined by the existence of a set of procedural, substantive and contextual elements that must be in place, even if broadly, for SEA effectiveness to be achieved (Brown & Thérivel Citation2000; Fischer & Gazzola Citation2006).

A second understanding refers to SEA’s scope of application, i.e. a system-wide oriented approach taking into account a range of effects and impacts (Therivel Citation2004; Fischer Citation2007). The spectrum of inter-related issues includes spaces, activities, effects and aspirations over time, creating a collectivity that can be referred to as a system. A third understanding refers to the notion of belonging to systems or ‘interconnected’ mechanisms for SEA’s delivery of environmental integration (Kessler & Abaza Citation2006). The delivery mechanisms are considered ‘systemic’ and/or ‘systematic’ because they are distinctively methodical and largely operationalised through a formalised yet specific concept or instrument of intervention. Eggenberger and Partidário (Citation2000) categorised these delivery mechanisms as substantive, methodological, procedural, institutional, or policy-based.

The fourth understanding refers to a functional nature characterised by dynamic interactions among SEA elements. Therivel and Partidario (Citation1996, p. 187) emphasised the importance of these functional interactions, stating that SEA ‘should be based on a systematic methodology, possibly linking objectives, indicators, baseline analysis, impact predictions, mitigation and monitoring’. These interactions have been described as comprising iterative loops of positive and negative feedbackFootnote3 mechanisms (Noble Citation2000; Therivel Citation2004). For example, the scoping stage can be revisited and refined until (possibly) improved conditions are met (Jiliberto Citation2004). Also, a competent authority that administers SEA can iteratively review a draft SEA report. The results then feed back into, and lead to (re)adjustments of, the SEA process, further influencing its procedures, output(s) and scope (Therivel Citation2004; Fischer Citation2007). Figure 1 illustrates how such feedback, within iterative mechanisms, can function among SEA elements (in this case procedures) within an SEA process.

Figure 1. The iterative interconnectedness among SEA elements as SEA integrates with PPP-making (dotted arrows). Public participation and consultation is possible throughout the SEA process. The arrows allow for any SEA element to be repeated or its results used to update an earlier element.


This paper henceforth applies itself only to the fourth understanding of SEA systematic process, frequently acknowledged in the literature, albeit unexplored empirically. However, the four understandings are neither categorical nor exhaustive, but complementary, as the procedures, mechanisms, application contexts and/or delivery systems are inextricably linked to functionality (see Fischer Citation2007, p. 123) within what Hilding-Rydevik and Bjarnadóttir (Citation2007) called ‘blackboxes’ of complex processes. More generally, Mulnix (Citation2012) observes that systematic processes are associated with critical thinking, involving actively and skilfully conceptualising, applying, analysing, synthesising and evaluating information to reach an answer. The application of a systematic process is therefore regarded as a means of management aimed at reducing the number and severity of errors and failures due to either human or technological functions in a process (Mulnix Citation2012).

SEA as a complex process

The literature also acknowledges that the SEA process is complex (Deelstra et al. Citation2003; Fischer Citation2007; Retief et al. Citation2013; Lobos & Partidario Citation2014), with a high degree of uncertainty in its dynamics and outputs (Cherp et al. Citation2007; Van Buuren & Nooteboom Citation2009). One source of complexity is the decision-making arena which SEA is primarily tasked to influence (Jiliberto Citation2002). Scholars like Kørnøv and Thissen (Citation2000) and Nilsson and Dalkmann (Citation2001) discuss how making strategic decisions is complicated, fluid and hostage to various factors which are not easy to fully observe, identify or manage. Another source of complexity, overlapping with the first, comes from what Hilding-Rydevik and Bjarnadóttir (Citation2007) described as a ‘blackbox’ of unobservable and non-linear interactions within the SEA procedures. Whilst generic SEA procedures can be presented in a sequence, the actual implementation is often non-linear, iterative and subject to change as the process unfolds (Lobos & Partidario Citation2014). The SEA process is neither closed nor insular, but open to influence from both internal and external factors (Retief et al. Citation2013; Lobos & Partidario Citation2014). Furthermore, the influences, values, priorities and composition of actors and stakeholders can change unpredictably along the process (Deelstra et al. Citation2003; Shepard Citation2005; Dietz & Stern Citation2008).

Generally, complex processes are by definition described as often behaving differently from the sum of their parts, exhibiting properties that cannot be inferred from analysing their constituent parts in isolationFootnote4 (Vester Citation2007). Consequently, simple causal associations are misplaced, and it is hard to see how things work by looking only at the different elements of the process. Whilst scholars like Dorner (Citation1996) have argued that our understanding of complex processes is often plagued by errors of logic, Chaker et al. (Citation2006) and Retief (Citation2007) highlighted epistemic challenges relating to methods. Vester (Citation2007) further advances the argument that humans tend to approach issues from non-systemic and linear perspectives, introducing errors from separate analysis of system components, failure to account for feedback effects, and short planning horizons. Potential parallels in SEA can be identified as follows:

False description of goals, e.g. wrong identification of the SEA elements for tweaking in order to improve the delivery of environmental integration (see Onyango Citation2015).

Uni-dimensional analysis of situations, e.g. approaching SEA effectiveness from single perspectives or individual procedures (screening, scoping or public participation, etc.).

Initial perspectives or original assumptions and claims in SEA are held as true years later without confirming evidence.

Neglected side-effects (reality picking), by focusing on specific singular aspects of interest. For example, by selecting certain SEA procedures and outputs only, sight of other potentially significant elements is lost, including the long-term behaviour of the holistic SEA system.

Aspects of an SEA element or of the SEA process overdevelop in one direction in the absence of robust feedback to trigger self-regulation towards a predetermined level or state. For example, public participation in an SEA exercise deteriorates without any effective remedial action being triggered within the process.

Therefore, within a systematic and complex SEA process, to what extent can its behavioural characteristics be reliably established? In other words, can one reliably ‘smooth over’ the enormous amount of detail needed to account for the complex and unobservable interactions within the SEA process into a manageable set of clearly defined behavioural aspects? To start answering this question, a brief introduction to process behaviour is provided.

Process behaviour

In reality, processes are rarely performed as perfectly as envisaged in their idealised conceptual foundations (Shewhart Citation1980), and SEA is no exception. A process is a dynamic system whose behaviour changes over time, with control systems needed to handle such changes (Seborg et al. Citation2004). In SEA, the controls can be construed to occur within the iterative SEA procedures through which the process operates. It is therefore important to understand process dynamics if one is to design control mechanisms for steering it towards desired process outputs. The idea of examining a process in terms of its structure, behaviour and performance is not new, following a long tradition in the engineering and industrial sectors (Von Bertalanffy Citation1968; Rehani Citation2002; Weigt & Seidel Citation2004). Empirically exploring behaviour exhibited by business, industrial and management processes (see Kotter Citation1999; Kotter & Cohen Citation2002), including sources of variation within them, is a well-established practice (Shewhart Citation1980; Deming Citation1986). The related disciplines of systems theory and control theory, initiated in the late 1970s by J.C. Willems, have today matured, with the sole purpose of explaining how processes behave or can be understood and controlled. In this context, Willems (Citation1997) and Marr (Citation2009) construed behaviour as a set of all signals compatible with the system, emphasising that the science of process behaviour must focus on the observable relationship of behaviour to the environment, without resorting to ‘hypothetical constructs’ from social science.

Generally, ‘behaviour’ can be understood as the range of actions and mannerisms made by individuals, organisms, systems, or artificial entities in conjunction with themselves or their environment. In reference to SEA, a more appropriate definition is here borrowed from Levitis et al. (Citation2009): behaviour is the response of a system or organism to various stimuli or inputs, whether internal or external. A common approach to investigating process behaviour is through modelling and/or simulation, which produces scenarios of alternative process developments or projections, for diagnosis, prognosis and management decisions (Vester Citation2007). For example, game theory uses mathematical models of conflict and cooperation between rational decision-makers and is applied to a wide range of behavioural relations in humans, animals and computers (see Hart & Mas-Colell Citation1997; Aumann & Hart Citation2003; Camerer Citation2003). An example, from natural systems, shows how behaviour of a river flow can be modelled within the field of hydrology (see Bahremand & De Smedt Citation2008). Such a model is one that is reliably consistent with observed natural processes and simulates well the observed river discharge, useful in ‘predicting’ the watercourse under various conditions of rainfall.

Whilst not perfectly representative, modelling of processes is compelling because it is cost-effective, faster than experimenting, benign and can allow for testing under various conditions and with alternative interventions (Smith Citation1999). Although the simplicity of modelling can be deceptive as it is based on the concept of mathematical equations and data analysis (Smith Citation1999), the use of mathematical models to capture complex unobservable relationships has been shown to be very effective in a number of diverse situations (Turban & Meredith Citation1985; Vester Citation2007).

Sensitivity analysis

In this paper, sensitivity analysis was purposively applied to the English SEA process, on which a significant literature has been published (Curran et al. Citation1998; Smith & Sheate Citation2001; Short et al. Citation2004; Therivel & Walsh Citation2006; Smith et al. Citation2010; Fischer et al. Citation2015). Sensitivity analysis studies how a model output varies with its inputs, revealing the interactions and behavioural roles of elements within the process (Bahremand & De Smedt Citation2008). It can help determine the interactions between elements that contribute most to output variability, the input factors for which the model variation is maximum or optimal, and instability regions useful in calibrating a process (Malik Management Zentrum Citation2014). The analysis is underpinned by mathematical models defined by a series of equations, input factors and variables (elements) aimed at characterising the process being investigated (Vester Citation2007). The relationship between the elements is established based on table equations and fuzzy logic, with effects between any two elements being simulated one equation at a time.
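As a minimal sketch of the general idea, a one-at-a-time probe perturbs each input in turn and records the change in model output. The Model itself uses table equations and fuzzy logic rather than this simple numeric scheme; the function names and the toy two-element ‘process’ below are invented for illustration only.

```python
def one_at_a_time_sensitivity(model, baseline, delta=0.1):
    """Vary each input in turn by `delta` and record the resulting change
    in the model output, normalised by `delta`. A generic illustration of
    sensitivity analysis, not the Vester Model's actual computation."""
    base_out = model(baseline)
    sensitivities = {}
    for name, value in baseline.items():
        perturbed = dict(baseline)
        perturbed[name] = value + delta  # perturb one input only
        sensitivities[name] = (model(perturbed) - base_out) / delta
    return sensitivities

# hypothetical toy 'process' whose output depends unevenly on two elements
toy = lambda x: 3.0 * x["scoping"] + 0.5 * x["monitoring"]
s = one_at_a_time_sensitivity(toy, {"scoping": 1.0, "monitoring": 1.0})
# s["scoping"] is much larger than s["monitoring"]: the output is far more
# sensitive to the first element than to the second
```

In this toy case the probe simply recovers the coefficients, but the same loop applied to an opaque model reveals which inputs dominate the output variability.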

Sensitivity analysis is structured by cybernetics (Rehani Citation2002), a field of study where self-regulation is a defining characteristic of the process (Ashby Citation1970); with changes in the system causing the system to alter its behaviour and adapt to new conditions via information feedback loops (Figure 2). This self-regulation is herein compared to iterative procedures in SEA, within which good practice and quality can be mediated and assured, thus serving to steer the process activities and its output.

Figure 2. Self-regulation based on feedback mechanism: an output is compared to a pre-stated standard/endpoint; and if different, then a corrective or negative feedback (dotted lines) is sent back to the input element, which then affects the next input and consequent output.

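The negative-feedback mechanism of Figure 2 can be sketched as a minimal loop in which the output is compared to a pre-stated target and a proportional correction feeds back into the next input. The function, gain and target values below are illustrative assumptions, not taken from the Model.

```python
def self_regulate(initial_output, target, gain=0.5, iterations=10):
    """Minimal negative-feedback loop: compare each output to a pre-stated
    target and feed a corrective signal back into the next input.
    All names and parameter values are hypothetical."""
    output = initial_output
    history = [output]
    for _ in range(iterations):
        error = target - output          # deviation from the standard/endpoint
        output = output + gain * error   # corrective (negative) feedback
        history.append(output)
    return history

trajectory = self_regulate(initial_output=0.2, target=1.0)
# each iteration halves the remaining deviation, so the output steadily
# converges towards the target: this is the self-regulation of Figure 2
```

Without the feedback term (gain of zero), the output never moves towards the target, which mirrors the paper's later finding that inadequate feedback mechanisms leave the process poor in self-regulation.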

Although various modelling software packages such as Vensim and STELLA exist, the Sensitivity Model Prof. Vester® (Vester Citation2007) (hereinafter the Model), further developed as the Malik Sensitivity Model® Prof. Vester (Malik Management Zentrum Citation2014), was selected because it offers the following advantages:

It is user-friendly and allows trans-disciplinary groups of experts to build a common language, as opposed to the prevalent jargon of specific areas of expertise;

It allows repeat simulations of the same partial scenarios under different conditions and comparison of the different results;

Its table functions allow non-linear relations and the application of fuzzy logic that recognises limits and threshold values as may occur in reality;

The model’s application of circular causal logic, i.e. feedback thinking, appropriately fits the rationale underlying this research.

The Model’s validity is well established as it has been successfully applied for over 35 years in the fields of management and technical consulting, business strategies, mediation, risk management, traffic planning, town and regional planning, scientific research and education (Malik Management Zentrum Citation2014). The Model has nine key steps (Vester Citation2007; Malik Management Zentrum Citation2014), which are recursive in nature, with each new step updating the previous one. The nine steps reflect three stages of a sensitivity analysis, i.e. cybernetic systems description (data collection and aggregation, steps 1–3), cybernetic systems interpretation (understanding the network, e.g. in terms of the influence table, steps 4–6) and cybernetic systems evaluation (understanding the need, consequences, and risks of interventions, steps 7–9) (see Figure 3).

Figure 3. Starting at System description, a recursive interaction and description of the system occurs, with positive (full) and negative (dotted) feedback arrows indicating direction of flow of effect within the system (adapted from Vester Citation2007).


Following the system description stage, simulation of various scenarios is done. The simulation results of the Model are an IF-THEN type of policy scenario exploration (Vester Citation2007). This means that if the prescribed conditions and variables hold, then the following is likely to develop. The Model was applied based on a number of assumptions. First, the goal of the English SEA system is to promote sustainability through the SEA process output of environmental integration into PPPs (ODPM Citation2005). Second, environmental integration refers to the consideration and incorporation of environmental values and targets into decision-making processes in a manner that explicitly recognises and adequately addresses environmental concerns (Sheate et al. Citation2003).

Data collection

Three sources of data were used to model the English SEA system:

Literature review which identified the key elements to describe the English SEA process;

Questionnaire survey sent to 192 IAIA members based in England, asking them to complete a pair-wise matrix establishing the relationships between key SEA elements;

Two three-hour workshop sessions to input descriptive details and criteria about the SEA elements into the Model. This involved a total of 13 participants in the researcher’s faculty, who were conversant with the Model and helped define the SEA elements according to 24 descriptors predetermined in the Model.

Each element in Table 1 was comprehensively described and its relationships with other elements registered using a pair-wise matrix and a further set of 24 descriptors (see Vester Citation2007). In step 8, partial scenarios similar to IF-THEN policy tests were built and then simulated in step 9. These last two steps are subsequently explained in more detail, as they explore key aspects of process behaviour.

Table 1. Generic SEA procedural and contextual elements used to describe the England SEA system.

Scenario simulations

Scenario simulation is the imitation of the performance of a specifically defined real-world process or system, after first developing a model that represents its key characteristics (Wiener Citation1961; Forrester Citation2007; Vester Citation2007). Simulations reveal trends in systemic behaviour by exploring system-specific interplays, stabilising tendencies, limits and irreversibilities (Jamshidian & Zhu Citation1997). They also help identify possible reactions to specific (policy) interventions. In this study, 15 scenarios were simulated, exploring the effect of altering the relative performance of five selected SEA elements (Scoping, Political Will, Impact Assessment, Decision-making and Reporting, and Monitoring and Evaluation). These elements are commonly presented in the literature as being crucial for achieving environmental integration (see for example Brown & Thérivel Citation2000; Partidario Citation2000). The partial scenarios were composed of 7–11 elements. The parameter settings for SEA elements ranged from Very Low to Optimum, including intermediate stages (Table 2). The parameters were put into categories by the author, to represent the contribution that an element makes to the process events and output at each iteration of the process.

Table 2. Author’s classification of parameter ranges for the elements relative to ‘good practice’ literature (see e.g. IAIA Citation2002; ODPM Citation2005). They were useful in describing settings for elements and for ease of discussion.

Here is how it works. The product of each SEA element, e.g. screening or scoping, is considered an output of that element and becomes an input into the next element(s) that uses it, e.g. impact assessment or public participation, until the final process output of environmental integration is delivered. One period or iteration of the SEA process means that it is run from beginning to end. A second period or iteration means a repeat of the run, and so on, up to the fifteenth and maximum iteration allowed by the Model.

Simulations were run, with some graphs collapsing (reaching zero) by the 3rd iteration, whilst others ran to the 15th iteration. Some simulations were stopped as soon as a steady state or an untenable result was reached. For example, when all variables fell to ‘Very low’, there was no need to continue the simulation even if the curves might later have picked up and risen to optimum levels; it was assumed that in reality such a trajectory would not ordinarily be knowingly pursued. The simulations were exploratory in nature, focusing on key elements of interest identified from the results of the cybernetic evaluation. The element Monitoring and Evaluation was lagged by one loop to show that its effects are felt one cycle later.
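The iteration mechanics described above can be sketched as follows, assuming each element simply scales the running output that seeds the next period. The update rule, element names, factor values and collapse threshold are all hypothetical stand-ins, not the Model's table equations.

```python
def simulate_iterations(factors, periods=15, lag_element=None):
    """One period runs the process from start to finish: each element
    scales the running output, which then seeds the next period. A lagged
    element (standing in for Monitoring and Evaluation) applies the factor
    it held one period earlier, so its effect is felt one cycle later.
    A collapse to (near) zero stops the run early, as in the study."""
    output = 1.0
    prev_lag_factor = 1.0  # the lagged element has no effect in period 1
    history = []
    for _ in range(periods):
        for name, factor in factors.items():
            if name == lag_element:
                output *= prev_lag_factor
                prev_lag_factor = factor
            else:
                output *= factor
        history.append(output)
        if output <= 1e-6:  # the graph 'collapses' (reaches zero)
            break
    return history

# well-performing elements sustain the output across all 15 iterations
steady = simulate_iterations({"scoping": 1.0, "monitoring": 1.0},
                             lag_element="monitoring")
# poorly performing elements make the output collapse within a few iterations
collapse = simulate_iterations({"scoping": 0.01, "monitoring": 0.01},
                               lag_element="monitoring")
```

The early-exit condition mirrors the study's practice of stopping a simulation once an untenable result is reached rather than running all 15 periods.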

Analytical framework

The Model generated effects data of 'Active Sum' (AS) and 'Passive Sum' (PS) and their derivative P (product = AS × PS) and Q (quotient = AS/PS) values, revealing the behavioural response characteristics of each element in the SEA process. An AS value indicates how strongly an element affects the rest of the system: a small change in an element with a high AS value leads to big changes in the system, and vice versa. A PS value indicates how susceptible an element is to changes in the system and how it reacts to them: an element with a high PS value reacts very markedly as soon as something happens in the system, and vice versa. Q values reflect an element's 'active' or 'reactive' character, regardless of its strength as shown by the AS and PS data. A bigger P value means that the element plays a more 'critical' role within the system, whilst an element with a smaller P value plays a lesser, 'buffering' role (Vester Citation2007). The graphical results of simulations were synthesised and inductively analysed to tease out key patterns and tendencies of process behaviour under various conditions.
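As a hedged illustration of how such effect metrics can be derived, the following sketch computes AS, PS, P and Q from a cross-impact matrix in the general style of Vester's sensitivity model. The matrix entries and element names are invented for demonstration and are not the study's actual data.

```python
import numpy as np

# Hypothetical cross-impact matrix: entry [i][j] scores how strongly element i
# influences element j (0 = none ... 3 = strong); all values are illustrative.
names = ["scoping", "political_will", "impact_assessment", "monitoring"]
M = np.array([
    [0, 1, 3, 1],
    [2, 0, 2, 2],
    [1, 0, 0, 3],
    [2, 1, 1, 0],
])

AS = M.sum(axis=1)   # Active Sum: total influence an element exerts on others
PS = M.sum(axis=0)   # Passive Sum: total influence an element receives
P = AS * PS          # product: a large P suggests a 'critical' role
Q = AS / PS          # quotient: Q > 1 'active' character, Q < 1 'reactive'

for name, a, p, prod, quot in zip(names, AS, PS, P, Q):
    print(f"{name:18s} AS={a} PS={p} P={prod} Q={quot:.2f}")
```

In this toy matrix, for example, `political_will` has the highest Q (strongly active) while `scoping` has the highest P (most strongly involved in system events).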

Results

Five main behavioural characteristics were identified at the level of individual elements, and three at system level. These are discussed below under themes corresponding to the three research questions set out at the start of the paper.

Behavioural characteristics of elements

From the cybernetic data (Figure (a) and (b)), five categories of behavioural characteristics exhibited by the elements in terms of their roles within the process are identified. The categories are predefined in the Model, and are a result of calculations in the software, based on the series of descriptions and matrices that had been entered. Although some elements exhibited more than one characteristic, the predominant ones are summarised below.

A Slightly critical role was exhibited by 27.7% of the elements, having a relatively low potential to directly influence process events towards environmental integration. This behaviour represents a risk factor through which the process performance could be significantly enhanced or suppressed.

An Active role was exhibited by 33.3% of the elements, showing a tendency to directly influence other elements and activities within the process, but without necessarily influencing the process output in any significant way.

A Reactive role was exhibited by 22% of the elements, which tended to be influenced by other elements rather than influencing them, behaving as 'measuring sensors' that indicate what is happening in the system.

A Background role was exhibited by 22% of the elements, with a stronger tendency to influence other elements whilst being much less influenced in return.

A Buffering role was exhibited by 16.7% of the elements, having the tendency to be mainly acted upon by other elements, and potentially could be useful to stabilise the system.
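The mapping from P and Q values to role labels such as these can be illustrated with a simple threshold classifier. The thresholds below are assumptions for demonstration only; the Model's actual cut-offs (which also distinguish finer grades such as 'highly critical' and 'slightly critical') are internal to the software.

```python
# Illustrative role classifier; the threshold values are assumed, not the
# Model's own, and the Model distinguishes finer grades of 'critical'.
def classify(P, Q, p_hi=30, p_lo=10, q_hi=1.5, q_lo=0.67):
    """Assign a behavioural role from P (= AS x PS) and Q (= AS / PS) values."""
    if P >= p_hi:
        return "critical"      # strongly involved in system events
    if P <= p_lo:
        return "buffering"     # weakly involved; potentially stabilising
    if Q >= q_hi:
        return "active"        # influences more than it is influenced
    if Q <= q_lo:
        return "reactive"      # influenced more than it influences
    return "background"

print(classify(P=20, Q=1.0))   # a middling element falls into 'background'
```

The point of the sketch is that role assignment is a deterministic function of the effect metrics, which is why the categories can be "predefined in the Model" and produced automatically from the entered matrices.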

Figure 4. (a) Behavioural characteristics of elements based on cybernetic data from the Effect Metrics, indicating element’s capacity to influence process events and output. P value reflects how critical or strongly an element plays a role in the system, i.e. its strength of involvement in system events. Q value reflects the active or reactive character of an element. Please read ‘Aktive’ as ‘Active’. (b) Influence Index indicating the relative extent an element has influence in process events and consequent contribution to process output, based on PS and AS values. They show which elements may be control levers or jeopardise the process. Please read ‘Aktive’ as ‘Active’.

Overall, no element was deemed to behave in a manner 'highly critical' or 'critical' to influencing process activities and the delivery of environmental integration. This is despite the SEA literature highlighting several elements as key to effective environmental integration, e.g. mitigation, decision-making and review (Brown & Thérivel Citation2000; Partidario Citation2000), scoping (Mulvihill Citation2003) and political will (Elling Citation2008). This is a significant finding about the nature of the SEA process, especially for those aiming to enhance its effectiveness: it lacks 'critical' elements whose roles can be directly leveraged to intervene in and influence process activities and output.

Process response behaviour

The holistic or systemic process behaviour was discerned by looking at the graphical patterns from the scenario simulations. An overall and significant finding was that the English SEA process was rather stable, exhibiting inertia and tending not to respond robustly to change. This can be explained in three ways. Firstly, the process structure did not have adequate or effective feedback loops for self-regulation. Analysis of the feedback loops indicated that the degree of interconnectedness among system elements was notably low, as explained in more detail in Onyango (Citation2015). This means that the 'iterative' flow of information within the process was constrained, decreasing the possibility for self-regulation whenever needed. Secondly, the similar totals of negative and positive feedback loops revealed that the system depended more on factors external to the process, such as an influential interest group, than on internally controlled factors, which are necessary in order to make any changes. This is further evidence of poor self-regulation, as an aspect of behaviour, at the system level: a viable cybernetic system requires more negative than positive feedback loops to facilitate robust self-regulation (Vester Citation2007). Thirdly, the process did not have adequate points of quality analysis that could facilitate evaluation and consequently trigger self-regulatory mechanisms. Such a mechanism is discussed at length in Caratti et al.'s (Citation2004) ANSEA (analytical SEA) concept.
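The loop-count diagnostic (more negative than positive loops for robust self-regulation) can be sketched computationally: enumerate the simple cycles of a signed influence graph and classify each loop by the product of its edge signs. The graph below is a toy example; its elements, links and signs are assumptions for illustration, not the study's actual network.

```python
# Toy signed influence graph; the elements, links and signs are assumptions
# for illustration only (+1 = reinforcing influence, -1 = dampening influence).
edges = {
    ("scoping", "impact_assessment"): +1,
    ("impact_assessment", "reporting"): +1,
    ("reporting", "monitoring"): +1,
    ("monitoring", "scoping"): -1,   # corrective feedback into scoping
    ("political_will", "scoping"): +1,
    ("political_will", "reporting"): +1,
    ("reporting", "political_will"): +1,
}

def find_cycles(edges):
    """Enumerate simple cycles by depth-first search, one per canonical start node."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)

    cycles = []

    def dfs(start, node, path):
        for nxt in graph.get(node, []):
            if nxt == start:
                cycles.append(path[:])
            elif nxt not in path and nxt > start:  # count each cycle once
                dfs(start, nxt, path + [nxt])

    for start in sorted(graph):
        dfs(start, start, [start])
    return cycles

def loop_sign(cycle, edges):
    """Product of edge signs: -1 = negative (balancing), +1 = positive (reinforcing)."""
    sign = 1
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        sign *= edges[(a, b)]
    return sign

cycles = find_cycles(edges)
negative = sum(1 for c in cycles if loop_sign(c, edges) == -1)
positive = len(cycles) - negative
print(f"{negative} negative loop(s) vs {positive} positive loop(s)")
```

In this toy network the positive loops outnumber the negative one, the same symptom of weak self-regulation reported here for the English SEA system.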

Process limits and constraints

Results from scenario simulations were particularly useful in revealing that the response behaviour of the English SEA process exhibited certain limits under various conditions. Most significantly, as a result of its low capacity to self-regulate, in scenarios where most elements were either very low or at least sub-optimum, the achieved environmental integration was correspondingly either very low or very high (see Figure (a) and (b)). The process tended to over-develop in one direction without self-regulating, following a linear model in which self-regulation is poorly developed and/or weak. In practice, this implies that the English SEA process is poorly equipped to pick up deficiencies, inadequacies or anomalies, and to adjust via corrective actions. In terms of SEA theory, this calls into doubt the efficacy of the iterative character of the process, which is hypothesised to be fundamental to the function and performance of a self-regulatory SEA. Therefore, an SEA process that starts off with 'poor' elements will tend to deliver low environmental integration. By extension, even a properly designed SEA system that starts off with poor elements will tend not to self-correct and improve its outputs, unless intervention comes from outside the SEA process itself. This means the process is weak in assuring and determining its output(s), being more at the mercy of factors outwith the process, e.g. government or other special interests.

Figure 5. (a) When most elements were at least ‘sub-optimum’ the process over-developed i.e. all elements improved in the same direction. (b) When most elements were at least ‘very low’ the process behaved similarly to Figure (a) and overdeveloped in opposite direction without any self-correction.

In another example, the limited influence of a single 'active' element within the holistic process was clearly revealed. When Political will (together with two other elements) was very low (see Figure (a)), its influence was insignificant in the short term (1–3 iterations). However, by the 6th iteration, its effect on the process was overwhelming and almost all other elements gradually collapsed. In this case, the two other low elements helped the low Political will to exert significant influence. However, when only Political will was improved from low to almost optimum level (Figure (b)), its singular effect did not alter the process output of environmental integration from low to high. This is significant in that whilst an element like Political will exhibited an 'active' behavioural characteristic, its single-handed effect was limited under certain conditions.

Figure 6. (a) When Political will and another element are low, their combined influence overwhelmed almost all other elements in the long run. (b) Improving Political will from low to almost optimum, its singular effect was not enough to change environmental integration from low to high.

Also, in some scenarios where environmental integration and another element were low, and the remaining elements were at least sub-optimum, the process could compensate, with the functions underpinning the low element achieved elsewhere. For example, where Scoping was low, its functions were offset by Quality Review, Mitigation, Monitoring and Evaluation, and analysis of PPP Alternatives, which were above the sub-optimum setting in the model. However, such compensation took unrealistically long and was difficult to predict (Figure ).

Figure 7. When Environmental integration was very low and all other elements were at least medium, a very slow compensatory effect occurred.

This compensatory effect is a welcome characteristic for any self-regulating system (Vester Citation2007). It underpins equifinality, referring to alternative ways of attaining the same objectives (convergence); and multifinality, referring to attaining alternative objectives from the same inputs (divergence). These are key characteristics for a more specialised ‘complex adaptive system’ (see Holland Citation2006), allowing the SEA system to learn from experience and improve the process. Overall, the majority of SEA elements needed to be at least above a certain threshold in order for the systemic effects of the elements to significantly influence environmental integration. However, the exact identities of such elements and their appropriate settings are best identified on a case-by-case basis.

Discussion and conclusion

The results represent the generalised behaviour characteristics of the SEA process within a cybernetics perspective, representing ‘likely behavioural tendencies or patterns’ rather than precise predictions. Whereas most empirical work in SEA has been from a uni-dimensional and linear perspective looking at individual elements (see Mulvihill Citation2003; Elling Citation2008; Hanusch & Glasson Citation2008), the approach in this paper applies a holistic perspective of the SEA process. This emphasises interconnectivity instead of a series of disjointed procedural stages promoting non-systemic goals and strategies. In this sense, it is a pioneering empirical study aimed at providing researchers, regulators, policy-makers and administrators of SEA systems with a new ‘thinking aid’ to comprehensively understand the SEA process. It partly explains the process dynamics in terms of behaviour at both element and systemic levels. Furthermore, sensitivity analysis focuses on the long-term behaviour of the process, in contrast with SEA-users who often face short-term pressures. The cybernetic perspective can augment expert opinion approaches in understanding the holistic SEA process (Table ).

Table 3. Summary of findings showing behavioural characteristics and limits of the England SEA process.

Overall implications

The findings about the dynamics and behavioural characteristics shed empirical insight into the ‘systematic SEA process’, exploring the definitional assumptions mentioned at the start of the paper. Two key inferences are made. Firstly, based on the England SEA process, the claim that SEA is a functional process made of self-regulatory feedback loops, is not strongly supported by the evidence. Whilst some self-regulatory capacity exists, this was rather limited and only operated within narrow limits. Secondly, the system exhibited considerable inertia and did not robustly adjust to any changes or new conditions, contrary to what a robust ‘iterative SEA process’ is expected to do. In practice, the above two findings imply that more interventions outwith the process will be needed to assure SEA process outputs; and/or, the optimism in SEA process ‘robustness’ to self-regulate and assure its quality of process output should be lowered. A more worrisome implication is that its capacity to learn via double-feedback loops, as argued in Jha-Thakur et al. (Citation2009), Stoeglehner (Citation2010), Illsley et al. (Citation2014) and Sims (Citation2012) is more limited than previously thought.

Unpacking the behavioural characteristics of the systematic SEA process, albeit within the limited domain of a self-regulatory process, contributes to the body of knowledge that is part of SEA theory-building. This provides a supposition or a system of ideas, based on general principles, intended to explain how the holistic SEA process responds to change as it delivers environmental integration. Such insight can provide guidance for practical application. This is relevant to those who administer SEA systems, regulate them, and/or formulate relevant policies for SEA application. For example, by knowing the critical elements or the limits of responses within the process, those trying to intervene and improve the process have a more objective basis upon which to identify priority elements or levels of calibration needed to deliver certain levels of process output(s).

Overall, whilst it is acknowledged that the SEA process uses iterative procedures and feedback mechanisms to achieve its objectives (see Partidario Citation2000; Dalkman & Bongardt Citation2004; Therivel Citation2004, p. 77), insight from this study also raises some fundamental questions, perhaps better addressed by further debate and empirical studies. For example:

Will SEA’s future effectiveness require the process to be more aligned to and essentially be more driven by ‘iterative procedures and feedback mechanisms’? What are the merits and demerits for such an approach?

To what extent will the efficacy of SEA rely upon the effectiveness of the iterative feedback mechanisms?

Where in the current SEA process can more structure and function for self-regulation be nested and what form should they take?

Constraints

Some methodological constraints experienced in the study are worth mentioning. Firstly, the SEA system was described based on the input of a small sample of English SEA experts, rendering the results more indicative than definitive. Secondly, the accurate modelling of complex real systems and their interactions cannot be guaranteed, as total data capture remains impossible because such systems are in reality never closed. Thirdly, values and subjectivity likely permeated the model, as decisions had to be made about the choice of system elements and their interrelationships; what to put in, what to simplify and what to leave out. The solution is to undertake more sensitivity analysis and learn from our mistakes until we can produce more accurate models of SEA process behaviour. It is herein conceded that a cybernetic account of SEA process behaviour is not a complete description of SEA: it rests on an assumed description and translation of rather complex, non-discrete relationships into a series of mathematical equations. Perhaps such a study is over-optimistic, as the SEA process is valued for its analytical rationale at every stage (Montanez-Cartaxo Citation2014), as opposed to the notion of an output at the end of the systematic process. Finally, extrapolating any specific results from the English SEA system to the wider international SEA arena should be approached with caution. Rather, each SEA process should undergo its own sensitivity analysis in order to fully understand how its behaviour relates to its context.

Conclusion and recommendations

This study was driven by the oft-repeated definitional claim that the SEA process is systematic in nature, even though few empirical studies have explored this assumption. The study set out to explore the functional understanding of the term systematic SEA process: one in which the process is underpinned by self-regulatory mechanisms of negative and positive feedback loops (i.e. cybernetics). Sensitivity analysis was applied to the English SEA process to explore and reliably map the behavioural aspects of the SEA process. The findings revealed behavioural characteristics at two levels: (1) at the level of individual elements, based on their relative dominance and consequent influence in process activities; and (2) at the holistic, systemic level. The behavioural roles of SEA elements were differentiated according to the sensitivity analysis terms 'active', 'reactive', 'critical', 'background' and 'buffering' (Figure ). At the systemic process level, key behavioural tendencies were revealed as follows:

rather stable and exhibiting inertia to respond to changes;

starting conditions were a critical limiting factor, determining how the process developed, often with limited self-regulation;

compensatory behaviour was slow and only kicked in when a threshold had been reached, simultaneously, by a majority of SEA elements.

These revealed behavioural characteristics are an attempt to provide a systems thinking language for describing SEA process dynamics, without having to provide a theory for the detailed interactions. In other words, one can reliably 'smooth over' the enormous amount of detail needed to account for the complex and unobservable interactions within the SEA process, producing a manageable set of clearly defined behavioural roles. It also partly addresses the epistemic challenges mentioned by Chaker et al. (Citation2006) and Retief (Citation2007), whilst promoting a more systems thinking approach to the SEA process. As this was exploratory research, it is generally recommended that more sensitivity analysis studies be undertaken to further unpack the behavioural dynamics of the SEA process, with a view to establishing more valid models for diagnosing and simulating possibly more effective strategic options within the SEA process.

Notes

1. The Collins English Dictionary defines 'dynamic' as 'full of energy; relating to force that produces a change, motion; constant change, activity, or progress' (Citation2006, p. 263). This dynamic nature of the SEA process involves a wide range of actors, input of new information, and different views and interests, giving rise to varied levels of uncertainty and unpredictability in the process.

2. A process comprises several procedures.

3. Feedback is a mechanism, process or signal that is looped back to control a system within itself, thereby facilitating self-regulation (Vester Citation2007).

4. This has become a standard definition of a system or process that is complex (see Vester Citation2007).

References

  • Ashby R. 1970. An introduction to cybernetics. 5th ed. London: Chapman & Hall.
  • Aumann RJ, Hart S. 2003. Long cheap talk. Econometrica. 71:1619–1660.10.1111/ecta.2003.71.issue-6
  • Axelsson A, Annandale D, Cashmore M, Slunge D, Ekbom A, Loayza F, Verheem R. 2012. Policy SEA: lessons from development co-operation. Impact Assess Proj Apprais. 30:124–129.10.1080/14615517.2012.659993
  • Bahremand A, De Smedt F. 2008. Distributed hydrological modeling and sensitivity analysis in Torysa watershed, Slovakia. Water Resour Manage. 22:293–408.
  • Bartlett RV, Kurian PA. 1999. The theory of environmental impact assessment: implicit models of policy making. Policy Politics. 27:415–433.10.1332/030557399782218371
  • Bidstrup M, Hansen AM. 2014. The paradox of strategic environmental assessment. Environ Impact Assess Rev. 47:29–35.10.1016/j.eiar.2014.03.005
  • Bina O, Wallington T, Thissen W. 2011. SEA theory and research: an analysis of the discourse. In: Aschemann R, Jahn T, Partidario MR, Verheem R, editors. Handbook of strategic environmental assessment. Washington (DC): Earthscan; p. 445–471.
  • Bina O. 2008. Context and systems: thinking more broadly about effectiveness in strategic environmental assessment in China. Environ Manage. 42:717–733.10.1007/s00267-008-9123-5
  • Bojo J, Green K, Kishore S, Pilapitiya S, Reddy RC. 2004. Environment in poverty reduction strategies and poverty reduction support credits. Working paper No. 2. Washington (DC): World Bank.
  • Bonifazi A, Rega C, Gazzola P. 2011. Strategic environmental assessment and the democratisation of spatial planning. J Environ Planning Manage. 13:9–37.
  • Brown AL, Thérivel R. 2000. Principles to guide the development of strategic environmental assessment methodology. Impact Assess Proj Apprais. 18:183–189.10.3152/147154600781767385
  • Camerer CF. 2003. Behavioural game theory: experiments in strategic interaction. New York (NY): Russell Sage Foundation.
  • Caratti P, Dalkmann H, Jiliberto R. 2004. Analysing strategic environmental assessment. Cheltenham: Edward Elgar; p. 16–26.10.4337/9781845421533
  • Caschili S, De Montis A, Ganciu A, Ledda A, Barra M. 2014. The strategic environment assessment bibliographic network: a quantitative literature review analysis. Environ Impact Assess Rev. 47:14–28.10.1016/j.eiar.2014.03.003
  • Cashmore M, Richardson T, Hilding-Ryedvik T, Emmelin L. 2010. Evaluating the effectiveness of impact assessment instruments: theorising the nature and implications of their political constitution. Environ Impact Assess Rev. 30:371–379.10.1016/j.eiar.2010.01.004
  • Cashmore M. 2004. The role of science in environmental impact assessment: process and procedure versus purpose in the development of theory. Environ Impact Assess Rev. 24:403–426.10.1016/j.eiar.2003.12.002
  • [CEC] Commission of the European Communities. 2003. Directive 2003/35/EC of the European parliament and of the council of May 2003 providing for public participation in respect of the drawing up of certain plans and programmes relating to the environment and amending with regard to public participation and access to justice council directives 85/337/EEC and 96/61/EC. Official J Eur Union. L156:17–24.
  • Chaker A, El-Fadl K, Chamas L, Hatjian B. 2006. A review of strategic environmental assessment in 12 selected countries. Environ Impact Assess Rev. 26:15–56.10.1016/j.eiar.2004.09.010
  • Cherp A, Watt A, Vinichenko V. 2007. SEA and strategy formation theories: from three Ps to five Ps. Environ Impact Assess Rev. 27:624–644.10.1016/j.eiar.2007.05.008
  • Collins English dictionary. 2006. Glasgow: Harper Collins.
  • Curran JM, Wood C, Hilton M. 1998. Environmental appraisal of UK development plans: current practice and future directions. Environ Planning B Planning Des. 25:411–433.10.1068/b250411
  • Da Silva AWL, Selig PM, Lerípio A, Viegas CV. 2014. Strategic environmental assessment: one concept, multiple definitions. Int J Innov Sustainable Dev. 8:53–76.10.1504/IJISD.2014.059222
  • Dalkman H, Bongardt D. 2004. Case study – the German federal transport infrastructure plan (FTIP). In: Caratti P, Dalkmann H, Jiliberto R. Analysing strategic environmental assessment. Cheltenham: Edward Elgar; p. 123–154.
  • Deelstra Y, Nooteboom SG, Kohlmann HR, van den Berg J, Innanen S. 2003. Using knowledge for decision-making purposes in the context of large projects in The Netherlands. Environ Impact Assess Rev. 23:517–541.10.1016/S0195-9255(03)00070-2
  • Deming WE. 1986. Out of the crisis. Cambridge: MIT Press.
  • Dietz T, Stern PC. 2008. Public participation in environmental assessment and decision making. Washington (DC): National Academies Press.
  • Dorner D. 1996. The logic of failure: recognizing and avoiding error in complex situations. New York (NY): Metropolitan Books.
  • Doyle D, Sadler B. 1996. Environmental assessment in Canada: frameworks, procedures and attributes of effectiveness. Ottawa: Canadian Environmental Assessment Agency.
  • Eggenberger M, Partidário MR. 2000. Development of a framework to assist the integration of environmental, social and economic issues in spatial planning. Impact Assess Proj Apprais. 18:201–207.10.3152/147154600781767448
  • Elling B. 2008. Rationality and the environment: decision making in environmental politics and assessment. London: Earthscan.
  • Elling B. 2009. Rationality and effectiveness: does EIA/SEA treat them as synonyms? Impact Assess Proj Apprais. 27:121–131.10.3152/146155109X454294
  • Fischer TB, Gazzola P. 2006. SEA effectiveness criteria – equally valid in all countries? The case of Italy. Environ Impact Assess Rev. 26:396–409.10.1016/j.eiar.2005.11.006
  • Fischer TB, Onyango V. 2012. Strategic environmental assessment-related research projects and journal articles: an overview of the past 20 years. Impact Assess Proj Apprais. 30:253–263.10.1080/14615517.2012.740953
  • Fischer TB, Jha-Thakur U, Hayes S. 2015. Environmental impact assessment and strategic environmental assessment research in the UK. J Environ Assess Policy Manage. 17. doi:10.1142/S1464333215500167
  • Fischer TB. 2002. Strategic environmental assessment in transport and land use planning. London: Earthscan.
  • Fischer TB. 2007. Theory and practice of strategic environmental assessment – towards a more systematic approach. London: Earthscan.
  • Forrester JW. 2007. System dynamics: a personal view of the first fifty years. Syst Dyn Rev. 23:345–358.10.1002/(ISSN)1099-1727
  • Gao J, Kørnøv L, Christensen P. 2013. The politics of strategic environmental assessment indicators: weak recognition found in Chinese guidelines. Impact Assess Proj Apprais. 31:232–237.10.1080/14615517.2013.786925
  • Gauthier M, Simard L, Waaub JP. 2011. Public participation in strategic environmental assessment (SEA): critical review and the Quebec (Canada) approach. Environ Impact Assess Rev. 31:48–60.10.1016/j.eiar.2010.01.006
  • Geneletti D. 2014. Integration of impact assessment types improves consideration of alternatives. Impact Assess Proj Apprais. 32:17–18.10.1080/14615517.2013.872846
  • Gibson RB. 2006. Beyond the three pillars: sustainability assessment as a framework for effective integration of social, economic and ecological considerations in significant decision-making. J Environ Planning Manage. 8:259–280.
  • Hall AD, Fagan RE. 1956. Definition of system. Gen Syst. 1:18–28.
  • Hanusch M, Glasson J. 2008. Much ado about SEA/SA monitoring: the performance of English regional spatial strategies and some German comparisons. Environ Impact Assess Rev. 28:60–61.
  • Hart S, Mas-Colell A. 1997. Cooperation: game theoretic approaches. New York (NY): Springer-Verlag.10.1007/978-3-642-60454-6
  • Hilding-Rydevik T, Bjarnadóttir H. 2007. Context awareness and sensitivity in SEA implementation. Environ Impact Assess Rev. 27:666–684.10.1016/j.eiar.2007.05.009
  • Holland JH. 2006. Studying complex adaptive systems. J Syst Sci Complexity. 19:1–8.10.1007/s11424-006-0001-z
  • [IAIA] International Association for Impact Assessment. 2002. Strategic environmental assessment performance criteria. IAIA special publication series no. 1. Fargo: IAIA.
  • Illsley B, Jackson T, Deasley N. 2014. Spheres of public conversation: experiences in strategic environmental assessment. Environ Impact Assess Rev. 44:1–10.10.1016/j.eiar.2013.08.001
  • Jackson T, Illsley BM. 2007. An analysis of the theoretical rationale for using strategic environmental assessment to deliver environmental justice in the light of the Scottish Environmental Assessment Act. Environ Impact Assess Rev. 27:607–623.10.1016/j.eiar.2007.05.004
  • Jamshidian F, Zhu Y. 1997. Scenario simulation: theory and methodology. Financ Stoch. 1:43–67.
  • Jha-Thakur U, Gazzola P, Peel D, Fischer TB, Kidd S. 2009. Effectiveness of strategic environmental assessment – the significance of learning. Impact Assess Proj Apprais. 27:133–144.10.3152/146155109X454302
  • Jiliberto R. 2002. Decisional environment values as the object of analysis for strategic environmental assessment. Impact Assess Proj Apprais. 20:61–70.10.3152/147154602781766816
  • Jiliberto R. 2004. Setting the ground for a new SEA approach. In: Caratti P, Dalkmann H, Jiliberto R, editors. Analysing strategic environmental assessment. Cheltenham: Edward Elgar; p. 16–25.
  • Jiliberto R. 2007. Strategic environmental assessment: the need to transform the environmental assessment paradigms. J Environ Assess Policy Manage. 9:211–234.10.1142/S1464333207002731
  • Jiliberto R. 2011. Recognizing the institutional dimension of strategic environmental assessment. Impact Assess Proj Apprais. 29:133–140.10.3152/146155111X12959673795921
  • Joao E, McLauchlan A. 2011. Strategic environmental assessment as a tool to contribute to high-level policy objectives. J Environ Assess Policy Manage. 13:1–7.10.1142/S1464333211003766
  • Kågström M, Richardson T. 2015. Space for action: how practitioners influence environmental assessment. Environ Impact Assess Rev. 54:110–118.
  • Katok A, Hasselblatt B. 1995. Introduction to the modern theory of dynamical systems. Cambridge: Cambridge University Press.10.1017/CBO9780511809187
  • Kessler JJ, Abaza H. 2006. United Nations Environment Programme’s approach to integrated assessment of trade-related policies: evolution and recent progress. Impact Assess Proj Apprais. 24:273–283.10.3152/147154606781765093
  • Kørnøv L, Thissen WAH. 2000. Rationality in decision- and policy-making: implications for strategic environmental assessment. Impact Assess Proj Apprais. 18:191–200.10.3152/147154600781767402
  • Kornov L. 2015. Faces and functions of theory in impact assessment research. J Environ Assess Policy Manage. 17:1550008-1–1550008-9.
  • Kotter JP, Cohen DS. 2002. The heart of change: real-life stories of how people change their organizations. Boston (MA): Harvard Business School Press.
  • Kotter JP. 1999. On what leaders really do. Boston (MA): Harvard Business School Press.
  • Lawrence D. 2003. Environmental impact assessment: practical solutions to recurrent problems. Hoboken (NJ): Wiley.10.1002/0471722022
  • Levitis D, Lidicker WZ Jr, Freund G. 2009. Behavioural biologists do not agree on what constitutes behaviour. Anim Behav. 78:103–110.10.1016/j.anbehav.2009.03.018
  • Lobos V, Partidario MR. 2014. Theory versus practice in strategic environmental assessment (SEA). Environ Impact Assess Rev. 48:34–46.10.1016/j.eiar.2014.04.004
  • Malik Management Zentrum. 2014. Malik Sensitivity Model®Prof Vester: the computerized system tools for a new management of complex problems. St. Gallen: Malik Management Zentrum.
  • Marr M. 2009. The natural selection: behavior analysis as a natural science. Eur J Behav Anal. 10:103–118.
  • Meuleman L. 2015. Owl meets beehive: how impact assessment and governance relate. Impact Assess Proj Apprais. 33:4–15.10.1080/14615517.2014.956436
  • Montanez-Cartaxo LE. 2014. Strategic environmental assessment in the Mexican electricity sector. J Environ Assess Policy Manage. 16:1–32.
  • Mulnix JW. 2012. Thinking critically about critical thinking. Educ Philos Theor. 44:464–479.10.1111/j.1469-5812.2010.00673.x
  • Mulvihill PR. 2003. Expanding the scoping community. Environ Impact Assess Rev. 23:39–49.10.1016/S0195-9255(02)00039-2
  • Nilsson M, Dalkmann H. 2001. Decision making and strategic environmental assessment. J Environ Policy Manage. 2:203–224.
  • Noble B. 2009. Promise and dismay: the state of strategic environmental assessment systems and practices in Canada. Environ Impact Assess Rev. 29:66–75.10.1016/j.eiar.2008.05.004
  • Noble BF, Gunn J, Martin J. 2012. Survey of current methods and guidance for strategic environmental assessment. Impact Assess Proj Apprais. 30:139–147.10.1080/14615517.2012.705076
  • Noble BF. 2000. Strategic environmental assessment: what is it and what makes it strategic? J Environ Assess Policy Manage. 2:203–224.
  • Noble BF. 2003. Auditing strategic environmental assessment practice in Canada. J Environ Assess Policy Manage. 5:127–147.10.1142/S1464333203001310
  • [ODPM] Office of the Deputy Prime Minister. 2005. Sustainability appraisal of regional spatial strategies and local development documents: guidance for regional planning bodies and local authorities. London: ODPM.
  • Onyango V. 2015. Enhancing environmental integration in strategic environmental assessment (SEA): insight from sensitivity analysis. J Environ Planning Manage. doi:10.1080/09640568.2015.1062745
  • Partidario MR, Sheate WR. 2013. Knowledge brokerage – potential for increased capacities and shared power in impact assessment. Environ Impact Assess Rev. 39:26–36.10.1016/j.eiar.2012.02.002
  • Partidario MR. 2000. Elements of an SEA framework: improving the added value of SEA. Environ Impact Assess Rev. 20:633–647.
  • Partidario MR. 2015. A strategic advocacy role in SEA for sustainability. J Environ Assess Policy Manage. 17:1550015-1–1550015-8.
  • Perdicoúlis A, Hanusch M, Kasperidus HD, Weiland U. 2007. The handling of causality in SEA guidance. Environ Impact Assess Rev. 27:176–187.10.1016/j.eiar.2006.09.001
  • Pope J, Bond A, Morrison-Saunders A, Retief F. 2013. Advancing the theory and practice of impact assessment: setting the research agenda. Environ Impact Assess Rev. 41:1–9.10.1016/j.eiar.2013.01.008
  • Rega C, Baldizzone G. 2015. Public participation in strategic environmental assessment: a practitioners’ perspective. Environ Impact Assess Rev. 50:105–115.10.1016/j.eiar.2014.09.007
  • Rehani S. 2002. Complex systems theory and development practice. London: Zed Books.
  • Retief F, Morrison-Saunders A, Geneletti D, Pope J. 2013. Exploring the psychology of trade-off decision-making in environmental impact assessment. Impact Assess Proj Apprais. 31:13–23. doi:10.1080/14615517.2013.768007
  • Retief F. 2007. A performance evaluation of strategic environmental assessment (SEA) processes within the South African context. Environ Impact Assess Rev. 27:84–100.10.1016/j.eiar.2006.08.002
  • Richardson T. 2005. Environmental assessment and planning theory: four short stories about power, multiple rationality and ethics. Environ Impact Assess Rev. 25:341–365.10.1016/j.eiar.2004.09.006
  • Sadler B. 1996. Environmental assessment in a changing world: evaluating practice to improve performance. Final report of the international study of the effectiveness of environmental assessment. Ottawa: Canadian Environmental Assessment Agency.
  • Seborg DE, Mellichamp DA, Edgar TF, Doyle FJ. 2004. Process dynamics and control. Hoboken (NJ): Wiley.
  • Sheate W, Dagg S, Richardson J, Aschemann Ralf, Palerm J, Steen U. 2003. Integrating the environment into strategic decision-making: conceptualizing policy SEA. Euro Environ. 13:1–18.10.1002/(ISSN)1099-0976
  • Shepard RB. 2005. Quantifying environmental impact assessments using fuzzy logic. New York (NY): Springer Science+Business Media; p. 1–8.10.1007/0-387-28098-7
  • Shewhart WA. 1980. Economic control of quality manufactured product. New York (NY): D. Van Nostrand Company. Reprint of 1931 version by the American Society of Quality Control.
  • Short M, Jones C, Carter J, Baker M, Wood C. 2004. Current practice in the strategic environmental assessment of development plans in England. Reg Stud. 38:177–190.10.1080/0034340042000190154
  • Sims L. 2012. Taking a learning approach to community-based strategic environmental assessment: results from a Costa Rican case study. Impact Assess Proj Apprais. 30:242–252.10.1080/14615517.2012.736761
  • Smith RD. 1999. Simulation: the engine behind the virtual world [Internet]. [cited 2014 Aug 10]. Available from: www.modelbenders.com/papers/sim2000/simulation%20engine.pdf
  • Smith S, Richardson J, McNab A. 2010. Towards a more efficient and effective use of strategic environmental assessment and sustainability appraisal in spatial planning: final report. London: Department of Communities and Local Government.
  • Smith SP, Sheate WR. 2001. Sustainability appraisals of regional planning guidance and regional economic strategies in England: an assessment. J Environ Planning Manage. 44:735–755.10.1080/09640560120080009
  • Stoeglehner G. 2010. Enhancing SEA effectiveness: lessons learnt from Austrian experiences in spatial planning. Impact Assess Proj Apprais. 28:217–231.10.3152/146155110X12772982841168
  • Storey K, Noble BF. 2005. Socio-economic effects monitoring: toward improvements informed by bio-physical effects monitoring. Impact Assess Proj Apprais. 23:210–214.10.3152/147154605781765526
  • Strogatz SH. 2001. Nonlinear dynamics and chaos: with applications to physics, biology and chemistry. London: Perseus.
  • Tetlow M-F, Hanusch M. 2012. Strategic environmental assessment: the state of the art. Impact Assess Proj Apprais. 30:15–24.10.1080/14615517.2012.666400
  • Therivel R, Partidario MR. 1996. The practice of strategic environmental assessment. London: Earthscan.
  • Therivel R, Walsh F. 2006. The strategic environmental assessment directive in the UK: 1 year onwards. Environ Impact Assess Rev. 26:663–675.10.1016/j.eiar.2006.03.001
  • Thérivel R, Wood Graham. 2005. Tools for SEA. In: Schmidt M, João E, Albrecht E, editors. Implementing strategic environmental assessment. Heidelberg: Springer; p. 349–363.10.1007/b138661
  • Therivel R. 2004. Strategic environmental assessment in action. London: Earthscan.
  • Turban E, Meredith JR. 1985. Fundamentals of management science. Durham (NC): Business Publications.
  • Van Buuren A, Nooteboom S. 2009. Evaluating strategic environmental assessment in The Netherlands: content, process and procedure as indissoluble criteria for effectiveness. Impact Assess Proj Apprais. 27:145–154.10.3152/146155109X454311
  • Vester F. 2007. The art of interconnected thinking – tools and concepts for tackling complexity. Munich: MCB Verlag.
  • Von Bertalanffy L. 1968. General system theory: foundations, development, applications. New York (NY): George Braziller.
  • Weigt M, Seidel M. 2004. Systematic process engineering and its application in product planning, paper at 5th Integrated Product Development Workshop 2004, Institute für Informationsmanagement im Ingenieurwesen, Universität Karlsruhe, TH [Internet]. [cited 2014 Sept 11]. Available from: www.imi.uni-karlsruhe.de/text/951.php
  • White L, Noble BF. 2013. Strategic environmental assessment for sustainability: a review of a decade of academic research. Environ Impact Assess Rev. 42:60–66.10.1016/j.eiar.2012.10.003
  • Wiener N. 1961. Cybernetics: or control and communication in the animal and the machine. Paris: MIT Press.10.1037/13140-000
  • Willems JC. 1997. On interconnections, control, and feedback. IEEE Trans Automatic Control. 42:326–339.10.1109/9.557576
