
System dynamics modelling and the use of evidence to inform policymaking

Pages 454-472 | Received 27 Feb 2022, Accepted 18 May 2022, Published online: 30 May 2022

ABSTRACT

In recent years there has been growing interest in the policy community to apply insights from system dynamics modelling to address the complexity of many policy issues. This, however, has occurred in parallel to recent developments in critical scholarship on the nature of evidence use within public policymaking. While system dynamics aims to assist in the analysis and solving of complex policy problems, in doing so it also serves to identify which pieces of data and evidence are considered policy-relevant, or how pieces of evidence fit within a complex policy space. In this paper, we combine insights from the fields of complex systems modelling and critical policy studies in relation to these issues. Scholars working on the use of evidence within policymaking have explored how policy problems, and their potential solutions, have a range of potential framings and constructions. They further identify how the processes undertaken to define problems, apply evidence, and choose solutions can themselves specify which constructions become realized. As system dynamics modelling is increasingly applied as a policy-informing tool, it is critical to reflect on how policy issues and their solutions are constructed or understood, as well as whose values and views are represented in doing so.

Introduction

Understanding and capturing complexity is a major ambition within policymaking due to the multifaceted nature of so many policy issues. Accordingly, complexity theory has seen increasing attention in fields such as public administration and management. This includes special issues in journals such as Public Management Review (Teisman and Klijn Citation2008) and Political Analysis (Goertz Citation2006), as well as systematic reviews undertaken to analyse the use of dynamic models that attempt to capture elements of complexity in fields such as health (Carey et al. Citation2015; Chang et al. Citation2017) or environment (Andersen, Rich, and Macdonald Citation2020). Complexity-based framing is also widely used by government evaluators, think tanks, and research consultancies (see Bicket Citation2019; Brandon et al. Citation2020).

There are, however, a range of potential issues for the policymaking community to consider as planning tools informed by complexity thinking are increasingly adopted. One particular set of concerns arises in relation to the nature and role of evidence in shaping policy processes and decisions. This topic is captured in a parallel area of interest in the public policy literature over recent years – with a growing field of critical policy scholarship exploring political aspects and implications of evidence use. Traditional calls for “evidence-based policymaking” or doing “what works” (UK Government Citation2013; Walter, Nutley, and Davies Citation2005) have increasingly been rejected as reflecting an overly naïve technocratic and depoliticized approach to understanding evidence use in policymaking (Biesta Citation2007; Cairney Citation2016). Instead, policy scholars have noted that evidence use can take many forms and meanings, and have undertaken analyses to understand why or how particular uses of evidence arise – drawing on a range of social and political theories to do so (see Cairney Citation2016a; Cairney, Oliver, and Wellstead Citation2016; Gibson, Lin, and Gibson Citation2003; Head Citation2010; Smith Citation2014; Weed Citationn.d.). Critical scholars drawing on constructivist traditions (e.g. within the Sociology of Knowledge or Science and Technology Studies) have further explored how the choices of evidence and processes of evidence use can in themselves have critical implications for the way that public policy problems are constructed, understood, or valued (Bracken and Oughton Citation2013; Dahler-Larson Citation2011; Lancaster Citation2014). 
Such a perspective allows critical consideration of which ideas, values, or concepts become realized within evidence-informed policy processes, and further permits questioning of what might constitute a “better” or “worse” use of evidence from different normative positions (Parkhurst Citation2017; Pielke Citation2007).

Within complexity theory, system dynamics modelling (at times shortened to system dynamics) is a methodological approach applied to problem-solving, providing a defined technique aiming to specify and operationalize actionable elements of complex systems. As such it has in itself become an increasingly popular method applied to understanding policy problems and guiding policy decision-making, appearing in fields such as health (Atun et al. Citation2007), sustainability (Stave Citation2002), and defence (McLucas and Elsawah Citation2020). As a field, system dynamics has a strong research community of practitioners, conferences, and stand-alone journals and receives sustained research funding worldwide. However, like all problem-solving approaches, system dynamics has its own formalized approaches to problem conceptualization and data utilization – approaches that will direct how policy-relevant evidence is understood, applied, and used to reinforce particular constructions of both policy problems and solution sets.

The critical policy studies literature on evidence use recognizes that what is considered policy-relevant evidence can be both shaped by, and influential in shaping, how policy problems and solution sets are understood. Yet, as a discipline, system dynamics has not explicitly engaged with these evidence-informed policy debates. Instead, discussions of policy relevance, evidence use, and stakeholder goals or representation have occurred concurrently across these two fields. Here, we seek to remedy this by exploring the implications of applying system dynamics approaches within policy processes in relation to evidence-informed policymaking.

Materials and methods

In this paper we first outline system dynamics modelling and its increased use in policymaking, considering how evidence and data are understood within this approach. We then apply a critical lens to the embedded understanding of the policy process found in many applications of system dynamics, and the implications this has for the consideration of policy-relevant evidence. We select three key conceptual issues commonly considered in the public policy literature in relation to evidence use that map particularly well onto the use of system dynamics modelling to inform policy and planning: the bounded rationality of policymaking; problem framing, agenda setting, and choice opportunities; and the governance of decision-making, in terms of representation and accountability within decision-making processes. This results in a critical discussion reflecting on where the system dynamics literature shows insights into these issues, as well as the further considerations important for democratic policymaking that the critical policy scholarship brings to bear.

Results

System dynamics modelling in policy settings

System dynamics serves as a methodology for delineating, describing, and analysing complex systems and problems, as well as for helping to define and select solutions to those problems when applied in decision-making contexts. It was primarily developed out of the field of management, with an initial focus on issues in corporate firms such as patterns of inventory and corporate development (Forrester Citation2007). So, while early system dynamics primarily addressed production concerns arising in private companies, by the end of the 1960s it began to be used to address public challenges such as transport and healthcare. System dynamics models may be used in different ways, however. They can be used theoretically, without involvement of stakeholders, in which case the objectives are driven by the researchers themselves – for example, through the development of a model based on publicly available data without stakeholder consultation (de Gooyert and Größler Citation2018). System dynamics models can also be applied to active issues with the objectives driven by relevant stakeholders, such as with models built in consultation with stakeholders (de Gooyert and Größler Citation2018). When applied to public policy problems, system dynamics is typically used with the intention to deliver greater clarity in policymaking – allowing decision-makers to understand or identify multiple interacting factors that affect a policy problem, as well as to develop interventions or solutions based on the insights from the model (Sterman Citation2000, Citation2006). Indeed, system dynamics is often attractive for policymakers because of its general applicability to many policy areas, and the potential to conceptualize interconnected issues across policy areas including housing, electricity use, and environmental impact (see Eker, Zimmermann, and Carnohan Citation2018).

In recent health sector examples, system dynamics has been applied to communicable diseases such as HIV/AIDS (Dangerfield, Fang, and Roberts Citation2001; Weeks, Li, and Liao Citation2013), tuberculosis (Atun et al. Citation2007; Lebcir, Atun, and Coker Citation2010), chlamydia (Teng, Kong, and Tu Citation2015; Townshend and Turner Citation2000; Viana, Brailsford, and Harindra Citation2014), and COVID-19 (Jarynowski, Wojta-Kempa, and Platek Citation2020). In the environment policy space system dynamics approaches have been applied to areas such as wetland management (Chen, Chang, and Chen Citation2014), shifting mining in sustainable directions (Maluleke and Pretorius Citation2013), and forest management (Hossain, Ramirez, and Szabo Citation2020; Thompson and Dunn Citation2019). It is also applied in other policy areas such as transport (Fontoura, Chaves, and Ribeiro Citation2019; Haghshenas, Vaziri, and Gholamialam Citation2015; Liu, Benoit, and Liu Citation2015) and crime (Bianchi and Williams Citation2015; Jaen and Dyner Citation2008; Newsome Citation2008).

It is worth noting that within the category of systems methodologies (which themselves are part of the broader research paradigm of complexity science), what is referred to as system dynamics is not always clearly defined. Some define system dynamics strictly as research that results in a quantitative computer simulation model (Homer and Oliva Citation2001), whereas others include qualitative modelling processes that are informed by key concepts such as causal loop diagrams and stock and flow diagrams (Coyle Citation2000). System dynamics is not a self-contained field, and the approach is regularly used in interdisciplinary or transdisciplinary ways, blurring the boundaries of the discipline itself. Thus it has been described as taking “harder” directions, where system dynamics researchers work to integrate the approach with other highly quantitative mathematical approaches such as agent-based modelling (Martin and Schlüter Citation2015) or fuzzy logic calculations (Karavezyris, Timpe, and Marzi Citation2002; Nasirzadeh, Afshar, and Khanzadi Citation2008). It also has been taken in so-called “softer” directions, such as integration with methodologies like critical systems thinking or operational research, in which authors reject the positivist implications of some modelling approaches and instead reflect on how values are embedded in system definition (Checkland Citation1995; Ulrich Citation1994; Ulrich and Reynolds Citation2010).

In this paper, we will primarily reflect on the use of evidence within applied system dynamics approaches that include a quantitative modelling component, although we recognize that some of the soft systems approaches might begin to address the conceptual issues discussed here as well. We also particularly focus on system dynamics applications in public policy contexts, rather than those developed in the private sector.

Over the decades, authors have characterized the model building process in different ways, with Luna-Reyes and Andersen (Citation2003, 275) summarizing how key authors have described classic modelling procedures in the following table. The final stage, Implementation, is highlighted in bold to flag where it has implications for policy decision-making:

Accounts of the system dynamics modelling process thus involve some form of (i) problem articulation or problem framing, (ii) system conceptualization, (iii) model building, (iv) simulation and evaluation, and ultimately (v) introduction of solutions to the policy space. This unique approach to problem framing has implications, however, for both the data chosen as evidence for the model, and the application of that evidence to identify or select policy solutions (Table 1).

Table 1. The system dynamics modelling process across classic literature, reproduced from Luna-Reyes and Andersen (Citation2003, 275) with emphasis added (note references in table are from the original source).

Data and evidence in system dynamics

The application of system dynamics to inform public policy is in many ways a process of evidence utilization, although the language itself may differ from the public policy literature – for example, there is a tendency within system dynamics to speak about “data” rather than evidence. Yet the motivation behind much system dynamics work is to identify the pieces of information to inform a decision that solves a problem or achieves a more desirable endpoint (Bérard Citation2010; Sterman Citation2000). There is often an assumption that not all relevant information is known to policymakers, or that it is not understood how it all fits together – with the system dynamics process (and the process of modelling) giving decision-makers new ways of seeing, with which to conceptualize problems and institute solutions based on decision-relevant data (Newell and Proust Citation2012). Sterman (Citation2003) has described the way data from experiments informs models as follows:

Experiments conducted in the virtual world inform the design and execution of experiments in the real world; experience in the real world then leads to changes and improvements in the virtual world and in participants’ mental models. (Sterman Citation2003, 28)

According to widely cited works in the field (e.g.: Forrester Citation1980; Sterman Citation2000), a system dynamics model draws on three main types of data or evidence: (i) numerical data, (ii) written reports (policies and procedures, manuals, published works, etc.), and (iii) the knowledge of stakeholders or experts in the system that is being examined (Forrester Citation1980). Sterman (Citation2000) again explains:

usually the modeller develops the initial characterization of the problem through discussion with the client team, supplemented by archival research, data collection, interviews, and direct observation or participation. (Sterman Citation2000, 90)

The “stakeholders” or “experts” in the system refers to those who actively take part in the group modelling process, facilitated by the systems modellers. These stakeholders provide details about the “system structure”, a term that refers to the ways that different factors in the model are connected. Modellers will then work with stakeholders to translate their knowledge about the policy problem into a system structure. Where a system dynamics model is to be quantified, there will typically be a further process of assigning available numerical data (data type i) to variables in the model, and also drawing on existing reports (data type ii) to inform system structure (Homer and Oliva Citation2001). In doing so, modellers and stakeholders work to collaboratively establish the definitions, structures, and meanings of the policy problems being addressed and modelled.

From this perspective, the application of system dynamics modelling within policymaking spaces can potentially play an important knowledge translation role. Knowledge translation itself has become a burgeoning area of work to consider how actors, processes, or whole systems function to transfer or apply pieces of evidence to decision-making spaces (Shaxson, Ahmed, and Brien Citation2012). Yet some have challenged whether these efforts reflect critically enough on whether the knowledge being “transferred” or “taken up” is appropriate to the specific needs and goals of decision-makers. In theory, if the systems modelling process accurately captures the concerns and values of the decision-makers, then it could help to identify the most appropriate forms and applications of evidence to achieve policy goals. Formalizing the use of modelling processes could thus work to institutionalize a system that aids in the identification and application of policy-appropriate evidence to solve established policy problems.

Selecting and transforming evidence in modelling processes

While the intention is clearly to capture the realities of stakeholders’ experiences, crucially, pieces of evidence used to inform system dynamics models will also need to be selected and potentially transformed in key ways to fit within the confines of the modelling process. For instance, data typically need to fit the “stock and flow” and “feedback” concepts that form the core elements of all system dynamics models. A stock refers to the accumulation of something at a given point in time – be that money, water, numbers of people with a disease, and so on. A flow refers to the movement of stocks. For example, Chen, Chang, and Chen (Citation2014) provide an environmental model in which a stock consists of pollutants in water, and the flow is the rate at which these accumulate. Another stock in the example is the number of yachts in the wetland area, with the flow being the rate of tourism to the area (Chen, Chang, and Chen Citation2014). In a healthcare example, Weeks et al. modelled HIV spread in Chinese female sex workers. In this model, a stock was considered to be the number of sex workers using female condoms, and the flow is the rate at which workers change between using and not using a female condom. These were presented as influenced by a range of factors depicted in the model, including positive relationships with the workplace, public health interventions, and network/peer promotion of female condom use (Weeks, Li, and Liao Citation2013). It is worth noting, however, that the need for quantitative data, either empirically defined or estimated, to construct a mathematical system dynamics model can pre-select for variables that fit within the confines of stock and flow concepts. Thus concepts that might have weight within political analysis, such as the cultural power of groups, or charismatic authority, might be less easily quantified in system dynamics modelling.

The concepts of feedback, stocks, and flows are regarded as cornerstone concepts that enable both the system dynamics models themselves to work and provide the unique perspective to problem-solving that system dynamics claims for itself (Forrester Citation1969; Sterman Citation2000). Within a quantified system dynamics model, stocks, and flows are assigned quantities and rates by the modellers and stakeholders drawn from existing data if possible, or estimated by the group if no existing data are available (Sterman Citation2000). It is through the interaction of stocks and flows, with influence from feedback, that the “dynamic” structure of the system can be conceptualized (Sterman Citation2000).
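To make the interaction of stocks, flows, and feedback concrete, the following minimal Python sketch simulates a single hypothetical stock with a balancing feedback loop. The scenario, function name, and all parameter values are illustrative inventions for this sketch only; they are not drawn from any model cited in this paper.

```python
# Illustrative stock-and-flow simulation (hypothetical parameters).
# A single stock (e.g. pollutant load in a wetland) accumulates via a
# constant inflow and drains via an outflow proportional to the stock
# itself -- the stock-dependent outflow is a balancing feedback loop.

def simulate(initial_stock=100.0, inflow=5.0, drain_fraction=0.1,
             steps=50, dt=1.0):
    """Euler-integrate dS/dt = inflow - drain_fraction * S."""
    stock = initial_stock
    history = [stock]
    for _ in range(steps):
        outflow = drain_fraction * stock    # flow depends on the stock: feedback
        stock += (inflow - outflow) * dt    # stock accumulates the net flow
        history.append(stock)
    return history

trajectory = simulate()
# The balancing loop drives the stock toward the equilibrium
# inflow / drain_fraction (here 5.0 / 0.1 = 50.0).
```

Even this toy example shows the point made above: to enter the model at all, a real-world concern must be rendered as a quantity (a stock) or a rate (a flow), with any quantification choices made by modellers and stakeholders.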

So, for example, in the aforementioned model of HIV transmission through commercial sex work, information about the social networks of sex workers is transformed into a “stock”, a “flow”, or a “feedback variable” through the application of the method. This collapses a concept that might have numerous social meanings into a simplified form to fit the model needs. Doing so raises potential questions about whether the correct concepts are ultimately specified into model parameters – but the applicability of the stocks, flows, and feedback ideas to a wide range of public policy issues is indeed part of the appeal of the approach. When applied to private sector purposes (e.g. cost containment, or profit maximization), there may be little call for critical reflection on how socially constructed concepts are reified. Yet when system dynamics is used in public policy spaces, this can have important political implications, e.g. on the resourcing of sex worker education or policies to balance tourism and pollution in waterways – many of which may be politically contested, prioritized, or understood in different ways.

The role of evidence in the policy process

The political implications of the formalization of system dynamics within policy decision spaces can be seen to be linked to a number of key challenges that have been raised more broadly by critical policy scholars who have studied the use of evidence through an explicitly political lens. Such scholars have typically taken issue with the underlying assumption within much knowledge translation work that policy is simply a rational process of technical problem-solving. As highlighted by Cairney (Citation2016), different assumptions about the role of evidence in the policy process derive from corresponding understandings of the policy process itself. Thus, rationalist perspectives of the policy process are associated with an understanding of evidence as objective and apolitical. In contrast stand those who study evidence use in relation to the political, complex, and contested nature of policymaking, who take issue with such assumptions.

Lancaster, for instance, explains that: “‘policy-relevant knowledge’ may not be a stable concept but rather one which is itself constructed through the policy process, and, through a process of validation, is rendered useful” (Lancaster Citation2014, 948). From this perspective, the evidence used to build a policy-informing system dynamics model, and thus the evidence considered to be “policy-relevant knowledge”, is established throughout the system dynamics process itself – including the sub-processes of problem framing and evidence selection necessary to have the data fit the model structure. Critical policy scholars have further noted how evidence use in decision-making must be considered within the context of the values and agendas of those in decision-making power (Cairney Citation2016; Parkhurst and Ghilardi Citation2021; Weiss Citation1979). Such works explore how the creation, selection, and utilization of evidence (or science more broadly) for policymaking serves to construct, prioritize, and valuate policy problems and their solutions (Hoppe and in’t Veld Citation2010; Jasanoff Citation2004).

These perspectives thus raise particular questions in relation to the use and application of evidence in system dynamics processes established to inform policymaking. Three particular overlapping considerations from the critical scholarship on evidence use in public policy settings are identified here, to reflect on the use of system dynamics in the following section. First is the way that evidence use, and thus the system dynamics modelling process, relates to the bounded rationality of policymaking. Second is how planning processes that draw on evidence can serve to specify policy understandings to particular constructions – collapsing the realities from a range of possible problematizations and sets of solutions to a narrow formalization of “the problem” and solution choices. And third, is the concern over representation and accountability when policy decision-makers utilize evidence and undertake planning activities that have implications for which social values and concerns are realized and prioritized in public policy spaces.

System dynamics and bounded rationality

The first concern derived from the policy studies literature on evidence use relates to the nature of rationality in decision-making and how evidence use plays into this within the policy process. Cairney in particular has explored the political aspects of evidence use around the concept of bounded rationality (Cairney Citation2016). He notes that many traditional approaches to evidence use rest on assumptions of a linear policy process informed by situations of comprehensive rationality. Yet such situations have long been recognized in the policy sciences as an ideal type that fails to capture the realities of most decision situations, which are instead typified by gaps in knowledge, multiple competing needs in limited time, and diverse incentives. Bounded rationality refers to the limits on human thinking capacities, available information, and time to process these in decision-making (Simon Citation1955, Citation1982).

This principle of bounded rationality challenges the assumption that evidence can simply be used as a problem-solving device, and rather points to the importance of understanding evidence use in relation to the cognitive limitations on decision-makers. Cairney argues that this points to the need to understand the strategies (be they conscious or unconscious) policymakers tend to enact in such contexts to apply evidence to policy problems, rather than assuming evidence speaks for itself in policy terms.

Botterill and Hindmoor (Citation2012) go further still, applying the bounded rationality concept to evidence utilization for policy, highlighting that bounded rationality does not simply affect decision-making once evidence has been collected, but also the selection and collection of the evidence base. The decisions made in collecting an evidence base are themselves subject to the bounded rationality of decision-makers: “The very information that is available to policy-makers as an input into their decision-making processes is itself subject to the constraints of summary and interpretation” (368). Attempts to systematically collect and communicate evidence for policy use necessarily involve a simplification process, a process of choosing and selecting affected by bounded rationality. They further highlight that points that may be keenly contested within the scientific community may be considered too complex for use in a policy space, meaning that “Alternative interpretations and contradictory findings are not simply ignored – they ‘remain unseen’” (Botterill and Hindmoor Citation2012, 368 citing Fleck Citation1979 [Citation1935], 27).

Thus, in relation to the use of evidence within decision-making, policy scholars have highlighted two important aspects of bounded rationality – in relation to the limitations and biases of the decision-makers themselves, but also in relation to collective processes that serve to collect and present evidence to decision-makers. Both of these can have important implications in relation to the use of system dynamics to inform policy decisions which fundamentally relies on a set of experts collecting and synthesizing information for decision-making through a process that must in many ways simplify complex realities.

The concept of bounded rationality in decision-making, however, is an area where there has been some explicit reflection within the system dynamics literature – although not in relation to evidence use per se. Despite a tendency to conceive of the policy process as linear (with the associated treatment of evidence), some in the system dynamics field have addressed the incomplete information available to those involved in model-building processes. Größler for instance (referencing Sterman’s work) explains how:

modelling itself is affected by bounded rationality because the modeller is prone to possess only limited capabilities to perceive and understand a real-world system. Because modellers are themselves biased (Sterman Citation2000), the work they create, i.e. the entire model, will relate to the referent system in a boundedly rational way. (Größler Citation2004)

Further, Größler (Citation2004) describes that the bounded understandings of the modeller are subsequently transferred into the models through the process followed. He explains: “Following system dynamics practice, this transformation implies that artefacts of bounded rationality occurring in the real world have to be represented in the formal model” (Größler Citation2004, 325). Yet the author is optimistic that the system dynamics processes can, in fact, overcome biases and limitations of this kind, arguing:

With the help of [system dynamics] understanding they [policymakers] can design new structures of the real system. By way of improved policies and reduced structural complexity, this could lead to a mitigation of negative effects of bounded rationality. (Größler Citation2004, 325)

Whether or not the system dynamics modelling process imposes biases and simplifications, or provides a mechanism by which they can be recognized and overcome may be a matter of how such issues are addressed in the process itself. What becomes clear is that the bounded rationality of decision-making, and of the processes of evidence utilization, is something that can be explicitly discussed or addressed to ensure that system dynamics processes reflect on potential biases in their goal to produce policy-relevant information to inform decision-making.

Problem choice, and the construction of problems and solution sets

Another concern in relation to evidence use for policy raised by public policy scholars lies more fundamentally around problem choice and issue construction. The first of these is captured by Kingdon’s classic definition of agenda setting as a “process of narrowing a set of conceivable subjects to the set that actually becomes the focus of attention” (Kingdon Citation1995, 3). Thus, while many issues might become the subject of systems modelling, the integration of system dynamics into policymaking will necessarily involve a process through which a number of possible subjects are selected to be the subject of modelling and become the focus of attention. The second issue, however, relates more specifically to the importance of recognizing the constructed nature of both policy problems and their potential solutions within critical policy studies (Bacchi Citation2009; Beland Citation2005). This tradition of work recognizes that policy problems, and their potential solutions, have a range of conceivable constructions, and considers how the political arena serves as spaces in which policy issues can be (re)framed and ultimately become constructed as problems to be solved.

As described by system dynamics authors, systems modelling explicitly involves a process of problem framing and definition (Luna-Reyes and Andersen Citation2003). However, the generated model itself can be thought of as a problem frame for a certain understanding of the system (Midgley Citation1996). This idea of the model serving as a problem frame, or a hypothesis for a certain understanding of the system, is a point well recognized within the discipline itself. Within system dynamics, models are often referred to as depicting a “dynamic hypothesis” (Sterman Citation2003, Citation2006). Here, the “hypothesis” term is used as a concession that the model cannot be a fully complete representation of the system, but is rather trying to get at the best possible set of ideas of how the problem is playing out. This idea is also at the core of “soft” systems methodologies that conceive of the model as a contestable object that is open to debate (Midgley Citation1996, Citation1997; Ulrich Citation1994; Ulrich and Reynolds Citation2010, Citation2020). Because a system dynamics model can be thought of as an argument for a certain problem frame, it can also be interrogated in the same way that any argument can be interrogated (Ulrich and Reynolds Citation2010, Citation2020) and examined for its use of evidence to validate the argument.

Within the sociology of science, it is understood that choices about which forms of evidence are deemed valid and meaningful are shaped by particular social values (Merton Citation1973). Scholars have explored the interface between science, knowledge, and political decision-making, reflecting on how the policymaking process can shape knowledge claims while scientific knowledge or evidence can also play key roles in legitimizing policy decisions (Fischer Citation1995; Majone Citation1989; Radaelli Citation1995). Similarly, within this field, Bacchi’s (Citation2009, Citation2016) work can be used to explore the role of evidence in processes of problem representation, part of the process by which problems are constructed and solutions specified (Walton Citation2016). Thus, the selection of evidence from all available information into a supposed “evidence base” on which to inform policy can serve to perpetuate or formalize the particular values informing that evidence base (Parkhurst Citation2017; Stewart and Smith Citation2015). Evidence use can therefore play a centrally productive function in the construction and understanding of policy problems, and any formalized evidence-utilization process, including the application of system dynamics, can influence the way a policy problem is framed, formalizing which stakeholders’ values, ideas, and goals are used to understand the policy problem and its potential solutions.

In these key ways, concepts recognized in the system dynamics literature overlap with constructivist reflections on how policy problems and solutions are produced in relation to evidence within decision-making processes. When a group of stakeholders begins a system dynamics process, there is first a need to prioritize which issues will be selected; beyond that, there will be a great number of possible policy systems, problem frames, and solutions to explore within a single group. As the process continues, power dynamics within the room, combined with introduced mechanistic ways of understanding the problem such as “stocks and flows”, can intersect to effectively reduce or constrain the number of problem frames and solutions explored (Ulrich and Reynolds Citation2010, 11). Decision points (Cohen, March, and Olsen Citation1972) are reached within a group modelling process, such as the naming of a variable, the decision to focus on a particular “system archetype” (Senge Citation1990), or the identification of “emergent properties” (Meadows Citation1999), each of which acts to narrow and specify the problem frame that the group draws on in the session. The processes involved in selecting variables or populating models with data work to construct the understanding of what the problem is and what the solutions may be. Thus, elements of systems models can be seen to affect the bounds of rationality of decision-makers, while the processes of populating variables with evidence can further be seen to construct and reify particular problematizations and solution sets for decision-makers to consider. While the systems literature is aware of the constructed nature of many such elements (Rouwette, Korzilius, and Vennix Citation2011), the critical policy literature enables these insights to be linked to consideration of their political implications.
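To make concrete how modelling choices act as framing choices, consider a deliberately minimal stock-and-flow sketch. This is purely illustrative: the variable names, parameter values, and the simple Euler-integration scheme are our assumptions for exposition, not drawn from any model cited in this paper. The point is that even in a model this small, the choice of which flows exist and how they are formulated fixes the equilibrium the group will "discover", and hence the solution space it will consider:

```python
# Minimal stock-and-flow simulation (Euler integration).
# Every named element below is a framing choice: what counts as the
# stock, which flows exist, and where the model boundary is drawn.

def simulate(stock, inflow_rate, outflow_fraction, steps, dt=1.0):
    """Integrate d(stock)/dt = inflow - outflow over discrete steps."""
    history = [stock]
    for _ in range(steps):
        inflow = inflow_rate                  # assumed constant: a boundary choice
        outflow = outflow_fraction * stock    # assumed proportional: another choice
        stock += (inflow - outflow) * dt
        history.append(stock)
    return history

# Framing A: "the problem is demand" -> fixed inflow, low clearance rate.
frame_a = simulate(stock=100, inflow_rate=20, outflow_fraction=0.1, steps=50)

# Framing B: "the problem is throughput" -> same inflow, doubled clearance rate.
frame_b = simulate(stock=100, inflow_rate=20, outflow_fraction=0.2, steps=50)

# Each frame settles near its own equilibrium (inflow / outflow_fraction):
# roughly 200 under framing A versus 100 under framing B, so the problem
# a group "sees" in the output depends on the frame that was modelled.
print(frame_a[-1], frame_b[-1])
```

The design choice worth noting is not the arithmetic but the boundary: treating inflow as exogenous (outside the model) versus endogenous is exactly the kind of decision point the group modelling literature describes, and it is made before any evidence is used to populate the parameters.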

Participation, representation, and accountability in system dynamics modelling

The final consideration explored here relates to how the utilization of evidence within decision-making spaces has governance implications for the representation and accountability of policymaking. The preceding section noted that when policymakers utilize evidence within systems planning activities, this shapes the choice and construction of policy problems and potential solution sets. In doing so, however, there are naturally further implications for which societal values and public concerns are realized and prioritized in the resultant choices.

Within the critical scholarship on evidence use, these issues align with concerns around representation and accountability within the processes of evidence utilization, and growing calls for participation and deliberation in evidence-to-policy processes (see Flitcroft and Gillespie Citation2011; MacGregor and Cooper Citation2020). In reflecting specifically on using complexity theory for policy evaluation, Walton explains: “[a] key mechanism for governance networks to produce solutions to complex problems lies in the ability to bring multiple perspectives and knowledge into a deliberative decision-making process” (Citation2016, 82). Similarly, Biesta (Citation2007) has raised concerns over how evidence-use practices concerned with the application of effectiveness research can potentially limit opportunities for democratic participation in education policymaking.

Parkhurst (Citation2017) has further applied the concept of legitimacy to evidence provision systems, asking whether the institutional bodies and advisory systems providing evidence are accountable to, and representative of, the population served. Legitimacy can arise through science-advisory bodies being established with a formal mandate from publicly accountable authorities, but it also depends on whether the processes they follow are transparent and subject to deliberative engagement by the public (to ensure that priorities and value choices within evidence utilization systems reflect those of the ultimate population beneficiaries). Parkhurst considers these issues within a broader “good governance of evidence” framework which emphasizes the importance of both democratic and scientific principles such as transparency, accountability, contestability, and rigour. Other scholars have further explored the dynamic boundaries between science and policy, illustrating that ideas of what constitutes expertise, the legitimacy of scientists, and what is seen as policy-relevant science can be continually constructed and redefined throughout science-policy interactions (Cozzens et al. Citation1995; Hoppe and in’t Veld Citation2010; Jasanoff Citation1987).

When system dynamics modelling is used in policy areas, particularly when drawing on the input of selected stakeholders to inform the models, questions of representation in model building, problem framing, and evidence selection therefore become significant. Who is included in or excluded from the process of system dynamics modelling, as well as the transparency and contestability of these processes, is in turn important to consider when utilizing system dynamics modelling to provide evidence for policy decisions. Within the system dynamics literature, however, there is considerable variability in reporting on representation in stakeholder selection. For example, Hossain, Ramirez, and Szabo (Citation2020) give some detail about their stakeholder selection, including descriptions of group process and participant selection, and Achterkamp and Vos (Citation2007) examine processes for identifying stakeholders for involvement in systems approaches based on principles of boundary critique. Other work, by contrast, such as Chen, Chang, and Chen (Citation2014) and Shen, Chen, and Tang (Citation2009), both of which produce system dynamics models, does not detail stakeholder involvement or selection in any way. These examples are characteristic of the variability found in system dynamics in relation to detailing stakeholder selection. This variability in the reporting and examination of stakeholder selection and democratic representation raises challenges for evaluating representativeness and accountability within the processes of system dynamics model construction and policy influence.

While the system dynamics field does not have a unified stance on stakeholder selection and democratic representation, these issues are addressed explicitly in related systems-based methods and methodologies such as critical systems thinking (Midgley Citation1996, Citation1997; Ulrich Citation1994) and operational research (Checkland Citation1995). Critical systems thinking, for instance, offers a set of questions focusing on the sources of legitimacy, knowledge, power, and motivation drawn upon to inform a system dynamics model. Thus, the work of critical systems thinking potentially provides opportunities to integrate more explicit political considerations within efforts to use system dynamics modelling in public policymaking. Achterkamp and Vos (Citation2007) make explicit the connection between stakeholder participation in system dynamics and the problem analysis and solutions that result from it: “What is and who are included or excluded is crucial: a different system boundary may result in a different problem analysis and, accordingly, in different solutions or changes” (Achterkamp and Vos Citation2007, 6). However, at present these questions appear to be only rarely examined within system dynamics modelling processes in public policy spaces, and then only when informed by critical systems thinking discourse.

Discussion

The application of system dynamics offers decision-makers tools to manage elements of complexity in their decisions. In doing so, system dynamics modelling provides a conduit through which policy-relevant data and evidence can be better identified, understood, and therefore utilized to help achieve programme goals. Yet while system dynamics modelling presents an opportunity to formalize evidence provision in the service of problem-solving, a wide body of public policy work has problematized the idea that the use of any tool can render policy a simple rational process of technical problem-solving. Indeed, this rationalist model has been rejected as an inaccurate representation of policymaking ever since Lindblom described “the science of muddling through” over half a century ago (Lindblom Citation1959). In relation to evidence use, Carol Weiss similarly noted that “it probably takes an extraordinary concatenation of circumstances for research to influence policy decisions directly … the problem-solving model of research use probably describes a relatively small number of cases” (Weiss Citation1979, 428).

In this paper, we have explored three broad issues raised by public policy scholars of evidence use, reflecting on their implications for the increased application of system dynamics to policymaking: the bounded rationality inherent in decision-making; the selection and construction of policy problems and solutions; and the representation and accountability of those stakeholders involved in evidence selection and application. In all of these cases, political considerations relate not just to individuals, but also to the systems themselves that shape the use of evidence in policymaking, including any formalized processes followed within planning activities. We recognize, however, that there are other critical policy perspectives on features of evidence use that were beyond the feasible scope of this article, yet could provide further areas of exploration in relation to the implications of the increased application of system dynamics in policy spaces. One example would be to empirically explore how incorporating system dynamics modelling processes affects the spaces for public deliberation within evidence-informed policymaking. Another area for further consideration lies in the work of scholars who explore, from an institutionalist perspective, how evidence advice has been historically constructed or embedded in policy and planning systems (see Elvbakken and Hansen Citation2019; Kuchenmüller and Boeira Citation2022; Nutley, Walter, and Bland Citation2002). Over time, evidence provision systems are likely to evolve both in their thinking on the role of tools such as system dynamics modelling and in the structural arrangements facilitating their use, providing further areas for empirical and conceptual work.

Indeed, while a historical institutionalist lens was not applied in depth here, it is clear that when applied to real-world policy problems, system dynamics processes will be embedded within existing political relationships and decision-making structures. It may be that only certain policy problems are conducive to system dynamics modelling processes, or that individuals facilitating system dynamics may be bounded in their views of which issues to address. This implies a potential risk of bias towards certain issues over others, which may need to be explicitly considered: what Parkhurst (Citation2017) has termed “issue bias”, whereby the forms or applications of evidence can bias attention to, or increase the priority of, particular issues. In its application, policy-relevant evidence must further be transformed into the core concepts of system dynamics, with stocks, flows, and feedback themselves serving as rationality-bounding concepts inherent in the process. This has further political implications, as such transformation necessitates a narrowing or collapsing of conceivable problem frames and solution sets into the specific framework of system dynamics. Critical systems thinking has already recognized some of these elements. However, integrating systems modelling processes will embed new actors (such as modelling facilitators) into planning structures, or invite particular stakeholders into model-building processes, without necessarily considering the implications that changing the set of stakeholders involved has for the representation, accountability, and legitimacy of the decision-making process.

Conclusion

In the search to better capture real-world complexity, public policymakers have increasingly looked to tools such as system dynamics modelling to provide information that can inform decision-making. The practice of system dynamics modelling, as a technical exercise that names stocks, flows, and variables, may appear from a technocratic perspective to sit outside politics. However, such exercises involve the same issues of problem construction, political power dynamics, and value representation that occur in other planning processes. When implementing or institutionalizing particular policy-informing and evidence-utilization methods such as these, we must remain critical of the boundaries that decide what is and is not considered important in the depiction of a system. Problems will continue to be constructed with limitations, values prioritized according to political power structures, and solutions specified accordingly. It is therefore essential not to obscure or depoliticize the inherently political acts that take place through the policy application of methods found in system dynamics modelling or complexity theory more generally.

In this paper, we have highlighted some of the conceptual intersections between complexity thinking and public policy discussions of the use of evidence in policymaking. These include bounded rationality, the interplay between evidence selection and problem definition, and issues of accountability and representation regarding who is included in system dynamics modelling processes. Such perspectives naturally lead to further thinking about how to institutionalize the use of such tools, and the democratic implications of doing so. As policy spaces continue to engage in system dynamics modelling and the use of other such tools, attention to these issues of evidence selection and use will be essential.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by Research England's Connecting Capabilities Fund through the Bloomsbury SET programme (Research England grant CCF17-7779).

Notes on contributors

Eleanor Malbon

Eleanor Malbon is a Research Fellow at the Centre for Social Impact, University of New South Wales, where she is part of an interdisciplinary team of researchers committed to addressing inequality. She is an expert in systems-based research methods and uses these along with community efforts to help find solutions to complex problems.

Justin Parkhurst

Justin Parkhurst is an Associate Professor of Global Health Policy at the London School of Economics and Political Science (the LSE). He is an expert in health policy and politics, and the use of evidence to inform policymaking. He is the author of the open-access book The Politics of Evidence: from Evidence-Based Policy to the Good Governance of Evidence published by Routledge.

References

  • Achterkamp, M. C., and J. F. J. Vos. 2007. “Critically Identifying Stakeholders: Evaluating Boundary Critique as a Vehicle for Stakeholder Identification.” Systems Research and Behavioral Science 24 (1): 3–14. doi:10.1002/sres.760.
  • Andersen, D., E. Rich, and R. Macdonald. 2020. “System Dynamics Applications to Public Policy.” In: System Dynamics. Encyclopedia of Complexity and Systems Science Series. New York, NY: Springer.
  • Atun, R. A., R. M. Lebcir, M. McKee, J. Habicht, and R. Coker. 2007. “Impact of Joined-up HIV Harm Reduction and Multidrug Resistant Tuberculosis Control Programmes in Estonia: System Dynamics Simulation Model.” Health Policy 81 (2–3): 207–217. doi:10.1016/j.healthpol.2006.05.021.
  • Bacchi, C. 2009. Analysing Policy: What’s the Problem Represented to Be? NSW: Pearson.
  • Bacchi, C. 2016. “Problematizations in Health Policy: Questioning How ‘Problems’ are Constituted in Policies.” Sage Open April–June: 1–16. doi:10.1177/2158244016653986.
  • Beland, D. 2005. “Ideas and Social Policy: An Institutionalist Perspective.” Social Policy and Administration 39 (1): 1–18. doi:10.1111/j.1467-9515.2005.00421.x.
  • Bérard, C. 2010. “Group Model Building Using System Dynamics: An Analysis of Methodological Frameworks.” Accessed 15 January 2015. http://basepub.dauphine.fr/handle/123456789/5035.
  • Bianchi, C., and D. W. Williams. 2015. “Applying System Dynamics Modeling to Foster a Cause-and-Effect Perspective in Dealing with Behavioral Distortions Associated with a City’s Performance Measurement Programs.” Public Performance & Management Review 38 (3): 395–425. doi:10.1080/15309576.2015.1006471.
  • Bicket, M. 2019. Complexity Evaluation Framework. Department for Environment, Food and Rural Affairs. http://sciencesearch.defra.gov.uk/Document.aspx?Document=14675_ComplexityEvaluationFramework.pdf
  • Biesta, G. 2007. “Why ‘What Works’ Won’t Work: Evidence-Based Practice and the Democratic Deficit in Educational Research.” Educational Theory 57 (1): 1–22.
  • Botterill, L. C., and A. Hindmoor. 2012. “Turtles All the Way Down: Bounded Rationality in an Evidence-Based Age.” Policy Studies 33 (5): 367–379. doi:10.1080/01442872.2011.626315.
  • Bracken, L., and E. Oughton. 2013. “Making Sense of Policy Implementation: The Construction and Uses of Expertise and Evidence in Managing Freshwater Environments.” Environmental Science & Policy 30: 10–18.
  • Brandon, M., P. Sidebotham, P. Belderson, et al. 2020. “Complexity and Challenge: A Triennial Analysis of SCRs 2014-2017 Final Report.” UK Government, Department of Education.
  • Cairney, P. 2016. The Politics of Evidence Based Policy. London: Palgrave Pivot.
  • Cairney, P., K. Oliver, and A. Wellstead. 2016. “To Bridge the Divide Between Evidence and Policy: Reduce Ambiguity as Much as Uncertainty.” Public Administration Review 76 (3): 399–402. doi:10.1111/puar.12555.
  • Carey, G., E. Malbon, N. Carey, et al. 2015. “Systems Science and Systems Thinking for Public Health: A Systematic Review.” British Medical Journal. Accessed 11 July 2016. http://www.academia.edu/download/40830972/BMJ.pdf.
  • Chang, A. Y., O. Ogbuoji, R. Atun, et al. 2017. “Dynamic Modeling Approaches to Characterize the Functioning of Health Systems: A Systematic Review of the Literature.” Social Science & Medicine 194: 160–167. doi:10.1016/j.socscimed.2017.09.005.
  • Checkland, P. 1995. “Model Validation in Soft Systems Practice.” Systems Research 12 (1): 47–54. doi:10.1002/sres.3850120108.
  • Chen, H., Y.-C. Chang, and K.-C. Chen. 2014. “Integrated Wetland Management: An Analysis with Group Model Building Based on System Dynamics Model.” Journal of Environmental Management 146: 309–319.
  • Cohen, M. D., J. G. March, and J. P. Olsen. 1972. “A Garbage Can Model of Organizational Choice.” Administrative Science Quarterly 17 (1): 1. doi:10.2307/2392088.
  • Coyle, G. 2000. “Qualitative and Quantitative Modelling in System Dynamics: Some Research Questions.” System Dynamics Review 16 (3): 225–244.
  • Cozzens, S., E. Woodhouse, et al. 1995. “Science, Government, and the Politics of Knowledge.” In Handbook of Science and Technology Studies, edited by S. Jasanoff, G. Markle, and J. Peterson, 533–553. Thousand Oaks, CA: Sage.
  • Dahler-Larson, P. 2011. “Organizing Knowledge: Evidence and the Construction of Evaluative Information Systems.” In From Studies to Streams: Managing Evaluative Systems, edited by R. Rist, and N. Stame, 65–80. New Brunswick: Transaction Publishers.
  • Dangerfield, B. C., Y. Fang, and C. A. Roberts. 2001. “Model-based Scenarios for the Epidemiology of HIV/AIDS: The Consequences of Highly Active Antiretroviral Therapy.” System Dynamics Review 17 (2): 119–150. doi:10.1002/sdr.211.
  • de Gooyert, V., and A. Größler. 2018. “On the Differences Between Theoretical and Applied System Dynamics Modeling.” System Dynamics Review 34 (4): 575–583. doi:10.1002/sdr.1617.
  • Eker, S., N. Zimmermann, S. Carnohan, et al. 2018. “Participatory System Dynamics Modelling for Housing, Energy and Wellbeing Interactions.” Building Research & Information 46 (7): 738–754. doi:10.1080/09613218.2017.1362919.
  • Elvbakken, K. T., and H. F. Hansen. 2019. “Evidence Producing Organizations: Organizational Translation of Travelling Evaluation Ideas.” Evaluation 25 (3): 261–276.
  • Fischer, F. 1995. Evaluating Public Policy. Chicago, IL: Nelson Hall.
  • Fleck, L. 1979 [1935]. Genesis and Development of a Scientific Fact. Chicago: University of Chicago Press.
  • Flitcroft, K., J. Gillespie, et al. 2011. “Getting Evidence Into Policy: The Need for Deliberative Strategies?” Social Science & Medicine 72 (7): 1039–1046.
  • Fontoura, W. B., G de LD Chaves, and G. M. Ribeiro. 2019. “The Brazilian Urban Mobility Policy: The Impact in São Paulo Transport System Using System Dynamics.” Transport Policy 73: 51–61. doi:10.1016/j.tranpol.2018.09.014.
  • Forrester, J. 1969. Urban Dynamics. Cambridge, MA: MIT Press.
  • Forrester, J. W. 1980. “Information Sources for Modeling the National Economy.” Journal of the American Statistical Association 75 (371): 555–566. doi:10.1080/01621459.1980.10477508.
  • Forrester, J. W. 2007. “System Dynamics – the Next Fifty Years.” System Dynamics Review 23 (2–3): 359–370. doi:10.1002/sdr.381.
  • Gibson, B. 2003. “Beyond ‘Two Communities’.” In Evidence-Based Health Policy: Problems and Possibilities, edited by V. Lin and B. Gibson, 18–30. Oxford: Oxford University Press.
  • Goertz, G. 2006. “Introduction to the Special Issue ‘Causal Complexity and Qualitative Methods’.” Political Analysis 14 (3): 223–226. doi:10.1093/pan/mpj016.
  • Größler, A. 2004. “A Content and Process View on Bounded Rationality in System Dynamics.” Systems Research and Behavioral Science 21 (4): 319–330. doi:10.1002/sres.646.
  • Haghshenas, H., M. Vaziri, and A. Gholamialam. 2015. “Evaluation of Sustainable Policy in Urban Transportation Using System Dynamics and World Cities Data: A Case Study in Isfahan.” Cities 45: 104–115. doi:10.1016/j.cities.2014.11.003.
  • Head, B. W. 2010. “Reconsidering Evidence-Based Policy: Key Issues and Challenges.” Policy and Society 29 (2): 77–94. doi:10.1016/j.polsoc.2010.03.001.
  • Homer, J., and R. Oliva. 2001. “Maps and Models in System Dynamics: A Response to Coyle.” System Dynamics Review 17 (4): 347–355. doi:10.1002/sdr.224.
  • Hoppe, R. 2010. “From ‘Knowledge Use’ Towards ‘Boundary Work’: Sketch of an Emerging New Agenda for Inquiry Into Science-Policy Interaction.” In Knowledge Democracy. Consequences for Science, Politics and Media, edited by R. In't Veld, 169–186. Heidelberg: Springer.
  • Hossain, M. S., J. Ramirez, S. Szabo, et al. 2020. “Participatory Modelling for Conceptualizing Social-Ecological System Dynamics in the Bangladesh Delta.” Regional Environmental Change 20 (1): 28. doi:10.1007/s10113-020-01599-5.
  • Jaen, S., and I. Dyner. 2008. “Criminal Cycles in the Illegal Drug Industry: A System Dynamics Approach Applied to Colombia.” In: 2008 Winter Simulation Conference, Miami, FL, . 1429–1436. IEEE. doi:10.1109/WSC.2008.4736220.
  • Jarynowski, A., M. Wojta-Kempa, D. Platek, et al. 2020. “Attempt to Understand Public Health Relevant Social Dimensions of COVID-19 Outbreak in Poland.” Society Register 4 (3): 7–44. doi:10.14746/sr.2020.4.3.01.
  • Jasanoff, S. S. 1987. “Contested Boundaries in Policy-Relevant Science.” Social Studies of Science 17 (2): 195–230. doi:10.1177/030631287017002001.
  • Jasanoff, S. 2004. States of Knowledge: The Co-production of Science and the Social Order. London: Routledge.
  • Karavezyris, V., K.-P. Timpe, and R. Marzi. 2002. “Application of System Dynamics and Fuzzy Logic to Forecasting of Municipal Solid Waste.” Mathematics and Computers in Simulation 60 (3–5): 149–158. doi:10.1016/S0378-4754(02)00010-1.
  • Kingdon, J. 1995. Agendas, Alternatives, and Public Policies. 2nd ed. New York: Harper Collins.
  • Kuchenmüller, T., L. Boeira, et al. 2022. “Domains and Processes for Institutionalizing Evidence-Informed Health Policy-Making: A Critical Interpretive Synthesis.” Health Research Policy and Systems 20: 27. doi:10.1186/s12961-022-00820-7
  • Lancaster, K. 2014. “Social Construction and the Evidence-Based Drug Policy Endeavour.” International Journal of Drug Policy 25 (5): 948–951.
  • Lebcir, R. M., R. A. Atun, and R. J. Coker. 2010. “System Dynamic Simulation of Treatment Policies to Address Colliding Epidemics of Tuberculosis, Drug Resistant Tuberculosis and Injecting Drug Users Driven HIV in Russia.” Journal of the Operational Research Society 61 (8): 1238–1248. doi:10.1057/jors.2009.90.
  • Lindblom, C. E. 1959. “The Science of ‘Muddling Through’.” Public Administration Review 19 (2): 79–88.
  • Liu, H., G. Benoit, T. Liu, et al. 2015. “An Integrated System Dynamics Model Developed for Managing Lake Water Quality at the Watershed Scale.” Journal of Environmental Management 155: 11–23. doi:10.1016/j.jenvman.2015.02.046.
  • Luna-Reyes, L. F., and D. L. Andersen. 2003. “Collecting and Analyzing Qualitative Data for System Dynamics: Methods and Models.” System Dynamics Review 19 (4): 271–296. doi:10.1002/sdr.280.
  • MacGregor, S., and A. Cooper. 2020. “Blending Research, Journalism, and Community Expertise: A Case Study of Coproduction in Research Communication.” Science Communication 42 (3): 340–368.
  • Majone, G. 1989. Evidence, Argument, and Persuasion in the Policy Process. New Haven, CT: Yale University Press.
  • Maluleke, G., and L. Pretorius. 2013. “A Systems Thinking Approach to Sustainable Manganese Mining: A Case for System Dynamics Modelling.” In: 2013 IEEE International Conference on Industrial Technology (ICIT). Cape Town, 1450–1455. IEEE. doi:10.1109/ICIT.2013.6505885. February 2013
  • Martin, R., and M. Schlüter. 2015. “Combining System Dynamics and Agent-Based Modeling to Analyze Social-Ecological Interactions – an Example from Modeling Restoration of a Shallow Lake.” Frontiers in Environmental Science 3. doi:10.3389/fenvs.2015.00066.
  • McLucas, A. C., and S. Elsawah. 2020. “System Dynamics Modeling to Inform Defense Strategic Decision-Making.” In System Dynamics: Theory and Applications, edited by B. Dangerfield, 341–373. New York, NY: Springer. doi:10.1007/978-1-4939-8790-0_657.
  • Meadows, D. 1999. Leverage Points: Places to Intervene in a System. Hartland, VT: Sustainability Institute.
  • Merton, R. K. 1973. The Sociology of Science: Theoretical and Empirical Investigations. Chicago, IL: University of Chicago Press.
  • Midgley, G. 1996. “What Is This Thing Called CST?” In Critical Systems Thinking, edited by R. Flood and N. Romm, 11–24. Boston, MA: Springer.
  • Midgley, G. 1997. “Dealing with Coercion: Critical Systems Heuristics and Beyond.” Systems Practice 10 (1): 37–57.
  • Nasirzadeh, F., A. Afshar, M. Khanzadi, et al. 2008. “Integrating System Dynamics and Fuzzy Logic Modelling for Construction Risk Management.” Construction Management and Economics 26 (11): 1197–1212. doi:10.1080/01446190802459924.
  • Newell, B., and K. Proust. 2012. “Introduction to Collaborative Conceptual Modelling.” Canberra: Australian National University Working Paper 20. https://openresearchrepository.anu.edu.au/handle/1885/9386
  • Newsome, I. M. 2008. “Using System Dynamics to Model the Impact of Policing Activity on Performance.” Journal of the Operational Research Society 59 (2): 164–170. doi:10.1057/palgrave.jors.2602454.
  • Nutley, S., I. Walter, and N. Bland. 2002. “The Institutional Arrangements for Connecting Evidence and Policy: The Case of Drug Misuse.” Public Policy and Administration 17 (3): 76–94.
  • Parkhurst, J. 2017. The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence. London: Routledge, Taylor & Francis Group.
  • Parkhurst, J., L. Ghilardi, et al. 2021. “Understanding Evidence Use from a Programmatic Perspective: Conceptual Development and Empirical Insights from National Malaria Control Programmes.” Evidence & Policy 17 (3): 447–466.
  • Pielke, R. Jr 2007. The Honest Broker: Making Sense of Science in Policy and Politics. Cambridge: Cambridge University Press.
  • Radaelli, C. 1995. “The Role of Knowledge in the Policy Process.” Journal of European Public Policy 2 (2): 159–183.
  • Randers, J. 1980. “Guidelines for Model Conceptualization.” In Elements of the System Dynamics Method, edited by J. Randers, 17–139. Cambridge, MA: MIT Press.
  • Richardson, G. P., and A. L. Pugh III. 1981. Introduction to System Dynamics Modeling with DYNAMO. Cambridge, MA: Productivity Press.
  • Roberts, N. H., D. F. Andersen, R. M. Deal, M. S. Grant, and W. A. Shaffer. 1983. Introduction to Computer Simulation: The System Dynamics Modeling Approach. Reading, MA: Addison-Wesley.
  • Rouwette, E. A. J. A., H. Korzilius, J. A. M. Vennix, et al. 2011. “Modeling as Persuasion: The Impact of Group Model Building on Attitudes and Behavior.” System Dynamics Review 27 (1): 1–21. doi:10.1002/sdr.441.
  • Senge, P. 1990. The Fifth Discipline: The Art & Practice of the Learning Organization. 2nd ed. New York: Crown.
  • Shaxson, L., I. Ahmed, D. Brien, et al. 2012. “Expanding Our Understanding of K* (KT, KE, KTT, KMb, KB, KM, etc.).” In: The K* Conference, Hamilton, Ontario, 88. United Nations University, Institute for Water, Environment and Health.
  • Shen, Q., Q. Chen, B. Tang, et al. 2009. “A System Dynamics Model for the Sustainable Land Use Planning and Development.” Habitat International 33 (1): 15–25. doi:10.1016/j.habitatint.2008.02.004.
  • Simon, H. A. 1955. “A Behavioral Model of Rational Choice.” The Quarterly Journal of Economics 69 (1): 99. doi:10.2307/1884852.
  • Simon, H. 1982. Models of Bounded Rationality. Cambridge, MA: MIT Press.
  • Smith, K. 2014. Beyond Evidence-based Policy in Public Health. Houndmills, Basingstoke: Palgrave Macmillan.
  • Stave, K. A. 2002. “Using System Dynamics to Improve Public Participation in Environmental Decisions.” System Dynamics Review 18 (2): 139–167. doi:10.1002/sdr.237.
  • Sterman, J. 2000. Business Dynamics: Systems Thinking and Modeling for a Complex World. Boston: McGraw-Hill.
  • Sterman, J. 2003. “System Dynamics: Systems Thinking and Modeling for a Complex World.” Working Paper Series. Cambridge, MA: Massachusetts Institute of Technology, Engineering Systems Division.
  • Sterman, J. 2006. “Learning from Evidence in a Complex World.” American Journal of Public Health 96: 505–514.
  • Stewart, E., and K. Smith. 2015. “‘Black Magic’ and ‘Gold Dust’: The Epistemic and Political Uses of Evidence Tools in Public Health Policy Making.” Evidence & Policy: A Journal of Research, Debate and Practice 11 (3): 415–437.
  • Teisman, G. R., and E.-H. Klijn. 2008. “Complexity Theory and Public Management: An Introduction.” Public Management Review 10 (3): 287–297. doi:10.1080/14719030802002451.
  • Teng, Y., N. Kong, and W. Tu. 2015. “Optimizing Strategies for Population-Based Chlamydia Infection Screening among Young Women: An Age-structured System Dynamics Approach.” BMC Public Health 15 (1): 639. doi:10.1186/s12889-015-1975-z.
  • Thompson, M., Y. Wei, C. Dunn, and C. O'Connor. 2019. “A System Dynamics Model Examining Alternative Wildfire Response Policies.” Systems 7 (4): 49. doi:10.3390/systems7040049.
  • Townshend, J. R. P., and H. S. Turner. 2000. “Analysing the Effectiveness of Chlamydia Screening.” Journal of the Operational Research Society 51 (7): 812–824. doi:10.1057/palgrave.jors.2600978.
  • UK Government. 2013. What Works: Evidence Centres for Social Policy. London: UK Cabinet Office.
  • Ulrich, W. 1994. Critical Heuristics of Social Planning: A New Approach to Practical Philosophy. Chichester: Wiley.
  • Ulrich, W., and M. Reynolds. 2010. “Critical Systems Heuristics.” In Systems Approaches to Managing Change: A Practical Guide, edited by M. Reynolds and S. Holwell, 243–292. London: Springer. doi:10.1007/978-1-84882-809-4_6.
  • Ulrich, W., and M. Reynolds. 2020. “Critical Systems Heuristics: The Idea and Practice of Boundary Critique.” In Systems Approaches to Making Change: A Practical Guide, edited by M. Reynolds and S. Holwell, 255–306. London: Springer London. doi:10.1007/978-1-4471-7472-1_6.
  • Viana, J., S. C. Brailsford, V. Harindra, et al. 2014. “Combining Discrete-Event Simulation and System Dynamics in a Healthcare Setting: A Composite Model for Chlamydia Infection.” European Journal of Operational Research 237 (1): 196–206. doi:10.1016/j.ejor.2014.02.052.
  • Walter, I., S. Nutley, and H. Davies. 2005. “What Works to Promote Evidence-Based Practice? A Cross-Sector Review.” Evidence & Policy: A Journal of Research, Debate and Practice 1 (3): 335–364.
  • Walton, M. 2016. “Setting the Context for Using Complexity Theory in Evaluation: Boundaries, Governance and Utilisation.” Evidence & Policy 12 (1): 73–89.
  • Weed, M. n.d. “Models and Methods to Analyse the Interaction of Evidence and Policy in the First 100 Days of the UK Government’s Response to COVID-19.” SocArXiv. doi:10.31235/osf.io/f73u4.
  • Weeks, M. R., J. Li, S. Liao, et al. 2013. “Multilevel Dynamic Systems Affecting Introduction of HIV/STI Prevention Innovations among Chinese Women in Sex Work Establishments.” Health Education & Behavior 40 (1_suppl): 111S–122S. doi:10.1177/1090198113490723.
  • Weiss, C. H. 1979. “The Many Meanings of Research Utilization.” Public Administration Review 39 (5): 426. doi:10.2307/3109916.
  • Wolstenholme, E. 1990. System Enquiry: A System Dynamics Approach. Chichester: Wiley.