‘That would break the containment’: the co-production of responsibility and safety-by-design in xenobiology

Pages 6-27 | Received 23 Feb 2020, Accepted 11 Jan 2021, Published online: 16 Feb 2021

ABSTRACT

Few studies of frameworks of Responsible Research and Innovation (RRI) have addressed the meanings and framings of responsibility in relation to scientific practice. Researchers in the emerging field of xenobiology aim to explore the non-natural biological world through the development of alternative genetic systems, promising that these systems will be made safe through built-in safety features (biocontainment). This article argues that approaches to responsibility partly shape the research agenda of a xenobiology laboratory. Conceptions of responsibility are co-produced with an imagination of the public as expectant of safety, and with concerns about whether users may modify technological artefacts. I suggest the term ‘technologies of compliance’ to reflect that safety and the acceptance of new technology are priorities in synthetic biology research and design. Based on ethnographic and interview data, this study provides an account of the association of responsibility with design principles, relevant for RRI frameworks that aim to align with scientific practice.

Introduction

Researchers in the emerging field of xenobiology, a subfield of synthetic biology, aim to reconfigure the genetic code and develop alternative genetic systems, based on nucleic acid (DNA) analogues. As they expand and redefine life, they put forward discourses about how to fit new organisms into society and the world, raising questions about how to deal with the unnatural. Proponents of xenobiology have promoted the discipline as (bio)safe – by incorporating mechanisms of biological containment (Herdewijn and Marlière 2009; Marlière 2009; Schmidt 2010) – and responsible. For instance, xenobiology ‘vanguard’ (Hilgartner 2015) Markus Schmidt commented in a videoclip about the first xenobiology conference, held in Berlin in 2014:

I see my role in the organization of this conference as to ensure that right from the beginning, the societal and ethical issues are taken into account when developing this field, so that the field is developing in a responsible and conscious manner.

References to responsibility and safety are widespread in synthetic biology, for example in the ‘scientific philosophy’ of gene drives researcher Kevin Esvelt, or in the report New Directions: The Ethics of Synthetic Biology and Emerging Technologies (Presidential Commission for the Study of Bioethical Issues 2010, 9): ‘A number of safety features can be incorporated into synthetic organisms to control their spread and life span. Surveillance or containment of synthetic organisms is a concrete way to embrace responsible stewardship.’ Moreover, the improvement of the safety locks of genetically modified organisms (GMOs), such as the genetic firewalls offered by xenobiology, was recommended by the working group on the Opinion on Synthetic Biology II: Risk assessment methodologies and safety aspects, conducted by the European Commission Scientific Committees in 2015. Markus Schmidt claims that xenobiology can constitute the ‘ultimate safety tool’ and, in the conclusion of his article, advocates for its responsible emergence (Schmidt 2010, 330): ‘We should not fear unfamiliar life forms but try to rationally judge their risks and benefits and embrace them in a responsible way for the benefit of humankind.’

The design of technological artefacts and sociotechnical systems is shaped by social dynamics and values (Bijker, Hughes, and Pinch 1987; Latour 1992; MacKenzie 1993). Artefacts have politics and embed forms of power, as they rule specific relationships between people and modes of action (Winner 1986). Biocontainment seems to contradict the principle that ‘technologies be built with a high degree of flexibility and mutability’ (Winner 1977, 326); it is based upon the rationale of restricting the range of possibilities of an artefact (or biological system). Biocontainment constitutes a mode of governance embedded in biological systems, translating risks and matters of concern into technical design problems (Hurlbut 2017).

This paper offers an account of the co-production of responsibility and safety-by-design (or biocontainment) in xenobiology and synthetic biology, looking into the practices and framings of xenobiologists working toward the goal of ‘separating’ xenobiological systems from nature. Xenobiology encapsulates a tension between biocontainment, responsibility and control – both in terms of controlling consequences and controlling access to a technology. At the core of such tensions are the design principles that supposedly make xenobiological organisms safe, which become a site for ascribing meanings to responsibility. Given the importance of responsibility and safety in synthetic biology (Schmidt et al. 2010), this paper analyses the forms of reasoning about technology development and responsibility embedded in biocontainment, and reflects on their implications for governance and responsible research and innovation (RRI). This is particularly relevant for frameworks of RRI that aim to orient the ‘right impacts’ of technologies in society (Owen, Macnaghten, and Stilgoe 2012).

Xenobiologists seek to engineer novel biological systems and biochemistries that differ from the canonical DNA–RNA–20 amino acid system, adding nucleic acid analogues, termed Xeno-Nucleic Acid (XNA), to the repertoire of nucleic acids DNA and RNA as carriers of genetic information. In 2014 the expansion of the genetic alphabet to incorporate a third base pair into the genetic code (Malyshev et al. 2014) won the people’s choice for Breakthrough of the Year in the journal Science, and was also among the runners-up for Breakthrough of the Year. Promised applications or products from xenobiology range from novel therapeutics, to bacterial strains resistant to bacteriophages, to novel biomaterials. Pioneers of xenobiology promote the field as a solution for biocontainment – an ‘ultimate safety tool’ (Schmidt 2010) – which would support the release of synthetic microorganisms in open environments, an important goal in genetic engineering (Cases and de Lorenzo 2005). Biocontainment has been articulated in two ways (Torres et al. 2016): first, limiting the survival of genetically modified organisms (GMOs) outside their intended area of application (e.g. using auxotrophy); second, limiting ‘horizontal gene transfer’ from the GMO to other organisms in the environment (through semantic containment, whereby a ‘genetic firewall’ is erected between synthetic and natural organisms). Theoretically, xenobiology ‘represent[s] the safest biocontainment mechanism possible’ (Wright, Stan, and Ellis 2013, 1230).

The environmental release of GMOs (outside laboratories or factories) has been the subject of controversy (Jasanoff 2005) and remains a topic of concern in synthetic biology (Marris and Jefferson 2013). Similarly to nanotechnology, synthetic biology offers an opportunity to earn public trust (Fisher and Mahajan 2006); a major concern in policy circles has been the public acceptability of synthetic biology (Marris and Calvert 2020). Debates and policies on synthetic biology have centred on an imaginary of public rejection, in which the public is imagined as fearful of synthetic biology and as potentially blocking the development of the field (Marris 2015).

In this paper I build upon previous conceptualizations of biocontainment as a form of governance that orders science–society relationships as a technical design problem, obviating the need for external oversight (Hurlbut 2017). I argue that a consequentialist approach to responsibility shapes and justifies the research agenda of biocontainment in xenobiology, based on design principles that assume control (and separation) of xenobiological systems – maintaining issues of safety as predictable and solvable through technological design. Questions about responsibility often get framed in terms of how to design safer organisms and how to test their safety. Adherence to a vision of biocontainment as a problem defined in advance limits flexibility and the pursuit of alternative pathways in xenobiology. I show that biocontainment depends not exclusively on design principles, but also on researchers maintaining control over the design and modification of xenobiological systems; otherwise, the containment would be broken. In showing that biocontainment depends on users and on commitments to funding agencies, I highlight the difficulty of steering the technological trajectory of biocontainment in xenobiology, and of introducing questions about the purposes of innovation and the extended responsibilities of scientists (such as environmental impact).

Given that design principles in safety-by-design are co-produced with an understanding of backward-looking responsibility (Robaey, Spruit, and Poel 2018), and assuming that biocontainment does the work of governance, I take a further step to suggest that an imaginary held by scientific and policy actors sustains the impression that addressing safety may earn public acceptance of synthetic biology; hence, biocontainment can be interpreted as a ‘technology of compliance’ – similar to the concept of ‘technologies of hubris’ (Jasanoff 2003) – expected to meet the perceived requirements of the public and institutions. This is anchored in a culture that emphasizes the safety of individual technologies, as if this were the only legitimate grounds for social concern about technology (Wynne 2001, 2002).

This paper is based on fieldwork I conducted for my doctoral dissertation during 2016 and 2017, consisting of 34 semi-structured interviews with xenobiologists, synthetic biologists, civil servants and members of civil society, in Europe and the US. Fieldwork also comprised participant observation over more than a year in a synthetic biology and xenobiology laboratory located in a world-leading British university; I facilitated five focus groups with members of the laboratory about the governance of technology. In addition, I analysed secondary sources and attended events related to xenobiology. In studying xenobiology as an emerging discipline, I do not provide sufficient granularity to distinguish between dynamics in the US and Europe; similar narratives and imaginaries shape the field in both regions.

The structure of this article is as follows. I outline a set of considerations about safety-by-design and social studies of risk, and their relevance for responsible research and innovation (RRI). Next I show that ideas about safety and responsibility are taken up as scientific and design problems, with repercussions for thinking about uncertainty and ways of imagining risks. Then I explain that responsibility and safety are shaped by external aspects such as controlling access to and uses of the technology, and design commitments. I then suggest thinking about biocontainment in xenobiology as a technology of compliance, which reflects ways of imagining the public (as fearful and sceptical of new technologies) and regulators (as demanding safety), and the expectations placed on xenobiological systems in governing science–society relations. I finish with a discussion of the implications of my argument: the co-production (Jasanoff 2004) of responsibility with scientific design and practice.

Responsibility and ethics in governance of technology

Frameworks for responsible research and innovation (RRI) are coevolutionary, building upon a trajectory of efforts that aim to introduce reflexivity, democratization and inclusivity into knowledge production, bridging the gap between science and society (Owen, Bessant, and Heintz 2013). RRI seeks to enable social learning and empower social agency (Stilgoe, Owen, and Macnaghten 2013), welcoming the participation of society in science and technology activities ‘upstream,’ and questioning the purposes of innovation (Stilgoe and Guston 2017). It is future-oriented and reflexive about the world-making attributes of technoscience (Zwart, Landeweerd, and van Rooij 2014).

RRI has been conceptualized as an integration of various dimensions – anticipation, inclusion, reflexivity, and responsiveness – that serve as a scaffold for embedding deliberation within innovation processes (Stilgoe, Owen, and Macnaghten 2013); it has, however, been institutionalized with different implementation criteria (i.e. public engagement, open access, gender, ethics, and science education in Horizon 2020) that promote the goal of ‘work[ing] together during the whole research and innovation process in order to better align both the process and its outcomes with the values, needs and expectations of society.’ The open-ended understanding of the meanings of RRI has allowed its uptake in policy circles, which poses challenges of implementation but also offers opportunities to experiment with practices and interpretations (Rip 2016). Whether the principles of RRI can shape and permeate the norms, discourses and structures of institutions in the EU remains to be determined (de Saille 2015).

RRI promotes a future-oriented understanding of responsibility as a collective process, broader than consequentialist ethics perspectives (Grinbaum and Groves 2013), and compatible with the distribution of responsibility in safety-by-design (Poel and Robaey 2017). Phil Macnaghten (2020, 12) suggests that RRI presents a prospective model of responsibility that

has moved attention away from accountability, liability and evidence towards more future-oriented dimensions of responsibility, encapsulated by concepts of care and responsiveness. It is these, we argued, that offer greater potential for reflection on the uncertainties and indeterminacies associated with novel science and innovation, as well as with the goals, purposes and ends of research.

Responsiveness involves mechanisms of reflection, anticipation and inclusive deliberation, as well as discussion concerning purposes, the acknowledgment of uncertainty, and steering the direction of research (Owen, Macnaghten, and Stilgoe 2012). As RRI seeks to move away from accountability, an overview of that approach to responsibility is necessary.

A narrative of ‘containment’ is a type of ethical arrangement in which, if actors avoid causing harm, research may continue (Rip 2014). Responsibility in science governance has been concerned with the ‘products’ of science and innovation, portrayed as accountability – focused on assigning responsibility, and blame, to an agent from a moral perspective (Nissenbaum 1996). Responsibility has been discussed from a consequentialist perspective: questions about liability and blameworthiness are asked after some undesirable event has occurred. Consequentialism is result-oriented: whether an act is morally right depends on the consequences of the act or of something related to it. Such framing is adequate for engineering contexts, where the focus is problem solving and preventing accidents, not allocating blame (Doorn 2012). However, consequentialism does not provide a reason for the authoritativeness of morality or of the decisions being made (Hurley 2009). In the presence of uncertainty, or the impossibility of calculating risks, consequentialist criteria are not applicable (Grunwald 2012, 160). Responsibility ascriptions rely on the predictive capabilities of the individual; had the individual not foreseen negative consequences, it is questionable whether blame would be attributed. Another limitation concerns the ‘tyranny of small decisions’ (Kahn 1966), according to which consequences (or events) lead to further consequences, unknowable by the individual who caused the initial events. For Swierstra and Rip (2007), consequentialism has three further limitations. First, it can narrow discussion of technological alternatives, by orienting debates toward the negative and desirable consequences of technologies. Second, it promotes a calculative analysis of whether benefits exceed risks, although it is not always clear what the benefits are, or who may benefit. Third, it encourages the tendency of individuals to avoid assuming responsibility and instead blame the complexity of political institutions and organizational settings.

In this paper I focus on the limitations present in xenobiology for considering responsibility in a prospective and reflexive manner, and the implications for the conceptualization of responsibility in RRI, since a consequentialist approach to responsibility is common in emerging technologies.

Governance of risk

Biocontainment, or biological containment, which for the purposes of this paper I consider a form of safety-by-design, also represents a form of managing the risks and uncertainties inherent to biological systems. Risk governance should be understood as decision-making in conditions of ignorance, rather than as developing better predictions and aiming to know all possible unintended consequences (Collingridge 1980). High-risk technologies have become ubiquitous and accidents resulting from technology have become ‘normal’ (Perrow 1984). As a response to such challenges, and to the controlling hegemony of Modernity that engineering institutions aim to achieve (Stirling 2019), risk governance has aimed to provide orientations on how to deal with the complex nature of risks and the challenges of disciplining them, balancing the need for action against incomplete knowledge and uncertain events (Aven and Renn 2010; Rosa, Renn, and McCright 2014). Risk governance involves social and physical dimensions that cover not only the addition of safety layers to sociotechnical systems, but also public values, perceptions of risk, decision making and the organization of operating systems (Renn 2008). For van Asselt and Renn (2011), risk governance is a hybrid of an analytical frame and a normative model: how decisions are made, and a model for improving decision-making structures and processes. Risk governance and RRI can be mutually supportive and share components and processes, with the former offering insights into how to put principles into practice, and the latter setting normative goals (Florin 2019).

On the other hand, risk governance constitutes ‘a way of thinking about the future’ (Burgess 2016, 12) in a modern, ‘runaway world’ that has a pronounced tendency to surprise us (Giddens 2000). Emerging technologies constitute complex systems, which may produce new information over time – increasing their uncertainty (Driebe and McDaniel 2005). Brian Wynne (1992) draws attention to the limitations of knowledge in open systems, with properties that cannot be controlled within the boundaries of a technological artefact. Ignorance, or ‘not knowing what we don’t know,’ is endemic to knowledge production. Controlling and taming such systems is impossible, and established concepts of risk and uncertainty are not adequate (Funtowicz and Ravetz 1990). Scholars have conceptualized beyond risk and uncertainty, to include ambiguity and ignorance: states of knowledge, or ‘incertitude,’ that evade risk calculations and involve other considerations, such as the quality of knowledge (knowing-how), the framing of problems and the multiple courses of action that come into play (Funtowicz and Ravetz 1990; Stirling 2008, 2012; Felt and Wynne 2007).

Safety-by-design and responsibility

Safe-by-design technologies constitute a way of handling risk that is not sufficient to deal with high scientific uncertainty (Poel and Robaey 2017). Technologies that involve environmental release, like synthetic biology and gene drives, require broadening methods of risk governance to address the balance of risks and benefits, social values, uncertainties, and the consideration of alternative innovations (Stirling, Hayes, and Delborne 2018). A steering force is required for research on approaches to safety-by-design in biotechnology (Robaey 2018). Asin-Garcia et al. (2020) underscore two considerations of safety-by-design for risk governance. First, the practical limitations of testing safeguards: experimental methods and metrics (i.e. escape frequency and fitness) are adapted to laboratory conditions, not to field release, nor to the effects of evolution and the environment. Second, the ubiquity of evolution: synthetic organisms have a tendency to evolve, overcoming safeguards; or ‘life finds a way,’ the memorable quote uttered by the character Dr. Ian Malcolm in the 1993 science fiction film Jurassic Park.

Safety-by-design has been further conceptualized as a social process. For Christopher Kelty (2009), safety-by-design was positioned in nanotechnology as a practice of responsibility; he describes how nanoscientist Vicki Colvin convinced her peers that safety (framed as avoiding toxicity) should be a property of nanomaterials. The question became how materials can be engineered to be safer, and what data support safety claims. According to McCarthy and Kelty (2010), responsibility was formulated as a legitimate scientific endeavour – turned into a scientific problem, or ‘made do-able’ – for the nanotechnology community to adopt it.

For Schwarz-Plaschg, Kallhoff, and Eisenberger (2017), safety has been instrumentalized as an enabler of innovation, a way of overcoming regulatory obstacles. They question its association with a linear model of innovation; knowledge production and product development involve incomplete knowledge at certain stages. For them, safety is a relational category, not a property to be added to products. Safety as an attribute of a product also depends on public perceptions of food and genetic engineering (Penders 2011). Van de Poel and Robaey (2017), addressing the implications of built-in safe technologies for democratic governance, take issue with safety-by-design for being undemocratic and for designing out indeterminacy – the notion that safety depends on the behaviour of actors in the value chain, such as users or operators; they argue that this decreases the ability to deal with unexpected or unknown risks. For Robaey, Spruit, and Poel (2018), assuming that researchers bear special responsibility for safety in technological development raises particular issues. They see as problematic the assumption that safety issues can be addressed and controlled in the R&D phase, which presumes that such issues can be known in advance, without considering how artefacts or materials may be used. Second, they ask whether other actors should assume blame if adverse events occur. They extend responsibility to owners of technology as ‘responsible experimenters,’ even though owners or users may not be fully aware of the consequences of a technology or artefact. Safety-by-design also raises questions about the degrees of responsibility allocated to the stakeholders involved, and whether some groups or actors bear greater responsibility (Bouchaut and Asveld 2020).

Regarding the governance of biocontainment, Sam Weiss Evans and Megan Palmer (2018) classify biocontainment in gene drives as an adaptation strategy to the regulatory requirements of emerging technologies, focusing attention on the (social) problems that need solving rather than on whether the technology should be developed. Ben Hurlbut (2017) conceptualizes biocontainment as a framework of governance codified into biological systems themselves; scientists determine design choices, restricting the range of questions to be asked about unintended outcomes of genetic engineering, while assuming that benefits are unlimited. Hence, intended release is legitimate if proven safe, within a narrow imagination of what can go wrong. In his words (82),

The idea of containment thus traded on the notion that risk could be treated as an engineering problem, and that the difficulties of legislating and enforcing mechanisms of risk management could be overcome by incorporating law and order into biotechnology itself. In figuring containment as the sole problem of governance, and biological containment as the most reliable mechanism to address it, governing science became a task of engineering organisms.

So far I have highlighted the implications of safety-by-design and biocontainment for governance and democracy in science, as more than strategies for keeping risks under control. In the next section I show that design principles shape ethical choices, and are shaped by a consequentialist approach to responsibility.

Responsibility as designing for safety

The laboratory I studied conducted research in xenobiology and synthetic biology, with projects such as the construction of an orthogonal genetic replication system (OGRS) – a system to replicate XNA autonomously inside the cell; the OGRS was shaped by the expectation of achieving built-in safety, by separating the unnatural from the natural biological world. In dictating the parameters under which synthetic organisms must be constructed, xenobiologists invoke ideas and imaginaries about the place of the unnatural in society. Interviewee Susan, having recently completed her PhD in the US, explained that among the questions that guided her research was ‘can we isolate GMOs or engineered organisms genetically from the rest of the world?’ Synthetic biology lecturer Richard, referring to what funding bodies value about xenobiology, commented: ‘one of our major pitches [for grant application in the EU] is that by creating XNA-based life, you can separate things. Separate genetic information.’

The OGRS is part of a larger vision about the research agenda of xenobiology and its association with safety (Herdewijn and Marlière 2009). In this regard, Markus Schmidt specified 10 design criteria for the development of biocontained XNA-based organisms (Schmidt 2010, 328). The first specifies that ‘Xeno-organisms must not loose [sic] their auxotrophic character.’ This relates to the second principle, not creating an XNA metabolism (a theme I address below): ‘Natural organisms must also not be able to produce these essential biochemicals, to avoid a symbiotic relationship with XNA.’

The design principles for biocontainment that Schmidt and other pioneers of xenobiology have suggested (cf. Marlière 2009; Herdewijn and Marlière 2009; Torres et al. 2016) follow a rationale established at the Asilomar Conference of 1975 on Recombinant DNA (Berg et al. 1975), an influential conference organized by scientists to discuss the potential biohazards and regulation of genetic engineering (cf. Wright 1994; Hurlbut 2015). Ben Hurlbut (2017, 79) writes that biocontainment

builds law and order into the living systems themselves, thereby engineering out reliance on brittle rules and fallible human practices. This is not only a technical strategy, but also a jurisdictional one. By rendering governance ‘intrinsic’ to the living entity itself, its designers can claim to have eliminated any risk of losing control and thus any need for external oversight.

Synthetic biologists, on the contrary, focus on Asilomar’s legacy as design principles, not its vision of governance. For example, Schmidt and de Lorenzo (2012, 2199) write, ‘Asilomar laid the foundation for most of the biosafety measures in place today for both biological and physical containment. Inter alia, the conference expressed that containment should be made an essential consideration in any experimental design.’ This remains a lasting vision, as researchers from the laboratory of Farren Isaacs write in their article on the genetic recoding of bacteria (Rovner et al. 2015, 89): synthetic biology has produced novel organisms ‘necessitating the development of safety and security measures first outlined in the 1975 Asilomar conference on recombinant DNA.’ They add that developing biocontainment features ‘remains a defining challenge.’

Members of the laboratory I studied worked on a diverse set of projects in xenobiology that followed the design principles of separation mentioned above. In a laboratory meeting, researcher Greta presented advances on the engineering of an orthogonal genetic replication system (OGRS) based on XNA, with the purpose of providing a mechanism for biocontainment through a genetic firewall and auxotrophy (she showed a diagram of two cells, one using DNA and the other using XNA, without communicating with each other). In another meeting, James presented preliminary results of the engineering of an enzyme required for the development of the OGRS, stating in a slide that the ‘aim/purpose’ of the OGRS was achieving ‘biosafety, redundancy within the system.’ During the laboratory meetings, the importance of biocontainment as a research goal was taken for granted; questions were about which pathways would lead to it. Assumptions about responsibility are embedded in something as apparently simple, or technical, as designing components of an OGRS. Design principles of separation anticipate consequences and ways to manage them, as well as what applications are enabled. This is aligned with a consequentialist view of responsibility, common in emerging technologies. This raises the question of how researchers in the life sciences think about design and responsibility. During a discussion with participants of the laboratory in their weekly meetings, I asked them about their perspectives on responsibility in science. A participant replied,

Responsibility is ambiguous. So, if you do something irresponsible, then something goes wrong, then it’s yours to blame. Which I know that is not the meaning that RRI is developing. […] Everything you do has a consequence, so you should be able to think or plan to the consequence before you do what you set yourself to do.

Another participant commented, ‘the idea of blame is that something will happen that you haven’t thought of.’ Scientists seek to develop xenobiological systems that will not get out of control, and that perform within a narrow set of possibilities – a rationale of containment. This was widely shared by the scientists I interviewed, including Susan, who considered responsibility as ‘doing no harm to the environment.’ Although unusual among the responses I received, responsibility also meant for her the ‘broader sense of what is our input with this project to a good society’; however, these themes were not addressed in her research project.

The design of xenobiological systems as the ‘ultimate safety tool’ (Schmidt 2010) raises epistemological questions about the safety of such systems and how to ensure that they will behave as predicted. Concern over controlling consequences leads to a preoccupation with determining the safety of xenobiological systems, but, at the same time, involves a division of labour in which scientists are responsible for ensuring that design principles are correctly embodied, and for the tests that determine safety, but not for unintended consequences (such as environmental toxicity). Members of the laboratory were aware of the limitations of knowledge (uncertainty) and control inherent to biocontainment. During the discussion about responsibility introduced above, a participant added: ‘nobody knows everything. So you can only predict the consequences as far as your knowledge, and if your knowledge is incomplete, then your understanding of the consequence is incomplete. And that becomes an unknown consequence.’ The consequentialist approach to responsibility does not mean that unknown unknowns are not acknowledged, but it may delineate discussions within the limits of knowledge. A participant commented, after I read the group a text (Dana et al. 2012) about the risks that synthetic organisms pose to the environment: ‘the only way to prove there is no risk, is by testing every organism on every environment, on every combination, which is unworkable.’ Nevertheless, researchers are aware of the impossibility of knowing all consequences and their impacts; for example, professor Joe commented,

biology is extremely versatile and flexible, and even if you put something into place, and works in a lab, there is always going to be a chance that something else evolves.

Addressing the limitations of testing, he added:

we have the opportunity to create new molecules which don’t exist in nature. And one of the things that I worry about is that we will create molecules which are very very toxic, and I’m thinking particularly of the potential exposure of researchers to that. […] If I create a new molecule, and it’s super toxic, it’s going to be difficult to find out until you really find out.

As biocontainment is framed as a design challenge, to be solved in the laboratory, questions then become about ensuring that the design principles are properly incorporated (and ‘work well’), rather than about how to act in conditions of uncertainty. The use of technical language to address such issues limits discussions about safety (and uncertainty) and values in biotechnology. As McCarthy and Kelty (2010, 426) write, ‘[t]he language of risks dominates the discussion both inside and outside of [nanotechnology institutions]; it is a language that structures what is possible to think and do.’ In a radio interview about the then recently published work on semi-synthetic organisms (Mandell et al. 2015) by researcher Dan Mandell (in the laboratory of George Church), science journalist Ira Flatow asked Mandell about the possibility of organisms ‘escaping’ their containment, or whether they could ‘acquire the changes necessary to live out there in the wild’ (against which Mandell and colleagues had engineered auxotrophy, based on the need for non-standard amino acids for survival); Mandell answered:

We tested them in three critical ways. The first is to make sure that they can’t escape by mutating, and to test that, we grew up about a trillion different bacteria in the presence of the synthetic amino-acid, and then we take it away. We observe that none of them could survive, meaning that they can’t escape the dependency by mutating their genomes, even when they’re grown into large numbers of cells.

Flatow seems to question the degree of certainty that scientists confer on their creations, and whether it is possible to anticipate and control the unexpected. For Mandell, the question is rather what type of tests can provide evidence of achieving control; the environment is reduced to the controlled conditions of the laboratory. These questions cannot be answered without ‘leaving the lab’ and testing in the field (Gross 2010, 169). When I addressed this point in a discussion with the laboratory I studied, participants commented that, while acknowledging the inherent uncertainty of biological systems, more safety layers could be incorporated.
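Mandell’s trillion-cell experiment also illustrates a statistical limit on what such tests can show. As a hedged illustration (my own back-of-the-envelope reasoning, not a calculation reported by Mandell or the laboratory): observing zero escapees among $n$ cells can only place an upper bound on the escape frequency $p$; by the standard ‘rule of three’ for 95% confidence,

\[ p \lesssim \frac{3}{n} \approx \frac{3}{10^{12}} = 3 \times 10^{-12} \ \text{per cell.} \]

A test on ‘about a trillion’ cells can therefore show that escape is rarer than a few events per trillion cells under the conditions tested, but it cannot demonstrate that escape is impossible, nor say anything about conditions outside those tested.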

Conversely, even though researchers are aware of the need to test safety, they are constrained in their capacity to address these concerns. Inherent limitations to addressing responsibility in the life sciences leave open the questions of who is responsible for conducting long-term studies that verify safety, and who is responsible for funding them (Asin-Garcia et al. 2020). As the group leader (GL) explained in a discussion, the grants for XNA that the laboratory had received did not include additional funding for conducting safety studies, but he would be enthusiastic about such experiments if he had the funds and required resources.

A consequentialist approach to responsibility shapes the type of research and experimentation conducted in the (xenobiology and synthetic biology) laboratory; when designing biosafe organisms, following design principles is prioritized. In turn, experiments and knowledge production in the laboratory influence what is understood and enacted as responsibility. Xenobiologists make promises of biocontainment because the type of research they conduct in the laboratory – mostly based on molecular biology and organic chemistry – allows them to construct biocontained organisms, insofar as safety is defined as establishing ‘genetic firewalls.’ So far, I have addressed the intricate relationship (or co-production) between design principles in xenobiology (i.e. the parameters of the OGRS), the orientation toward consequences that this allows, and the type of questions this orientation entails. While it is laudable that xenobiologists aim to design safe xenobiological systems and to anticipate their potential downsides, this occurs by favouring a set of questions and approaches to responsibility within the limits of technological design. Next, I show that decisions about biocontainment are not only a matter of design principles, but also choices contingent on the range of actors and institutions that participate in the processes of shaping innovation.

That would break the containment

The laboratory I studied sought to develop components (i.e. enzymes) of an orthogonal genetic replication system (OGRS); however, such plans did not include developing features that would enable an XNA metabolism. The group leader (GL) explained to me in an interview that a colleague had asked him at a conference about creating the XNA metabolism, ‘the machinery for the cell to make the XNA [nucleotides] in vivo. From building blocks that are easily accessible to the cell. So essentially create the XNA metabolism’ (note that the synthesis of XNA is a different process from the replication of XNA). The colleague was interested in the XNA metabolism as a research problem that would push the limits of what is possible in biology and could lead to prestigious publications. However, the GL declined this possibility, because the laboratory had already committed to developing biocontainment features (and, in my view, it would deviate from Schmidt’s principles listed above). According to the GL, ‘on multiple conversations, they just cannot get the fact that if I do that, then I’m weakening the containment, which is exactly what I’m building the technology for.’ The GL further explained that colleagues probably were not aware that ‘[the XNA metabolism] is a commitment already made, that underlies the research agenda of xenobiology. I cannot build a bridge that I don’t want to build, because that then undermines everything else I’m doing.’ The commitment to the design principles of biocontainment is the result of perspectives about responsibility and expectations about the role of scientists – expanding biology within the limits of what is safe – and of limitations on the range of questions about purpose and technological pathways that can enter the laboratory.

The GL further explained to me that using ‘the resources that the funding bodies have given me for containment, to undermine containment … there’s an ethical issue there’ (emphasis added). Besides a ‘defining challenge,’ biocontainment seems for the GL to acquire a moral dimension; its importance goes beyond fulfilling promises to funding agencies, or following design principles. The ‘ethical issue’ could be interpreted in the sense of a moral duty to develop a technology that will not be harmful or wreak havoc. This reflects the priority of containment for the GL: it means more than a research goal, beyond a concern over being reprimanded by granting agencies if promised results were not delivered on time.

The commitment to not ‘building bridges’ reflects that perspectives about responsibility – and the associated design principles – are determined not only by the control of the possible outcomes of xenobiological systems (i.e. how they might perform in the world), but also by the extended network of actors that may use or modify the technology. For instance, Ahmed, a doctoral student, commented on how responsibility is collectively determined by scientists themselves: ‘[scientists as a group] bear the responsibility for what they do, and the response they get from their peers usually guides scientists as to what’s acceptable [and] what’s not acceptable.’ Responsibility is shaped by funding bodies, colleagues, and the anticipation of the actions of scientists or users, in addition to commitments to technological trajectories, like containment. Expectations about how scientific work should be conducted and financed shape attitudes towards responsibility.

Furthermore, biocontainment requires controlling the possibility of modifying the technology. Some projects related to xenobiology in the laboratory I studied were funded as part of a consortium of research laboratories; hence decisions regarding access to XNA materials and tools depended on collaborators in the consortium. The GL explained to me that the technologies being developed could be licensed (as such technologies were planned to be patented), insofar as licensees would commit to ‘not develop[ing] a way of metabolizing those [XNA] nucleotides. [Licensees would] accept that you will not develop a way of manufacturing those nucleotides in vivo, because that would break the containment’ (emphasis added). If the biocontained (and designed) system could be modified, then it would not be sufficiently biocontained. In this way, biocontainment encompasses a commitment not only to design principles, but also to the context in which the technology or artefacts may be used or modified. Biocontainment is also a regulatory device, incorporating proprietary ‘rights’ in the artefacts themselves (Van Dooren 2007). The enforcement of such proprietary rights depends on practices and agreements that cannot be addressed through design principles alone.

A similar idea emerged when I asked participants in the laboratory whether biocontainment would expand access to biotechnology for non-experts, by virtue of its safety; their response was that putting xenobiological tools in the public domain would open the possibility of modifying the principles that sustain containment. This has been referred to as ‘designing out indeterminacy’ of users – reducing the consequences of human behaviour – which may appear undemocratic as it leaves the shaping of technology (and the values therein) in the hands of scientists (Poel and Robaey 2017). Even if access to the modification of xenobiological systems is restricted, it is not possible to ensure that these systems will be used in predetermined ways, or follow a script (Wynne 1988); users determine how technologies perform and can tweak them in ways that the original creators could not anticipate.

Jasanoff and Kim (2009) remind us of the many layers that containment (in a broad sense) has involved throughout the history of nuclear power in the US: the containment of the increasingly ‘world-destroying capabilities’ of the user and developer (scientists and the state), as well as the challenge of the ‘taming of fear’ of a citizenry concerned about a hazardous technology, which remained suspicious about the possibility of users containing themselves – an interplay that complicated the transition towards peaceful nuclear power generation. The idea that biocontainment depends on the containment of users is reminiscent of a Mode 1 organization of research, one of whose main features is the role of the expert, working with well-defined problems or objectives (Gibbons et al. 1994). Paul Rabinow and Gaymon Bennett (2012, 132) reflect on the logic of Asilomar as framing biocontainment as a problem not only of preventing accidents, but of ‘distinguish[ing] between good and bad actors and intentions.’ Safety is established not only in design but also in use and misuse (Robaey, Spruit, and Poel 2018). As Emma Frow (2020, 1055–1056) writes,

core governance questions in synthetic biology are expanding from a biosafety-oriented focus on containing the objects and organisms of synthetic biology research, to practices concerned with defining and managing groups of concern who threaten the pursuit of innovation in various ways.

Moreover, a commitment to biocontainment may also ‘contain’ questions about the purposes of innovation, keeping them from entering the research agenda of xenobiology, as problems have already been laid out as engineering problems. As Ben Hurlbut (2017, 80) asks, ‘if risk could be contained to the laboratory, why should the wider public be involved?’, adding that ‘by imagining risks as problems of containment, and containment as primarily a matter of engineering safe biological systems, the range of questions that could be asked about the potential for harm was significantly truncated’ (82). In this section I have drawn attention to considerations that restrict the range of questions, beyond safety, that can be asked about the pathways pursued in the research agendas of xenobiology. As I show next, safety is thought to lead to public trust and acceptance of new technologies, so that innovation is not truncated.

Biocontainment, a technology of compliance

In what follows, I suggest that actors (e.g. scientists, regulators, policymakers) consider that the safety biocontainment offers could help attain public acceptability of new biotechnologies, providing a rationale for the support of biocontainment in xenobiology. I provide a brief overview of the construction of the public in scientific and government circles to situate the ensuing discussion. The lack of trust in governmental and scientific institutions in modern Western societies has been attributed to the consolidation of epistemic and institutional patterns of thinking that have excluded the normative dimensions of risk that shape innovation processes, or the narrowly scientific ‘framing’ of risk issues (Felt and Wynne 2007; Wynne 2001). Such institutionalization of risk has been suggested to contribute to recent controversies. Writing about GMOs, Grove-White (2006, 173) describes the presumptions:

Relevant concerns about particular GM constructs should only be understood in terms of specific definable ‘risks’, that the only relevant knowledge gaps or uncertainties are those already specifiable in agreed scientific terms, and that when no such specifiable ‘risks’ (or ‘uncertainties’) can be identified through available scientific knowledge, there are no reasonable grounds for preventing particular new GM artefacts from being released more widely. (emphasis original)

STS commentators have also examined factors shaping people’s attitudes towards science and technology; publics are often characterized as displaying irrational fears toward the risks of emerging technologies, but this construction of the public as inherently sceptical and risk-averse has been discredited, as publics are concerned about normative issues and ‘unknown unknowns’ (Irwin and Wynne 1996). Public attitudes toward science and institutions have been found to be ambivalent, with both enthusiasm and antipathy toward technology, as a way of making sense of the power relations built into the development of new technologies (Kearnes and Wynne 2007). Arie Rip (2006) coined the term nanophobia-phobia to capture how researchers in nanotechnology project a public fearful of nanotechnology – an exaggerated interpretation of public concerns, absent empirical support or data about public views. In other words, the concern of scientists is that the public would reject nanotechnology. More recently, Claire Marris (2015) built upon this conceptualization to suggest a similar phenomenon in synthetic biology, where fear of the public’s fear of synthetic biology is present in policy and scientific communities – public attitudes are seen as an obstacle to the benefits of synthetic biology, despite decades of social science research claiming the contrary.

I suggest that responsibility, mobilized as the safe use of biological systems and as a technical challenge, is thought to lead to the adoption and acceptance of novel biotechnologies, responding to the expectations of an imagined public (Marris 2015). As synthetic biologist Marlon put it in an interview (further examples follow below):

They’re [scientists] aware of the need for that [biosafety] because of legislation, because of what the public might think, and also nongovernmental organisations who maybe wanting to lobby against the kind of technology. So, unless we are deliberately designing all these safety features in, right from the start, then we’ll not be able to satisfy all these people.

The responsible, safe side of synthetic biology is presented as a necessity for the development of the field, even with a sense of urgency. In their review of biosafety practices in synthetic biology, Moe-Behrens, Davis, and Haynes (2013, 2) wrote about the importance of responsibility for ensuring the progress of the field:

If synthetic organisms and their derivatives are to become as ubiquitous as electronic devices, then synthetic biologists must openly address the responsible and safe use of synthetic biological systems. We can assuage fear and foster familiarity with synthetic biology through effective efforts to inform the public of the actual risks of synthetic biology research, the steps we can take to address the risks, and how this technology can be harnessed to meet society’s needs. (emphasis added)

A similar stance is present in the introduction of an article published in Nature by the laboratory of synthetic biologist George Church (Mandell et al. 2015, 55):

In order to protect natural ecosystems and address public concern it is critical that the scientific community implements robust biocontainment mechanisms to prevent unintended proliferation of GMOs. (emphasis added)

William, a synthetic biology professor, commented on whether there was interest in built-in safety approaches:

I can tell you that having had conversations with people from multiple companies, that it is very high in their agenda to engineer in ways that show that they are taking into account safety and security. […] they want to show that they’re doing something, right? also is useful for them in terms of helping them with regulators, with governments, who might say, look at this before you approve it, with customers, with investors, if they could show it does something [for safety].

These are examples of how the vision of biocontainment is driven by considerations beyond incorporating safety features. Responsibility, as safety, can be interpreted as a ‘language’ for addressing the concerns of a public perceived to be sceptical of biotechnology. Scientists and policymakers imagine the public as fearful of synthetic biology and expectant of safety, in part due to misinterpreting past controversies about science and technology in society (Rip 2006; Smallman 2018; Marris 2015). This was further exemplified when participants, asked to reflect on public attitudes toward biotechnology, often invoked the GMO controversy in Europe in the 1990s as a path to avoid. Postdoctoral researcher Olivia commented: ‘you don’t want [synthetic biology] go down the path of the GMOs, you don’t want this technology for people to be scared of, and then just push it on the side.’ And Ferdinand, a group leader, when asked what responsible research involved, answered by joining the GMO controversy with responsibility:

Common sense will tell you that you should avoid doing crazy things. I think that scientists are alienated by some reactions, particularly synthetic biologists are quite alienated by some reactions in connection to GMOs and all these things […] one wonders whether this focus on governance is connected to a public demand that, you know, you can trace down protests to GMOs and things like that, rather than a genuine necessity to do it. (Emphasis added)

It is noteworthy that Alfred, a member of the laboratory I studied, commented on the importance of biocontainment:

You need to ensure that when you have a GMO, whatever genes you add in there are not going to escape into the environment and spread to other organisms. Which is one of the reasons why GMOs are so looked down upon in Europe […] people have this view that if you put a gene into a plant it’s going to spread all over the environment and it’s going to cause problems. (emphasis added)

I propose that xenobiology serves as a ‘technology of compliance’ to reflect an imaginary that assumes that design principles could satisfy the (imagined) expectations of the public and regulators about safe technologies, and navigate a complex science–society landscape by means of technological design. For example, in the radio interview mentioned above, Ira Flatow asked STS scholar David Guston, ‘while those safeguards may reassure a scientist, will they convince people who are sceptical of genetic engineering that these organisms are truly safe?’ Guston replied at length that genetic safeguards were not sufficient to gain public confidence around genetic modification and the full agenda of synthetic biology. Dan Mandell, on the other hand, referred to the benefits of genetic engineering, and addressed a point Guston made about the possible impacts of releasing synthetic organisms in the environment, while maintaining the conversation as a scientific problem, within the boundaries of the system:

[…] we definitely want to be careful about spreading around the synthetic amino acid which we are trying to use as sort of a safety key. One possible way of doing this is to load up the bacteria with the amino-acid ahead of time, and they would survive for about a generation or so. Let them do their work for about an hour, at which point they would die. So we would basically need to replenish the bacteria, but of course, they are extremely cheap, because they produce themselves. We could also think about using amino acids that can be degraded or destabilized in various ways. […]

The term ‘technology of compliance’ seeks to capture the presentation of a technology as a license to operate in the world. It describes an approach to governing and ordering the relations between science and society, as biocontainment complies with a particular imagination of the expectations of societies about new biotechnologies. It relates to what Sheila Jasanoff (2003) has referred to as ‘technologies of hubris,’ which seek to reassure the public about new technologies with overconfidence in controlling uncertainty and ambiguity, as well as by pre-empting political discussion and participation. Jasanoff advocates the opposite: a change in the culture of governance and in the modes of interaction between stakeholders, ‘to make apparent the possibility of unforeseen consequences; to make explicit the normative that lurks within the technical; and to acknowledge from the start the need for plural viewpoints and collective learning’ (Jasanoff 2003, 240).

The term technologies of compliance suggests the intermediation of technological artefacts in maintaining science–society relations. As biocontained synthetic organisms may be released into the environment, scientists step into the messy landscape of governance through these artefacts and the ways they have been designed. If the attributes of an object are perceived as safe and responsible, then the creators of the object must be equivalent: responsible scientists. The rationale of biocontainment, and its origin in the Asilomar Conference of 1975, suggests that it could inhibit public discussions about the purposes and ends of innovation (Hurlbut 2015): if scientists ‘comply’ with (or satisfy) what they perceive as the expectations of the public by developing safe xenobiological systems, this could diminish the need for scientists to engage with the public and address the social challenges, and associated concerns, that biotechnology could raise or solve. However, the concept may bring attention to science–society relations that need more integration, including broadening the participation of the public in steering the trajectories of technological development.

Conclusion

In this paper I have analysed the emerging field of xenobiology, oriented toward the ‘exploration’ of the unnatural biological world and the development of alternative genetic systems. Markus Schmidt (2010) has suggested embracing the unnatural in a ‘responsible way’; but what meaning does responsibility acquire in xenobiology, with its promises of biocontainment? Matters of risk and safety are also matters of power and authority. I have suggested that a research and design agenda in xenobiology is co-produced with a consequentialist form of responsibility, and I have outlined the limitations this poses for a broader approach to prospective responsibility; the terms in which responsibility is conceived and practiced in xenobiology offer valuable lessons for RRI.

While the implications of safety-by-design for science and democracy have been recognized (Hurlbut 2017; McCarthy and Kelty 2010; Robaey, Spruit, and van de Poel 2018), I argue that some design principles of xenobiology (e.g. promoting auxotrophy but not developing an XNA metabolism) are the consequence of a way of thinking about emerging technologies as narrowly defined by their potential risks. Nonetheless, design principles alone do not determine the decisions and trajectories pursued in a xenobiology laboratory. Researchers are also constrained by their commitments to funding agencies and by the ways in which technologies might be used; biocontainment can hence be interpreted as containing both microorganisms and users. The focus on achieving biosafety can divert attention from other pressing issues, such as the ownership and control of technology being concentrated in large and unaccountable actors whose interests differ from the public good (Kearnes and Wynne 2007). Insofar as problems of safety – and responsibility – are framed as problems to be addressed in the laboratory, it is difficult to ‘open up’ (Stirling 2008) to scrutiny the imaginaries and habits of thought that are written into research practices.

Conversations about biocontainment can become trapped in the question of whether sufficient safety can be achieved, or whether uncertainty is embraced. I suggest that researchers in synthetic biology do take these considerations seriously, although they frame them as scientific questions. Researchers in STS and RRI can make visible questions about the framing of problems in xenobiology, asking: for what problems is biocontainment a solution? Does the research agenda of xenobiology contribute to social and sustainability goals? If not, is it worth pursuing? Responsibility, Grunwald (2014, 259) suggests, involves more than ensuring safety: ‘taking over responsibility therefore means being responsible for current processes of research, defining the research agenda, determining objectives and goals and supporting current societal debates on synthetic biology instead of talking about responsible or irresponsible future outcomes of synthetic biology.’

In shifting attention from design principles to the ‘work’ that biocontainment does for governance – constraining or defining actors and matters of concern – I have suggested the term ‘technology of compliance.’ The term captures an imaginary shared by scientists and key actors: that building in safety features may meet what the public in Western societies is perceived to require in order to trust scientific and government institutions, reinforcing a construction of the public as fearful of new technologies – a projection that has been discredited (Rip 2006). Technologies of compliance might inhibit public discussion about the purposes and ends of innovation, restricting the need to meaningfully engage with the public and to address the social challenges that biotechnology could help meet. Following Sheila Jasanoff’s distinction between ‘technologies of hubris’ and ‘technologies of humility,’ the term seeks to draw attention to the ways in which technological design – the principles of biocontainment in xenobiology – incorporates values and ways of imagining the benefits and place of the unnatural in society. It invites us to revisit, and to incorporate into scientific practice, the recognition that technologies can enhance or diminish democratic participation, social equality, human freedom, and the public good: what Langdon Winner calls political imagination (Winner 1990).

However difficult it might be to move beyond dichotomies of risk and safety, the natural and the unnatural, or the laboratory and the outside world, there is a significant opportunity for STS to introduce new narratives into the laboratory and to expand the conversation to include social values and the world-building capacity of biotechnology. Xenobiology, with its potential to expand the genetic basis of life, offers an opportunity to revisit ideas about governance in the life sciences and responsibility in science and technology.

Acknowledgements

This work has benefited from valuable conversations with Sheila Jasanoff and colleagues in the Harvard Program on Science, Technology & Society, as well as with colleagues in the Department of Science and Technology Studies at UCL. The author wishes to thank Jack Stilgoe, Brian Balmer, and Deborah Scott for suggestions on ways to improve the article, and two anonymous reviewers for their comments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The doctoral studies of A. Aparicio were funded by a scholarship from the Colombian Ministry of Science, Technology and Innovation [Doctoral training call 568 of 2012].

Notes on contributors

Alberto Aparicio

Alberto Aparicio’s research focuses on the governance and sociopolitical aspects of biotechnology and emerging technologies. Trained in STS and science policy, Alberto is currently a postdoctoral research fellow at the Instituto de Investigación de Recursos Biológicos Alexander von Humboldt, where he conducts research on the intersection of responsibility with narratives of bioeconomy and conservation biology.

Notes

1 Video available on Vimeo, titled ‘The XB1 Conference,’ uploaded July 1, 2014 by BioFaction. https://vimeo.com/99627227 [accessed July 19, 2020]. The website of the conference is no longer available.

2 See http://www.sculptingevolution.org/philosophy [accessed November 15, 2019].

4 XNA comprises a variety of nucleic acid analogues with different chemical structures that replace the sugar–phosphate backbone of DNA. For reviews, see the special issue on xenobiology of the journal ChemBioChem https://chemistry-europe.onlinelibrary.wiley.com/doi/toc/10.1002/(ISSN)1439-7633.Xenobiology [accessed July 19, 2020].

7 I assigned this term in place of the technical term the laboratory used, for anonymity purposes. For the OGRS, see for example Chaput, Herdewijn, and Hollenstein (2020).

8 Names of laboratory members and interviewees have been changed for anonymity.

9 Interview on the radio show Science Friday; see http://www.sciencefriday.com/segment/01/23/2015/scientists-engineer-bacteria-with-genetic-kill-switch [accessed October 23, 2017].

10 See Note 7.

References

  • Asin-Garcia, Enrique, Amalia Kallergi, Laurens Landeweerd, and Vitor A. P. Martins dos Santos. 2020. “Genetic Safeguards for Safety-by-Design: So Close Yet So Far.” Trends in Biotechnology 38 (12): 1308–1312.
  • Asselt, Marjolein B. A. van, and Ortwin Renn. 2011. “Risk Governance.” Journal of Risk Research 14 (4): 431–449.
  • Aven, Terje, and Ortwin Renn. 2010. Risk Management and Risk Governance. London: Springer.
  • Berg, Paul, David Baltimore, Sydney Brenner, Richard O. Roblin, and Maxine F. Singer. 1975. “Summary Statement of the Asilomar Conference on Recombinant DNA Molecules.” Proceedings of the National Academy of Sciences of the United States of America 72 (6): 1981–1984.
  • Bijker, Wiebe E., Thomas Parke Hughes, and Trevor Pinch, eds. 1987. The Social Construction of Technological Systems: New Directions in the History and Sociology of Technology. Cambridge, MA: The MIT Press.
  • Bouchaut, Britte, and Lotte Asveld. 2020. “Safe-by-Design: Stakeholders’ Perceptions and Expectations of How to Deal with Uncertain Risks of Emerging Biotechnologies in the Netherlands.” Risk Analysis 40 (8): 1632–1644.
  • Burgess, Adam. 2016. “Introduction.” In Routledge Handbook of Risk Studies, edited by Adam Burgess, Alberto Alemanno, and Jens O. Zinn, 1–14. London: Routledge.
  • Cases, Ildefonso, and Víctor de Lorenzo. 2005. “Genetically Modified Organisms for the Environment: Stories of Success and Failure and What We Have Learned from Them.” International Microbiology 8: 213–222.
  • Chaput, John C., Piet Herdewijn, and Marcel Hollenstein. 2020. “Orthogonal Genetic Systems.” ChemBioChem (February).
  • Collingridge, David. 1980. The Social Control of Technology. London: Frances Pinter.
  • Dana, Genya V., Todd Kuiken, David Rejeski, and Allison A. Snow. 2012. “Synthetic Biology: Four Steps to Avoid a Synthetic-Biology Disaster.” Nature 483 (7387): 29. doi:10.1038/483029a.
  • Dooren, Thom van. 2007. “Terminated Seed: Death, Proprietary Kinship and the Production of (Bio)Wealth.” Science as Culture 16 (1): 71–94.
  • Doorn, Neelke. 2012. “Responsibility Ascriptions in Technology Development and Engineering: Three Perspectives.” Science and Engineering Ethics 18 (1): 69–90.
  • Driebe, D. J., and R. R. McDaniel. 2005. “Complexity, Uncertainty and Surprise: An Integrated View.” In Uncertainty and Surprise in Complex Systems: Questions on Working with the Unexpected, edited by R. R. McDaniel and D. J. Driebe, 19–30. Berlin and Heidelberg: Springer.
  • Evans, Sam Weiss, and Megan J. Palmer. 2018. “Anomaly Handling and the Politics of Gene Drives.” Journal of Responsible Innovation 5 (sup1): S223–S242.
  • Felt, Ulrike, and Brian Wynne, eds. 2007. Science and Governance: Taking European Knowledge Society Seriously. Report of the Expert Group on Science and Governance to the Science, Economy and Society Directorate, Directorate-General for Research, European Commission, Brussels.
  • Fisher, Erik, and Roop L. Mahajan. 2006. “Contradictory Intent? US Federal Legislation on Integrating Societal Concerns into Nanotechnology Research and Development.” Science and Public Policy 33 (1): 5–16.
  • Florin, Marie-Valentine. 2019. “Risk Governance and ‘Responsible Research and Innovation’ Can Be Mutually Supportive.” Journal of Risk Research (September): 1–15.
  • Frow, Emma. 2020. “From ‘Experiments of Concern’ to ‘Groups of Concern’: Constructing and Containing Citizens in Synthetic Biology.” Science, Technology, & Human Values 45 (6): 1038–1064.
  • Funtowicz, Silvio O., and Jerome R. Ravetz. 1990. Uncertainty and Quality in Knowledge for Policy. Dordrecht: Kluwer Academic Publishers.
  • Gibbons, Michael, Helga Nowotny, Camille Limoges, Simon Schwartzman, and Peter Scott. 1994. The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies. London: SAGE Publications.
  • Giddens, Anthony. 2000. Runaway World: How Globalization is Reshaping Our Lives. New York: Routledge.
  • Grinbaum, Alexei, and Christopher Groves. 2013. “What Is ‘Responsible’ About Responsible Innovation? Understanding the Ethical Issues.” In Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society, edited by Richard Owen, John Bessant, and Maggy Heintz, 119–142. Chichester, UK: Wiley.
  • Gross, Matthias. 2010. Ignorance and Surprise: Science, Society, and Ecological Design. Cambridge, MA: The MIT Press.
  • Grove-White, Robin. 2006. “Britain’s Genetically Modified Crop Controversies: The Agriculture and Environment Biotechnology Commission and the Negotiation of ‘Uncertainty’.” Community Genetics 9 (3): 170–177.
  • Grunwald, Armin. 2012. Responsible Nanobiotechnology: Philosophy and Ethics. Boca Raton, FL: CRC Press, Taylor & Francis Group.
  • Grunwald, Armin. 2014. “Synthetic Biology as Technoscience and the EEE Concept of Responsibility.” In Synthetic Biology: Character and Impact, edited by Bernd Giese, Christian Pade, Henning Wigger, and Arnim von Gleich, 249–265. Cham: Springer.
  • Herdewijn, Piet, and Philippe Marlière. 2009. “Toward Safe Genetically Modified Organisms Through the Chemical Diversification of Nucleic Acids.” Chemistry & Biodiversity 6 (6): 791–808.
  • Hilgartner, Stephen. 2015. “Capturing the Imaginary: Vanguards, Visions and the Synthetic Biology Revolution.” In Science & Democracy: Making Knowledge and Making Power in the Biosciences and Beyond, edited by Stephen Hilgartner, Clark Miller, and Rob Hagendijk, 33–55. New York: Routledge.
  • Hurlbut, J. B. 2015. “Remembering the Future: Science, Law and the Legacy of Asilomar.” In Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power, edited by Sheila Jasanoff and Sang-Hyun Kim, 126–151. Chicago: The University of Chicago Press.
  • Hurlbut, James Benjamin. 2017. “Laws of Containment: Control Without Limits in the New Biology: Life Beyond the Human.” In Gene Editing, Law, and the Environment, edited by Irus Braverman, 77–93. London: Routledge.
  • Hurley, Paul. 2009. Beyond Consequentialism. Oxford, UK: Oxford University Press.
  • Irwin, Alan, and Brian Wynne, eds. 1996. Misunderstanding Science? The Public Reconstruction of Science and Technology. Cambridge, UK: Cambridge University Press.
  • Jasanoff, Sheila. 2003. “Technologies of Humility: Citizen Participation in Governing Science.” Minerva 41 (3): 223–244.
  • Jasanoff, Sheila. 2004. States of Knowledge: The Co-Production of Science and Social Order. London, UK: Routledge.
  • Jasanoff, Sheila. 2005. Designs on Nature: Science and Democracy in Europe and the United States. Princeton, NJ: Princeton University Press.
  • Jasanoff, Sheila, and Sang-Hyun Kim. 2009. “Containing the Atom: Sociotechnical Imaginaries and Nuclear Power in the United States and South Korea.” Minerva 47 (2): 119–146.
  • Kahn, Alfred E. 1966. “The Tyranny of Small Decisions: Market Failures, Imperfections, and the Limits of Economics.” Kyklos 19 (1): 23–47.
  • Kearnes, Matthew, and Brian Wynne. 2007. “On Nanotechnology and Ambivalence: The Politics of Enthusiasm.” NanoEthics 1 (2): 131–142.
  • Kelty, Christopher M. 2009. “Beyond Implications and Applications: The Story of ‘Safety by Design’.” NanoEthics 3 (2): 79–96.
  • Latour, Bruno. 1992. “Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts.” In Shaping Technology/Building Society: Studies in Sociotechnical Change, edited by Wiebe J Bijker and John Law, 225–258. Cambridge, MA: The MIT Press.
  • MacKenzie, Donald. 1993. Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance. Cambridge, MA: The MIT Press.
  • Macnaghten, Phil. 2020. The Making of Responsible Innovation. Cambridge, UK: Cambridge University Press.
  • Malyshev, Denis A., Kirandeep Dhami, Thomas Lavergne, Tingjian Chen, Nan Dai, Jeremy M. Foster, Ivan R. Corrêa, and Floyd E. Romesberg. 2014. “A Semi-Synthetic Organism with an Expanded Genetic Alphabet.” Nature 509 (7500): 385–388.
  • Mandell, Daniel J., Marc J. Lajoie, Michael T. Mee, Ryo Takeuchi, Gleb Kuznetsov, Julie E. Norville, Christopher J. Gregg, Barry L. Stoddard, and George M. Church. 2015. “Biocontainment of Genetically Modified Organisms by Synthetic Protein Design.” Nature 518 (7537): 55–60.
  • Marlière, Philippe. 2009. “The Farther, the Safer: A Manifesto for Securely Navigating Synthetic Species Away from the Old Living World.” Systems and Synthetic Biology 3 (1–4): 77–84.
  • Marris, Claire. 2015. “The Construction of Imaginaries of the Public as a Threat to Synthetic Biology.” Science as Culture 24 (1): 83–98.
  • Marris, Claire, and Jane Calvert. 2020. “Science and Technology Studies in Policy: The UK Synthetic Biology Roadmap.” Science, Technology, & Human Values 45 (1): 34–61.
  • Marris, Claire, and Catherine Jefferson. 2013. Workshop on ‘Synthetic Biology: Containment and Release of Engineered Micro-Organisms’ Held on 29 April 2013 at King’s College London: Scoping Report. London: King’s College London.
  • McCarthy, Elise, and Christopher M. Kelty. 2010. “Responsibility and Nanotechnology.” Social Studies of Science 40 (3): 405–432.
  • Moe-Behrens, Gerd H. G., Rene Davis, and Karmella A. Haynes. 2013. “Preparing Synthetic Biology for the World.” Frontiers in Microbiology 4 (January): 1–10.
  • Nissenbaum, Helen. 1996. “Accountability in a Computerized Society.” Science and Engineering Ethics 2: 25–42.
  • Owen, Richard, John Bessant, and Maggy Heintz, eds. 2013. Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Chichester, UK: Wiley.
  • Owen, Richard, Phil Macnaghten, and Jack Stilgoe. 2012. “Responsible Research and Innovation: From Science in Society to Science for Society, with Society.” Science and Public Policy 39 (6): 751–760.
  • Penders, Bart. 2011. “Cool and Safe: Multiplicity in Safe Innovation at Unilever.” Bulletin of Science, Technology & Society 31 (6): 472–481.
  • Perrow, Charles. 1984. Normal Accidents: Living with High Risk Technologies. New York: Basic Books.
  • Poel, Ibo van de, and Zoë Robaey. 2017. “Safe-by-Design: From Safety to Responsibility.” NanoEthics 11 (3): 297–306.
  • Presidential Commission for the Study of Bioethical Issues. 2010. New Directions: The Ethics of Synthetic Biology and Emerging Technologies. Washington, DC.
  • Rabinow, Paul, and Gaymon Bennett. 2012. Designing Human Practices: An Experiment with Synthetic Biology. Chicago: The University of Chicago Press.
  • Renn, Ortwin. 2008. Risk Governance: Coping with Uncertainty in a Complex World. London: Earthscan Publications.
  • Rip, Arie. 2006. “Folk Theories of Nanotechnologists.” Science as Culture 15 (4): 349–365.
  • Rip, Arie. 2014. “The Past and Future of RRI.” Life Sciences, Society and Policy 10 (1): 17.
  • Rip, Arie. 2016. “The Clothes of the Emperor. An Essay on RRI in and Around Brussels.” Journal of Responsible Innovation 3 (3): 290–304.
  • Robaey, Zoë. 2018. “Dealing with Risks of Biotechnology: Understanding the Potential of Safe-by-Design.”
  • Robaey, Zoë, Shannon L. Spruit, and Ibo van de Poel. 2018. “The Food Warden: An Exploration of Issues in Distributing Responsibilities for Safe-by-Design Synthetic Biology Applications.” Science and Engineering Ethics 24 (September): 1673–1696.
  • Rosa, Eugene A., Ortwin Renn, and Aaron M. McCright. 2014. The Risk Society Revisited: Social Theory and Governance. Philadelphia: Temple University Press.
  • Rovner, Alexis J., Adrian D. Haimovich, Spencer R. Katz, Zhe Li, Michael W. Grome, Brandon M. Gassaway, Miriam Amiram, et al. 2015. “Recoded Organisms Engineered to Depend on Synthetic Amino Acids.” Nature 518 (7537): 89–93.
  • de Saille, Stevienna. 2015. “Innovating Innovation Policy: The Emergence of ‘Responsible Research and Innovation’.” Journal of Responsible Innovation 2 (2): 152–168.
  • Schmidt, Markus. 2010. “Xenobiology: A New Form of Life as the Ultimate Biosafety Tool.” BioEssays 32 (4): 322–331.
  • Schmidt, Markus, and Víctor de Lorenzo. 2012. “Synthetic Constructs in/for the Environment: Managing the Interplay Between Natural and Engineered Biology.” FEBS Letters 586 (15): 2199–2206.
  • Schmidt, Markus, Alexander Kelle, Agomoni Ganguli-Mitra, and Huib de Vriend, eds. 2010. Synthetic Biology: The Technoscience and Its Societal Consequences. Cham: Springer.
  • Schwarz-Plaschg, Claudia, Angela Kallhoff, and Iris Eisenberger. 2017. “Making Nanomaterials Safer by Design?” NanoEthics 11 (3): 277–281.
  • Smallman, Melanie. 2018. “Science to the Rescue or Contingent Progress? Comparing 10 Years of Public, Expert and Policy Discourses on New and Emerging Science and Technology in the United Kingdom.” Public Understanding of Science 27 (6): 655–673.
  • Stilgoe, Jack, and David H. Guston. 2017. “Responsible Research and Innovation.” In The Handbook of Science and Technology Studies. 4th ed., edited by Ulrike Felt, Rayvon Fouché, Clark Miller, and Laurel Smith-Doerr, 853–880. Cambridge, MA: The MIT Press.
  • Stilgoe, Jack, Richard Owen, and Phil Macnaghten. 2013. “Developing a Framework for Responsible Innovation.” Research Policy 42 (9): 1568–1580.
  • Stirling, Andy. 2008. “‘Opening Up’ and ‘Closing Down’.” Science, Technology, & Human Values 33 (2): 262–294.
  • Stirling, Andy. 2012. “Opening Up the Politics of Knowledge and Power in Bioscience.” PLoS Biology 10 (1): e1001233.
  • Stirling, Andy. 2019. “Engineering and Sustainability: Control and Care in Unfoldings of Modernity.” SPRU Working Paper Series (SWPS) 2019–06.
  • Stirling, Andrew, K. R. Hayes, and Jason Delborne. 2018. “Towards Inclusive Social Appraisal: Risk, Participation and Democracy in Governance of Synthetic Biology.” BMC Proceedings 12 (S8): 15.
  • Swierstra, Tsjalling, and Arie Rip. 2007. “Nano-Ethics as NEST-Ethics: Patterns of Moral Argumentation About New and Emerging Science and Technology.” NanoEthics 1 (1): 3–20.
  • Torres, Leticia, Antje Krüger, Eszter Csibra, Edoardo Gianni, and Vitor B. Pinheiro. 2016. “Synthetic Biology Approaches to Biological Containment: Pre-Emptively Tackling Potential Risks.” Essays in Biochemistry 60 (4): 393–410.
  • Winner, Langdon. 1977. Autonomous Technology: Technics-Out-of-Control as a Theme for Political Thought. Cambridge, MA: The MIT Press.
  • Winner, Langdon. 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: The University of Chicago Press.
  • Winner, Langdon. 1990. “Engineering Ethics and Political Imagination.” In Broad and Narrow Interpretations of Philosophy of Technology, edited by Paul T. Durbin, 53–64. Dordrecht: Springer Netherlands.
  • Wright, Susan. 1994. Molecular Politics: Developing American and British Regulatory Policy for Genetic Engineering, 1972–1982. Chicago: The University of Chicago Press.
  • Wright, Oliver, Guy-Bart Stan, and Tom Ellis. 2013. “Building-in Biosafety for Synthetic Biology.” Microbiology 159 (Pt 7): 1221–1235.
  • Wynne, Brian. 1988. “Unruly Technology: Practical Rules, Impractical Discourses and Public Understanding.” Social Studies of Science 18 (1): 147–167.
  • Wynne, Brian. 1992. “Uncertainty and Environmental Learning: Reconceiving Science and Policy in the Preventive Paradigm.” Global Environmental Change 2 (2): 111–127.
  • Wynne, Brian. 2001. “Creating Public Alienation: Expert Cultures of Risk and Ethics on GMOs.” Science as Culture 10 (4): 445–481.
  • Wynne, Brian. 2002. “Risk and Environment as Legitimatory Discourses of Technology: Reflexivity Inside Out?” Current Sociology 50 (3): 459–477.
  • Zwart, Hub, Laurens Landeweerd, and Arjan van Rooij. 2014. “Adapt or Perish? Assessing the Recent Shift in the European Research Funding Arena from ‘ELSA’ to ‘RRI’.” Life Sciences, Society and Policy 10 (1): 11.