Research Article

Mismatched and misaligned: responsibility narratives in American research labs for synthetic biotechnologies

Article: 2355682 | Received 15 Dec 2023, Accepted 11 May 2024, Published online: 30 May 2024

ABSTRACT

This article examines what responsibility means in the context of synthetic biotechnologies, based on interviews with academic researchers in the American west who are using/developing synthetic biology, engineering biology, and synthetic genomics. Advancements in technical capacity are ushering in imminent/current possibilities of creating whole genomes/organisms from scratch, yet extant narratives about ‘responsibility’ have neither been fleshed out nor compared against normative frameworks (such as ELSI and its critiques). Through empirical data collection (e.g. discourse analysis), this paper examines interviews with biotechnologists (N = 16) to analyze responsibility narratives on the ground, which include: being responsible towards grand challenges, national values, and research relations involving other beings in the lab, both human and more-than-human. The analyses presented here offer feminist and multispecies critiques for studying the relational webs of responsible (response-able) research, and the paper concludes with a discussion of the mismatch between how responsibilities are narrativized across different actors within academic research institutions.

This article is part of the following collections:
Critique in, for, with, and of Responsible Innovation

Introduction

Technological capacities in genomics have progressed significantly since the Human Genome Project (HGP). The latest technologies engage with whole genomes, shifting attention from reading larger swaths of genetic sequences to writing them (Chari and Church Citation2017, Ostrov et al. Citation2019, Powell Citation2018). In the US, research communities such as Genome Project-Write (known colloquially as GP-Write) epitomize the Richard Feynman quotation ‘what I cannot create, I do not understand’ through explanations like this one on their website: ‘[m]any scientists now believe that to truly understand our genetic blueprint, it is necessary to ‘write’ DNA and build human (and other) genomes from scratch.’Footnote1 Other groups focus on building whole synthetic cells (i.e. Build-a-Cell) or aim for applied genome-scale engineering that ‘[i]f deployed responsibly, such progress will improve the world’ (Genome Writers Guild, Citationn.d., para.2, emphasis added). Crucially, what responsible innovation entails in these settings remains a broad, if undifferentiated, goal in these scientific communities.

The increased technological capacity has been accompanied by a short but fraught history about what responsible or ethical research means amongst the researchers working in this area. For instance, Boeke et al. argue for the twin needs of ‘technology and an ethical framework for genome-scale engineering’ and they enumerate their understanding of responsible innovation to mean increased public discourse, biosafety, and equitable distribution (Citation2016, 126, emphasis added). Just weeks prior to publishing this commentary, Boeke and others hosted an invitation-only meeting to discuss whether and how to synthesize a human genome from scratch. In an open letter dated for the day of the meeting, Endy and Zoloth counter with: ‘just because something becomes possible, how should we determine if it is ethical to proceed? […]. Discussions to synthesize, for the first time, a human genome should not occur in closed rooms’ (Citation2016, 2). Juxtaposing these stances shows how the technical capacity to synthesize genomes does not come with consensus from the communities developing these technologies. While debates in these circles continue, normative frameworks for addressing non-technical concerns have not matured to explicitly address what it means to create genomes – or the organisms they encode – from scratch.

In the American context, non-technical concerns became the foci for the Ethical, Legal, and Social Implications (ELSI) program, which was developed in the 1990s across two American governmental agencies (the National Institutes of Health and the Department of Energy) to accompany the completion of the HGP (NHGRI Citation2012a). ELSI was tasked with creating working groups, funding extramural research, and identifying the societal impacts of discoveries related to genome sequencing through topics such as genetic privacy and informed consent in population testing or the role of race as a population category (NHGRI Citation2012b). Through these working groups, and adjacently in the field of Science and Technology Studies (STS), social scientists have opened up spaces to critically analyze and critique ELSI's efficacy, noting that: ELSI research only affirms and enables emergent biotechnologies (Rabinow and Bennett Citation2012, 18; Balmer and Bulpin Citation2013), it engages too late in the process of innovation (Calvert and Martin Citation2009, Hurlbut Citation2015b, Myskja, Nydal, and Myhr Citation2014), and it bifurcates natural and social sciences in trying to deliver on ELSI assessments (Balmer et al. Citation2012, Balmer et al. Citation2016, Schyfter and Calvert Citation2015). The empirical insights in this paper seem to suggest that these critiques from up to a decade ago still apply. What is different now is that, at least with some researchers, ‘a major burden of responsibility for setting standards rests with the scientists and their community’ (Boeke et al. Citation2016, 126; see also Sc2.Citation0 Statement of Ethics and Governance, Citation2013). It appears that American intentions to carve out space for non-technical discussions indeed separated ELSI as a framework for understanding and implementing responsibility as a top-down measure.Footnote2 In turn, this separation disconnects regulatory frameworks of responsibility from the bottom-up and extant understandings of responsibility.

Debates about responsibility are nothing new, with critiques and counter-critiques on what it entails or where it fails. While most discussions about responsible research in this area have involved key actors from social science and policy spheres, this paper takes a qualitative approach to study how responsibility is perceived, described, and implemented by the very researchers working with/in synthetic biotechnologies.Footnote3 In other words, this paper ‘addresses the paucity of studies on scientists’ sense-making of calls for responsibility’ (Glerup, Davies, and Horst, Citation2017, 321) and considers how bottom-up practices ‘have tended to be overlooked or under-reported’ by other scholars studying responsible research and innovation (329). As such, the empirical research presented here contributes to the growing corpus of studies that examine responsibility as it is imagined, named, made sense of, and implemented in both mundane and institutional ways.

By studying the extant narratives about responsibility outside of frameworks such as ELSI, this paper builds on what others have termed ‘bottom-up practices’ (324) and ‘de facto’ practices (Randles et al. Citation2016, 32) in the context of emerging biotechnologies. So instead of measuring ELSI compliance, assessing the efficacy of ELSI as a problem knot, or defining responsibility through a normative approach (Stilgoe, Owen, and Macnaghten Citation2013), this paper analyzes how researchers narrativize responsibility in extant but also thematic ways. Studying these themes across a regional context and across different academic research settings shows how the very people doing the work of developing technical capacity are making sense of responsibility through language and community practices. This approach offers insights about what exactly constitutes responsibility ‘on the ground’ relative to how it is operationalized (or not) and how it has evolved (or not) with top-down, normative frameworks such as ELSI.

Meanwhile, the novelty of tinkering at the level of whole genomes and organisms ushers in heretofore unconsidered non-technical concerns, ‘forcing a conceptual reevaluation’ about the relationship between parts and wholes (Palsson Citation2000). Working at the scale of whole cells, whole genomes, and whole populations requires higher specificity (Wolfe Citation2015) as well as extra-disciplinary approaches (Szymanski et al. Citation2020) to grapple with the complexity of bringing about new life through genetic means. What working at the whole genome/organism scale makes visible are tensions between top-down and bottom-up notions of responsibility: normative frameworks (e.g. ELSI, National Science Foundation's Broader Impacts) may be seen as an impediment or detractor to research aiming for whole genome/organism research precisely because genomes/organisms are seen as applied, impactful, and timely interventions; and, in this framing, bottom-up notions of responsibility are thus interpreted and implemented separately – to expedite rigorous research, to scale up for commercialization, and to maximize the transformative potential of synthetic biotechnologies.

The paper is structured in the following manner: Section 2 provides background to the American context with ways that responsible research has been formulated and structured in normative frameworks. Section 3 describes the methodological approach to data collection and analysis. Section 4 reports on empirical insights into responsibility narratives of researchers in synthetic biotechnologies such as synthetic biology, engineering biology, and synthetic genomics. By focusing on the researchers themselves, this section examines how responsibility is constructed by the very people using, developing, and promoting these technologies. The responsibility narratives discussed below include feeling responsible towards grand challenges, national values, and research relations. Section 5 discusses the tension between top-down (i.e. normative) and bottom-up (i.e. extant) notions of responsibility, including implications for future research in this geographic and field-specific context. Section 6 concludes by connecting this paper to a growing corpus of extant responsibility narratives on the ground and contributing to conversations about how and when differential notions of responsibility align. The paper ends with a call for more feminist and multispecies approaches to qualitative research in this area.

Background

Sequencing and the allure of manipulating genomes captured the scientific ethos of the early 2000s, with milestones such as the completion of the HGP, the first synthetic biology conference (colloquially called SB1.0), and the National Science Foundation (NSF) funding the Synthetic Biology Engineering Research Center (SynBERC) around the same time (Si and Zhao Citation2016). With the founding of SynBERC came a mandate to study the non-technical dimensions of synthetic biology, then headed up by anthropologist Paul Rabinow and bioethicist Gaymon Bennett in a field they termed ‘human practices.’ Part of their move away from ELSI was in figuring synthetic biology as a ‘post-ELSI’ technology (Citation2012, 19; see also Balmer et al. Citation2012; Balmer and Bulpin Citation2013; National Academy of Sciences Citation2013), with the explanation that ELSI programs ‘are themselves limited in their scope by their original mandate to operate downstream and outside of the [genetic] sequencing efforts’ (Rabinow and Bennett Citation2012, 13).

Until 2010, ELSI in synthetic biology focused on concerns about economic, political, and cultural issues such as identifying the field's values and navigating intellectual property rights of novel biological constructs. But a confluence of third-party reports (e.g. the Fink Report on dual-use, the Sloan report on synthetic biology) and carryover from a Bush-era relaxation of governmental regulation prompted the NSF to shift priorities of ELSI to explicitly focus on security and safety – especially with regard to malicious actors and/or unintended effects (Kelley and Associates Citation2014, 197; Rabinow and Bennett Citation2012, 173). By 2010, researchers at the J. Craig Venter Institute (JCVI) had successfully assembled a synthetic genome in Mycoplasma mycoides (Gibson et al. Citation2010), to which the Presidential Commission for the Study of Bioethical Issues responded with an analysis of ethical dimensions to guide emerging technologies such as synthetic biology (Presidential Commission for the Study of Bioethical Issues Citation2010). Although the report explicitly names responsibility as one of the ethical principles (11), the phrasing surrounding the term connotes being a good actor, steward, and citizen in the context of biosafety/biosecurity (see also Frow Citation2017). Only recently has there been a sobering and salient call that ‘all research is inherently dual use’ (Evans Citation2022, original emphasis), exposing ‘the assumption that good intentions lead to good outcomes’ (85).

Other governmental bodies have promulgated their own programs in the past decade. The Defense Advanced Research Projects Agency (DARPA) has convened its own Legal, Ethical, Environmental, Dual-use, and Responsible Innovation (LEEDR) activities since 2017. The Department of Defense, through the Manufacturing Innovation Institutes program begun under the Obama administration, founded its latest institute, BioMADE, in 2021, where responsibility is conceptualized as an alliteration of safety, security, sustainability, and social responsibility (4S). Both DARPA and BioMADE double as funding agencies for innovative research in synthetic biotechnologies, yet their attempts to codify and implement responsible innovation are also limited. As mentioned in a footnote of a JCVI report (Carter et al. Citation2014, 13), American ‘regulatory agencies have limited authority and capacity to address non-physical harms such as social and ethical issues’ that inhere to innovations with (bio)economic value. The limitation is because risk assessments for such innovations require quantifiable criteria: with no way to preemptively assess innovation risks (which are new and thus yet unknown by definition), and where monitoring techniques cannot be known in advance, the non-technical concerns can fall outside the remit of these regulatory parameters. Thus, a sticking point for the American context is that regulatory bodies have limited authority to enforce compliance.Footnote4

Methodological approach

This paper is empirically grounded in interviews with researchers in/adjacent to academic research labs located in the western and midwestern United States. These conversations were supplemented by laboratory visits and field-specific conferences to sense-check preliminary findings. All data collection took place in 2022 from March to November. Although 36 interviews were conducted as part of a larger study that examines the shift from parts-based engineering (e.g. genetic circuits) towards whole genome/organism engineering, this paper focuses only on a subset of informants (N = 16) who actively discussed responsibility during the interviews: 5 early-career researchers/trainees (abbreviated ECR from here on), 6 senior researchers and principal investigators holding their own labs (SrR), and 5 researchers/directors working in biosafety/biosecurity (Non-Tech) (see Table 1). Throughout data analysis, Non-Tech perspectives were not cordoned off because these people work directly with or were once themselves lab researchers, and because they are part of the community-level efforts to make sense of responsibility.

Table 1. A snapshot of informants. Organism names have been redacted based on the agreements made in informed consent forms. Only two of the eleven lab researchers used so-called canonical model organisms (see Ankeny and Leonelli Citation2020, 2). Members from the ECR and SrR groups are not always from the same labs, although there are three sets of trainee-PI pairings, which may point to the possibility of discussions about responsibility following a lab-specific lineage. All informants will be referenced using singular they/them pronouns.

The interviews were coded using grounded theory (detailed below) with attention to relationality and to what or to whom an informant felt responsible. Fourteen of the 16 interviews were held in person (the two conducted online were with informants in the Non-Tech group) and these interviews ranged from 40 min to 3 h, with an average length of 62 min. Semi-structured interview questions focused on what responsible research looks like now and in the future, given that emergent biotechnologies are imminently or already moving towards the scale of whole cells/organisms.

Interlocutors and field sites were chosen based on their desire to work at the whole genome/organism level (e.g. Arabidopsis, Aedes aegypti), not necessarily the technologies employed (e.g. CRISPR, TALENs). Most interviews were framed as trying to understand the shift from parts-based genetic engineering towards whole cell, whole genome, or whole population engineering, given that genome implies an organism, and we tend to relate to whole organisms differently than genetic circuits and chromosomes (Calvert and Szymanski Citation2020). Part of the rationale for studying the American west was to bring a grounded perspective from US actors commonly overlooked (i.e. not centered on the likes of Wyss Institute or Ivy Leagues; see e.g. Dan-Cohen Citation2021, Roosth Citation2017, Shapiro Citation2020) to shed light on under-researched locales and contexts.

A key goal during interviews was to prioritize building relationships with interlocutors precisely because questions about responsibility can seem interrogative, even accusatory, during first meetings – a caution given to me by a key informant. For instance, the interview questions discussing responsibility came towards the end of our conversations, a decision made because extant relationships between natural and social scientists are tainted with histories of compliance rhetoric (Hurlbut, Saha, and Jasanoff Citation2015, para.13; Hurlbut Citation2015a). With these field-specific histories in mind, a grounded approach was employed to observe how responsibility was conceptualized instead of asking interlocutors to compare their definitions against formal academic or regulatory literatures. Since grounded theory acknowledges that ‘language confers form and meaning on observed realities’ (Charmaz Citation2006, 47), interviews were kept dialogic to ensure details and nuance could emerge. The grounded approach also allowed data collection to be iterative and made it possible to see the ‘emergent connections between the emerging code’ (Glaser Citation1978, 39). Participant observation during lab visits and discourse analysis (of lab websites, of papers/commentary authored by informants, of science communication literature featuring or penned by informants) complemented data analysis, providing insights into the relational web that makes up this scientific community.

Results

This section presents empirical data, beginning with the range of different notions of responsibility, including direct statements about it and/or critiques related to frameworks like ELSI (section 4.1). Three themes will follow: being responsible to grand challenges (4.2), to national values (4.3), and to research relations (4.4). Throughout, I call upon concepts in feminist technoscience, communication studies, and multispecies studies to examine the varying ways that responsibility is conceptualized, performed, and discussed.

Existing ideas of responsibility, and extant critiques of ELSI

Existing ideas about responsibility call upon language from institutional ethics boards on research conduct, which align self-conduct with predefined notions of responsible research conceived at the institutional level. Here, responsibility is made synonymous with research integrity, usually characterized by avoiding its most common offenses such as the falsification of data, breaching national security, or disregarding conflicts of interest. Or, as Non-Tech3 explained, conducting risk assessments (e.g. to indigenous communities or lands) is seen as a sign of responsibility today. It may be that these forms of responsibility – as moral and individual action – resonate with researchers since they are more proximal to their daily practices than broader questions about responsible research or innovation for an entire field. Two informants mentioned the (then pending) Theranos/Elizabeth Holmes case, and one pointed to ongoing paper retractions within the academy, suggesting that scientists are indeed questioned about their intentions, and thus ought to practice a degree of self-questioning. ECR5 pointed out the need for better ethics training in this regard, and lamented the reality that they did not have the infrastructure to cultivate the skills for self-reflexivity: ‘I mean, if I have an ethical question, who do I go to?’ (18 October 2022). They then enumerated how colleagues would not have the wherewithal to engage in such a conversation, let alone the time to do so.

Predictably, all of the ECR and SrR interviewees (i.e. lab researchers), except one, had more to say about what responsible research looks like than about formal ELSI structures. SrR2 describes responsibility as being ‘in the eye of the beholder,’ an elusive and subjective ideal like beauty. Similarly, ECR1 explained that their philosophy is not to predefine what responsibility is or looks like precisely because its criteria need to be co-created with other stakeholders: ‘What ‘responsible’ exactly will mean is partially a developing goalpost trying to understand what people need. It does not mean pure consensus because we’ll never get there. But it does mean approaching the science with more people in mind than the people using the science’ (06 May 2022). With a nod to stakeholder engagement, these considerations align with RRI's definitions of inclusivity and reflexivity, without calling it such. Non-Tech2 echoes a similar need to include multiple perspectives, stating that critical approaches cannot be captured in any one acronym: ‘I feel like at the end of the day, a lot of this is about really building trusted relationships between very different disciplines that can help to recognize the different norms they bring into study, like a process of deciding which questions are actually worth paying attention to’ (21 June 2022). Both ECR1 and Non-Tech2 demonstrate how it matters what questions get taken up and with whom.

Other reflections on responsibility were formulated as critiques of ELSI, explaining how it lacks teeth as a framework (5 out of 5 Non-Tech informants, 2 SrRs), how ELSI belonged more in the biomedical realm of patient rights than in the academic realm of basic research (4 Non-Techs), and how biosecurity and biosafety were actionable concerns whereas ELSI was too diffuse in scope (3 Non-Tech, 3 SrRs, 1 ECR). What unites these critiques is that ELSI is seen as a compliance metric, representing a view from the outside, neatly cleaved from the laboratory, and, as a result, seen as an extra item tacked onto projects. In turn, the appended ELSI falls by the wayside precisely because responsibility seems to be already operationalized in ways that are distinct from top-down mechanisms. Non-Tech1 describes this tension, especially in the context of university pressures to produce so-called cutting-edge research:

there's a conflict between avoiding risks and taking risks, because you need to take risks to be able to innovate, and that's super important. But the whole point of safety and security is to eliminate risks to the extent possible. And so I think it's a complicated relationship with ELSI in that you’d like to consider the ethical, legal and societal implications through a security lens but then also through other lenses. I mean, in general, I think the ELSI acronym is a little limiting. Safety, security, those could be in there. You could have sustainability. But there's no perfect acronym that encompasses all of the things besides the benchwork. (Non-Tech1, 19 October 2022, emphasis added)

With no perfect acronym to capture ‘all of the things besides the benchwork,’ it appears that researchers are following their own narratives for what responsibility entails.

Offhand phrases like ‘that ELSI stuff’ indicate that ELSI exists as a separate sphere, or at least separate enough to be perceived as having its own communities, discourses, and pathways to knowledge production/mobilization, such that it seldom overlaps with its scientific counterparts. Informal conversations about ELSI (e.g. during breaks or meals at conferences) reinforce this separation, with comments such as ‘I just do the science,’ and that whatever people do with it afterwards is not the responsibility of lab researchers. This tension – how scientists view ELSI as an externality and a domain for others to take up – gives way to scientists having their own enactments of responsibility, thematically categorized in the next sections as responsibility towards grand challenges, national values, and research relations.

Responsibility towards grand challenges

To be responsible in this context means foregrounding the scientific means to meet societal challenges. The connection between engineered solutions and grand challenges is explicit. As other studies have found (Välikangas Citation2022, 95; Kuhlmann and Rip Citation2018; Robinson et al. Citation2021, 212), the narrative of grand challenges aims to revolutionize how contemporary science could redress dire problems in health or climate crises. Implied throughout these narratives are moments that responsibilize scientists to address the grand challenges of today, best epitomized by one interlocutor who shared a frequent talking point:

[This is] the question I ask my students in one of the undergrad courses about genome editing or GMOs: we have a drought in California, what are we going to do? Let's say we know genes from Arabidopsis, exactly [the one] that can control drought response. Is it ethical or unethical to genome edit the tree to save it from the drought and climate change that we inflicted? (SrR3, 09 June 2022)

SrR3's example shows how science should be addressing challenges ‘that we inflicted’ while framing its use in terms of ‘ethical or unethical’ action. This binary also frames ethical decisions about what one ought to do as a matter of taking scientific action (i.e. conflating ‘taking action’ with ‘doing good science’ with ‘ethical’ behavior).

Grand challenges are defined by scale. Permeating the interviews was the concern for this need to scale up (6/6 SrR, 3/5 ECR), both in terms of academic capacity (e.g. beyond a singular proof-of-concept to standardization across the field) and in terms of translating so-called basic discovery to industrial settings. Often in the same breath were aspirational phrases that voiced the desire to have a positive influence at grand scales: ‘And so the idea is: can you scale it? Can you scale the system? And so the answer is yes. We can. And so [the question now is] how to transition because we’ve shown it's possible; can we be impactful and relevant?’ (SrR1, 22 March 2022). Or, consider this paraphrased goal from the perspective of two ECRs: I want to do cool science and make a difference in the world. This desire to be impactful epitomizes the alignment between academic research success (on a personal scale), scientific progress (field scale), and making a difference in the world (societal scale) such that these three scales mutually reinforce one another to conflate scientific impact with responsible action.

To speak of one's research in terms of grand challenges also suggests a generational shift within the scientific community. For one, trainees were quick to voice concerns about their career prospects when multiple, global crises loom ahead: ‘what's the point of moving forward some of this [organism redacted] genetic engineering research, if it's all going to make no difference 30 years from now’ (ECR4, 30 June 2022). Here, basic research implies a kind of futility where one may not make it – colloquially, professionally, or existentially – unless one frames their research in terms of innovation that can intervene in the concerns of today, like public health crises and climate crisis. This focus on impact is relatively new, as seen in this set of extracts:

What I really want to achieve is helping people and society in some way that I can tangibly see it helping. So whether that be doing research and making new discoveries of like, antimicrobials or ways that we can improve plants, I would only really want to do stuff that I still see as having tangible, good coming out of it. (ECR3, 17 June 2022)

Today, especially with a lot of young trainees, they want to have an impact on society. So they’re framing their reasons for coming into science and academia, as coming from that purpose-driven research area. So from that standpoint, innovation is absolutely the metric with which you judge purpose-driven research. It's not the metric by which you judge curiosity-driven research. And when I was in my training, curiosity-driven research was in some ways over-emphasized. (SrR5, 17 October 2022)

SrR5, who serves as department chair and advisor to new PIs, observes how purpose-driven research carries more valence in the context of the academy. This is consistent with what Dabars and Dwyer have described as the trifecta of ‘education, research, and public service at the scales needed to respond to the opportunities and challenges facing society today’ (Citation2022, 120; see also Kleinman Citation2003). For ECR3, it seems that tangible goods – through material goods like antimicrobials or better plant strains – are the metric for doing responsible research today.

Responsibility towards national agendas and staying in the lead

Another theme in responsibility narratives foregrounds American values, including individualism, entrepreneurship, and resilience in a global supply chain. Steeped in historical inflections of manifest destiny, this responsibility narrative emphasizes America being the vanguard of emerging biotechnologies. It communicates less of a nationalistic pride and more of an internalized race to be – and stay – in the lead against global competitors. Securing this lead is done through infrastructural means (e.g. institutes), material means (e.g. patents and publications) as well as through softer, discursive means (e.g. presidential speeches). Even if ‘America’ may not have been named explicitly, placeholder terms like ‘home turf’ and ‘domestic’ and ‘local’ were coded as part of this theme.

Most interviewees rarely discussed national agendas in an explicit manner, but many referred to ‘the bioeconomy’ as a higher order rationale for pursuing innovation, especially after September 2022. At that time, President Biden issued an Executive Order formalizing the advancement of biotechnologies to grow capacity for biomanufacturing while also bolstering American leadership in the global economy (Biden Citation2022). Here, bioeconomy implies the American bioeconomy through references such as pursuing ‘homegrown’ economic solutions (Department of Energy, Citation2022, 51). Strategic documents describe the U.S. bioeconomy as the path for a ‘resilient and competitive future,’ made possible by a ‘concerted national effort – a ‘warp speed’-type’ to manifest American biomanufacturing (Hodgson, Alper, and Maxon, Citation2022, 18). By framing the bioeconomy as a national project, developing the technology becomes a way to ensure a promising future for citizens. In line with this kind of thinking, SrR1 remarks how ‘US taxpayers are paying for [this research]; it belongs to us taxpayers. It's part of the bioeconomy. Your kids, my kids, the future of their jobs? We need to keep it here’ (22 March 2022). For SrR1, securing the lead (in technological innovation) and securing work for future generations are intertwined. Similarly, SrR2 was quick to point out that the recently established BioMADE was under the purview of the Department of Defense, not commerce, to serve America's national security interests through economic means (Department of Defense, Citation2020). Thus, the bioeconomy – as a catchphrase – conflates technological prowess with global leadership and taking care of a domestic workforce.

The perception of being the first and the fastest (or at least on track to becoming first) in a bioeconomy is often distilled in the form of scientific roadmaps, or official statements that synthesize current developments and future directions for a burgeoning research area. Most informants spoke of these roadmaps favorably, pointing to them as emblems of a research community coming together. However, one interviewee described roadmaps as a performative, flag-planting exercise (e.g. EBRC has published one annually since 2019). SrR6 argued that the field does not need another roadmap because a map, by definition, does not spell out where one wants to go; it only shows possibility. And in not naming where the field wants to or ought to go next, these roadmaps obscure who would take responsibility for steering a field in a certain direction:

I mean, where are we going? Since culturally it's been difficult to talk about biotechnology, let alone synthetic biology, instead, we’ve discovered that we can use the word bioeconomy. Jobs and money, as a proxy for reality[…]. So [these leaders] abdicate responsibility, particularly where we’re going. It's just: we’re gonna keep going […] because if you have the best tools, you’re going to make the best tools and you can be in the lead. (10 June 2022)

This extract highlights conflicting ideas about what responsibility means. First, to just ‘keep going’ and ‘be in the lead’ is seen as the responsible thing to do by at least some members of this scientific community who pen these roadmaps. These scientists view responsibility as leading and naming future directions of a bioeconomy that develops and benefits from the best tools. SrR6 sees this form of responsibility as being directionless, where tool development for the sake of being in the lead becomes proof that they ‘abdicate responsibility.’ This mismatch demonstrates how in not naming a direction to pursue, the roadmap functions as a political enunciation disguised as an apolitical one.Footnote5 It rallies for perpetual (bioeconomic) growth without naming it as such.

To act with urgency in these nationalized contexts requires that people promote biotechnology, both to advance its technical capacity and to advertise its use within the confines of a national regulatory framework. With this in mind, ECR1 describes how responsible engineering ought to act quickly:

This is a powerful technology that can help a lot of people and just benefit the world at large. And so, we want to make sure that the technology does not sit in a corner in a tube somewhere for ages and people don't see the benefit or don't receive the benefit of it. And so in that sense, [we] also promote genome engineering to actually hit the market to actually move forward. (06 May 2022)

ECR1's concern hinges on society not seeing the fruits of science's labor, where implied roadblocks (such as national-level regulations) would withhold the benefits that science could confer. In a similar way, several interviewees (SrR1, SrR4, SrR6) lambasted the moratoria placed on transgenic research after the 1975 Asilomar conference, expressing wariness about the prospect of such barriers being imposed again. Another interlocutor echoed ECR1's desire for ‘the right amount of caution’ but explained it in relation to pacing:

what's the cost of being really slow or not doing a thing? Of course, do the thing carefully and responsibly, but with the kind of power it [the technology] can possibly have, being slow can be quite dangerous. And you see that with, for instance, the pandemic. It was amazing that we managed to get vaccines as quickly as we did. But, what if we could have printed out that vaccine within a day of the virus and distributed it to everyone in the world? We probably could have done that on a technical timeline. And so, what was the cost of being very careful about development? (ECR5, 18 October 2022)

Their description frames responsibility as developing tools quickly, pointing to the deleterious effects of waiting in their pandemic-vaccine example.

Imbued throughout these conversations is the sense of inevitability: the dawn of genome engineering has already broken, the technical power already unleashed,Footnote6 making now the time to act. The momentum generated by this new technology is best captured in this comment by a senior researcher: ‘I hear people say things like, ‘if you’re not part of the steamroller, you’re part of the road. What do you want to be?’’ (10 June 2022). This axiom references Stewart Brand (Citation1987, 22), a tech-futurist credited with establishing the visions that allowed Silicon Valley to flourish, whose ideology combined entrepreneurial spirit with the drive to outdo foreign competitors. Thus, tied in with this urgency is often the hope for industrial growth, coupling technosocial problems with domestic economic solutions.

Responsibility towards research relations: response-ability and being in relation with (more-than-human) others in labs

The third theme shows scientists describing the relationships that comprise their day-to-day work, which can be analyzed in terms of responsible research, even when they do not call it so. Glerup et al. (Citation2017) found similar bottom-up practices ‘that rarely used the language of responsibility or ethics but which nonetheless involved efforts to ensure the production of ‘good science’ – meaning, science which was morally robust as well as technically excellent’ (324). Based on observation, it appears that efforts were directed at and framed in terms of maintaining and fostering relationships with other people (in the scientific community) as well as other more-than-human beings (in the lab). To better understand these relations, I borrow from literature in the subfields of multispecies studies and feminist technoscience, which have broadened the understanding of responsibility towards one of response-ability. Based on the data collected, response-ability in this context means the ability to respond (to colleagues, to research organisms, among others) in ways that help scientists produce ‘good science’ which, in turn, helps them make sense of responsible research and innovation.

Broadly, the term response-ability highlights the relationships through which one listens and responds, focusing on how one becomes capable of responding to other beings – human or otherwise. Response-able research shifts the focus away from organisms as objects of study (i.e. working on organismal research) towards the always and already co-constituted subjectivities of researcher-and-researched (i.e. working with organisms). In the words of Vinciane Despret, researchers working with animals ‘construct the possibility of engaging both the animals and themselves, through an embodied communication, into a ‘responsible’ relation. They become responsible through this relation’ (Citation2013, 70; see also Despret Citation2008, 129). To be response-able towards other beings (like research organisms) entails attending to their liveliness (upon which good science depends) and maintaining good relations with them. Over time, research relations become characterized by co-responding to each other (i.e. co-respondence).

In practice, response-ability accounts for the dynamicity of living systems and attuning to their specificities, such that one's responses in such systems are idiosyncratic and multiplicitous, not universal or the same every time. So when informants were discussing the relationships they have with their organisms, they often narrated their daily encounters, their stories of co-discovery over time, as well as their care/maintenance protocols, quirks, and affective moments that made up their research journeys with their particular organism. Consistent with other studies that examine human-nonhuman research in this entangled manner (horses in Despret Citation2008, dogs in Haraway Citation2008, cup corals in Hayward Citation2010, yeast in Calvert and Szymanski Citation2020), the informants had to be capable of response to even get this far into their respective research careers, and then continually render themselves capable of response for any novel research to unfold.

Being in relation with these organisms was both part and parcel of responsible (and response-able) research. In general, ECRs were more likely to be thinking about the specificities of their organism, whereas senior scholars often ‘trained in one organism and stuck with it’ (SrR4, 23 June 2022). For instance, ECR3 works with a non-model organism that enables more efficient research, compared to Arabidopsis/Nicotiana or maize/sorghum which take several weeks and several months to grow, respectively (i.e. the phenotypic results make the success of the experiment explicit and clear, but these results take the lifetime of the organism to show the efficacy of scientific intervention). They delighted in explaining to me that ‘I can just do so many more experiments in a shorter amount of time’ (17 June 2022). They also noted the difficulty of finding consistent funding because the organism neither has a sizable research community nor does it match the priorities of academic funding agencies – a critique already acknowledged within the scientific community (Gilbert Citation2009; Ostrov, Citationn.d.). But even with this noted disadvantage, ECR3's organism allows for quicker discoveries, including negative results that someone else in the scientific community does not have to pursue. In this way, the response-able relations that one has with their organism go on to shape the research outputs and thus the relations with other colleagues.

These collegial relations are another way that scientists seem to cultivate a sense of response-ability. Being in relation with colleagues means offering better alternatives to resource-intensive protocols in the lab and saving time so that others (usually future trainees) are spared the trouble. For instance, one lab focuses on building methodological capacity for an entire species family, ranging from DIY bots to new modes of efficient data collection. In a different lab, ECR2 speaks of their work on tissue culturing, and ‘trying to avoid that completely, to reduce the amount of time and effort that people have to usually do it. It's been the main bottleneck in plant biotechnology’ (06 May 2022). Later, when asked about the drivers for pursuing this work, they couple painful pasts with hopeful futures:

I mean, I poured my soul into tissue culture in graduate school, and any … any light that tells me that in the next whatever years, we can do away with that, that makes me pretty excited because I don't really want anybody to go through tissue culture. (ECR2, 06 May 2022, original emphasis)

For ECR2 (as well as ECR4 and ECR5), streamlining the process of inquiry is the responsible thing to do to serve the greater scientific community. As part of this endeavor, choosing ‘better’ organisms increases responsibility towards each other as collegial scientists, even if the organismal relation is an instrumentalized one (see Despret Citation2016 and Haraway Citation2008 for discussions on co-constituted work relations and shared suffering).

One of the non-technical informants (Non-Tech2) knew of the multispecies/feminist discourse on ‘response-ability’ prior to our conversation. They observed that biotechnologists today were more likely to practice a kind of ‘ecological-mindedness’ (19 October 2022), which acknowledges that humans do not exist in a vacuum and are always/already part of environments shared with other creatures. They suggested that framing relationships to – and through – nature could open up questions about shared responsibilities and problematize who gets to decide what to do. (See also the work of Natalie Kofler at Editing Nature.) So rather than ‘making moral claims’ about responsibility, they are interested in using response-ability as a way to think through all relations: ecological, interpersonal, societal, and otherwise:

It would be interesting to see the extent to which that [questions of response-ability] could shift how people think like, ‘we’re going to save the world’ … like, what if we’re just going to, first of all, listen to what actually people want saving from, and then build scientific research projects around that. Then you are doing the response. So responsibility would be kind of a built-in. (19 October 2022).

It may be that cultivating response-ability (the capacity to respond to the specific needs of other beings) stands at odds with the greater academic industrial complex, where the pressures of time – publish or perish, grant applications, teaching loads – continue impinging upon one's capacity to respond. The sense of crunch time was most palpable amongst the ECRs and SrRs, but even Non-Tech informants were sensitive to the primacy placed on time. Non-Tech1 notes how the pressures of time affect what gets reported in ELSI and National Science Foundation's Broader Impacts statements, because they rely on disclosing information to people in oversight roles:

And researchers will tell them things if they have a good relationship. And part of maintaining a good relationship is not wasting time. And so I think it's helpful to be efficient in the conversations that folks in oversight roles have with researchers, because researchers might get mad if they perceive you as butting in on what they think is their core responsibility of making science. (Non-Tech1, 19 October 2022, original emphasis)

To highlight some of Non-Tech1's comments: reporting hinges on having ‘a good relationship’ that avoids ‘wasting time’ because to interject is considered a hindrance to the internalized ‘core responsibility of making science.’ Everything outside of science is considered an impediment, which may include normative frameworks as a formality. Responsible research, in this regard, is about letting scientific research continue to be swift-acting.

Discussion

The responsibility narratives presented here – towards grand challenges, national values, and research relations – are all in service of making good science. Consistent with the findings of Glerup et al., bottom-up practices are ‘centred around responsibility for ensuring robust scientific process’ (324–325). While it could be said that aiming for academic rigor (e.g. to pass muster for eventually scaling up and commercializing) is part of research excellence across all scientific fields, the data suggests that organismal context helps align the three responsibility narratives into one: choosing better organisms allows one to conduct more efficient research, even when there is no funding for it, and conducting organismal research subsists upon and strengthens strong relational/collegial networks for even more efficiency, which enables one to operate with urgency and keep pace with the political/infrastructural momentum of a country galvanized by its bioeconomy.

This particular sample (in the American west) also shows the potential of market and industry ties infrastructurally (e.g. EBRC-hosted industry visits for trainees) as well as location-based hotbeds (e.g. La Jolla, Minneapolis) where graduates and lab personnel hold dual positions. While marketability falls beyond the scope of this paper, preliminary data suggests that responsibilities blur when industry-related interests enter. That is, purpose-driven research becomes ripe for commercializing and for making an impact where academic research falters.

The three responsibility narratives confirm an early diagnosis by J. Benjamin Hurlbut, who observed that ‘the field has constructed itself as able to respond, and thus as the right response, to basic problems of human welfare and security’ (Citation2014, original emphasis). A decade later, the data suggests that normative frameworks such as ELSI have not adapted, clarified, or otherwise integrated notions of understanding with the extant practices happening ‘on the ground.’ With responsibility narratives existing on separate planes from what has been colloquially referred to as ‘that ELSI stuff,’ there seems to be a mismatch and misalignment.Footnote7

How extant responsibility narratives explain and perpetuate existing ELSI critiques

The three main critiques of ELSI – on being too affirmative of new technologies, too late when called into interventions, and too divisive of the natural and social sciences – still hold true but, crucially, the empirical data presented here neither negates the need for ELSI research nor attends to its critiques. If anything, it exemplifies a mismatch between how responsibilities are narrativized across different actors within and adjacent to academic research institutions. The misalignment between responsibility narratives and normative frameworks like ELSI may be unsurprising for some, but left unaddressed, ELSI and the fields of synthetic biotechnologies risk continuing on their separate ways of operationalizing responsibility, which is symptomatic of a bifurcation already started with synthetic biotechnologies being diagnosed as a post-ELSI phenomenon.Footnote8

Responsibility narratives prioritize tool development (versus, say, equitable access to the technology), such that the making of science, for grand purposes, urgently, and in line with American values leaves little room for contestation. As a result, these narratives still enlist social scientists too late, usually in the affirmative stance.Footnote9 With productivity as the metric for success, affirming the technology paves the way for grants, patents, and publications to accumulate in a self-justifying manner (Hurlbut Citation2014, 1). In turn, the primacy placed on these outputs, like publications, can relegate responsibility practices to a lesser priority (Sigl, Felt, and Fochler Citation2020). Framing responsibility narratives in this productivist manner inadvertently sets up a contentious ‘problem framing’ that dismisses alternative, non-biotechnological approaches (Delborne, Kokotovich, and Lunshof Citation2020), and discourages critical questioning around whether genetic approaches ought to be pursued (Kuzma Citation2021) as well as ‘thinking deeply and creatively about whether a synthetic biology project is responsible and good for the world’ (iGEM Citation2022, emphasis added). Since robust tool development is the goal, anything that impedes the process stokes ‘the harms of inaction’ (Brister, Holbrook and Palmer Citation2021, 7) and a reticence to slow down or proceed with caution. And so the critique persists.

The first two responsibility narratives also benefit from a continued split between the natural and social sciences, especially with regard to quickly and productively making science. However, the one choke point frequently raised in interviews was that of public engagement, public acceptance, and public misunderstanding because they necessitate non-technical expertise. (On several occasions, interlocutors engaged with me only because they interpreted my role and project as an effort to communicate novel science towards a public-facing audience.) This public element represents the one knot that keeps the social sciences tethered to the natural sciences, and, seen as such, reactions from my interlocutors ranged from skepticism (‘how are you connected to all this again?’) to skittishness (‘you’re not the ELSI police are you?’) to curiosity (‘what exactly do you do?’) to exasperation (‘if only people knew the science’). Nevertheless, most conversations ended with the mantra of needing better and more science communicators.

And while science communication is certainly important, framing the perceived issue (differential notions of responsibility in science) as a ‘knowledge gap’ (about scientific mechanisms) reinforces an array of critiques. For one, this perpetuates a deficit model and the ways it obfuscates other non-communicative shortcomings (McNeil Citation2013, Nerlich Citation2017, Nisbet and Scheufele Citation2009, Wynne Citation2006). In wanting ‘to tell the story right’ (to borrow a phrase from one interlocutor), science communication can become a search for a public relations coordinator who can promote the technology in the advertising sense of the term, which falls in line with the critiques about ‘the wifely role’ of dutifully brokering affective relations, communications, and the semiotics of success (Balmer et al. Citation2015, 10–12). Gregory Bateson's concept of the double bind (Citation1956) might be useful here, in that natural scientists are faced with the conflicting realities of both needing and not wanting to need social scientists. The same goes for social scientists. In both cases, damned if one ‘tells the story right’ and affirms the technology; damned if one does not and the original problem persists.

What counts as ‘a problem’ and what gets slotted as ‘the solution’ to it are too malleable to fit the needs of a power-hungry industry, full of academic prestige and (mostly male) egos, where being able to say ‘we created life’ becomes a coveted status symbol. For most natural/social scientists, the existing terms of encounter (e.g. project-based grants) do not guide research in the direction of what Rabinow and Bennett have called ‘a shared field of problems’ or ‘dialogic and contingent forms of engagement’ when both are necessary for collaboration to take place (33). In this regard, the perceived need for more communication may instead be symptomatic of needing better relations: ‘instead of talking about people who need to be responsible, we need to talk about mutually obliging relationships that make responsibility possible’ (Szymanski, Smith, and Calvert Citation2021, 263, original emphasis). These include, but are not limited to, the relationships between researchers, research infrastructures, research cultures, research funding, research participants, research organisms, research metrics, and their various entanglements (Hey and Szymanski, Citation2022). Paying attention to these configurations will make the difference in how non-technical concerns and nuanced ideas of responsibility come to flourish.

How response-able research relies on more-than-human relations

The shift from parts-to-whole may not be what reconfigures responsibility per se, but, compared to, say, in silico efforts that completely remove the organism, the informants pointed to their organisms as the meaningful unit of responsible research. (For Non-Tech informants, their collegial relations with labs were the equivalent.) Whether because of the complexity of working with larger genomes or with organisms that fulfill a specific solution, informants framed their narratives in terms of applied, impactful, and timely interventions.

It may be that the ghosts of compliance past continue to haunt researchers in ways that compel them to narrativize their research in (human-centered) grand terms, when, in some instances, I observed more response-able research in the multispecies sense of the term. Because phenotypic results are often markers of notable data and experimental success, researchers had to have been familiar with their organism's peculiarities, especially if they were using a non-model organism for whom indexical resources are not as common or robust. As Vinciane Despret reminds us, the relations between researcher and subject are neither predetermined nor stable: ‘learning how to address the creatures being studied is not the result of scientific theoretical understanding, it is the condition of this understanding’ (Citation2004, 131, original emphasis). In turn, caring for these organisms becomes the means by which scientists can make good science. Here we see contingent relations: rendering oneself capable of response to these organisms renders one capable of doing good science, in which good science remains the gold standard of any notion of responsible research. Cultivating response-ability thus requires researchers to stay attuned to their organisms in knowledge-production settings.

Response-ability also calls into question the stance of ethical immunity enunciated by at least some interlocutors (i.e. ‘I just do the science’). Donna Haraway, for instance, uses urine to explain how responsibility weaves together canine-equine-human relations in pharmacological uses of Premarin® (so named because it is derived from PREgnant MARe's urINe). Premarin® is administered to menopausal women as an estrogen-based hormone replacement, but its histories draw on the lives of pregnant Canadian women, pregnant mares in Manitoba, pregnant zebras in Berlin, researchers at McGill University in the 1930s, contract farmers, and breast cancer patients.Footnote10 She argues that taking Premarin® is not simply a mode of self-medicating; it implicates all of the relations that went into making Premarin® a medication for her body (i.e. rendering herself capable of response to the histories, beings, and places involved). The takeaway here is that response-ability traverses time (historically) and spaces like ‘corporations, farms, clinics, labs, homes’ such that ‘sciences, technologies, and multispecies lives are entangled in multiscalar, multitemporal, multimaterial worlding, but the details matter. The details link actual beings to actual response-abilities’ (Citation2012, 312). Similarly, Maria Puig de la Bellacasa (Citation2010) mobilizes the phrase ‘matters of care’ to describe the ethico-political engagements that drive human actions. Details matter here too: ‘a way of caring over here could kill over there. Caring is more about a transformative ethos than an ethical application. We need to ask ‘how to care’ in each situation’ (100). Both response-ability and matters of care suggest that the material doings of responsible research are context-specific and situated, not just an abstraction about what is (or is not) ethical. Yet naming this kind of attention towards organisms is rarely explicit or considered part of research training. Or worse, it could become part of a rote checklist.

As Evelyn Fox Keller showed with the work of Barbara McClintock, it is through cumulative observations and understandings that one develops a ‘feeling for the organism,’ which, in turn, embraces the partiality of scientific knowing (Keller Citation1983, 198–199). This kind of organismal approach also demands attention to context-specific and value-laden research settings, because organisms come with their inherent attachments to histories, lore, and cultural significance, not to mention their material affordances and limitations. Consider how maize evokes the creation stories of Mesoamerica (Stross Citation2006) whereas transposons in maize do not. Since whole organisms carry historical, cultural, and environmental values that cells and genetic parts do not, perhaps it would serve us all well to think speculatively about the (geo- and bio-)political manifestations of organismal engineering, so that ‘those who study things can participate in their possible becomings’ (Puig de la Bellacasa, 100) and not disengage in the manner of Dr. Frankenstein.

Concluding thoughts

As responsible innovation scholarship often emphasizes, there is something to be said for maintaining the flexibility to decide what responsibility means, especially to avoid pre- or over-determining what it ought to be. This flexibility allows responsible research to remain adaptive and context-dependent (Doezema et al. Citation2019). As Randles et al. (Citation2016) have found, actors who participate in research and innovation govern according to whatever is conceived to be responsible at the time, fluctuating based on to whom or for what one feels responsible. Consistent with this literature, the data presented here suggest that American researchers are indeed fashioning their own narratives about responsibility, and that these narratives do not align with existing normative frameworks like ELSI. The data also confirm that subfields of synthetic biotechnologies do not hold unified ideas about responsibility while, simultaneously, these groups perceive that they are being regulated as if there were cohesion around its definition.

Amidst this mismatch, what stabilizes the responsibility narratives in this particular (American) setting is scientific activity: doing something about looming crises and having something to show for it, especially to peers, colleagues, and communities involved in this niche sector. This relational aspect may offer future insights into how to keep notions of responsibility adaptive and dynamic (i.e. how to escape the risk of reproducing static, either/or judgments of responsible versus irresponsible research).

One area of future research might focus on the work of non-technical personnel, precisely because they are embedded in and working alongside these scientific communities, and are thereby building the capacity to respond from within. Looking at how scientists internalize nationally and culturally shared notions of responsibility, I echo the feminist call to prioritize situated notions and practices of responsible research. I also call for multispecies approaches that foreground more-than-human and attuned ways of becoming response-able towards other beings. Engaging with non-technical personnel and with scientists who already practice response-ability (without naming it as such) could lay the infrastructural and community-level groundwork for future research as scientific fields increasingly engage with larger scales of life.

Acknowledgements

Thanks to the anonymous reviewers who provided valuable insights to strengthen this paper. My gratitude also goes to Tess Doezema whose patience and rigor have taught me so much about academic generosity.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Additional information

Funding

Open access funded by Helsinki University Library. Data collection was supported by National Science Foundation [grant number 2114750].

Notes on contributors

Maya Hey

Maya Hey is a postdoctoral researcher at the University of Helsinki. Her research examines human response-ability in a more-than-human world, calling upon experiential, non-Western, and multispecies approaches to collaborative projects.

Notes

1 GP-write's emphasis on human cells comes from the group's original naming. The group named itself Human Genome Project-Write (HGP-write) to honor the completion of the HGP, which they refer to as ‘HGP-read.’ See also Strickland (Citation2021) for GP-write's technical capacity for writing entire genomes.

2 This critique exists internally with ELSI researchers as well, where ELSI is seen as an added tax since ‘ELSI research shares funding with the science it observes’ (Dolan, Lee, and Cho, Citation2022, 6).

3 Naming fields can inadvertently draw boundaries and pledge allegiances, so I use the phrase ‘synthetic biotechnologies’ to mean the likes of synthetic biology, engineering biology, and synthetic genomics, without erasing their respective lineages or melding them into a unified discipline. For instance, naming ‘synthetic genomics’ may imply a defined group or subdiscipline (e.g. Viridos, f.k.a. Synthetic Genomics Inc.) but I use it here to indicate an emerging field of practitioners who are aiming beyond the scale of genetic circuits and chromosomes.

4 If anything, US regulatory bodies cause massive delays in approvals (Schairer et al. Citation2021) or place moratoriums (e.g., as was done with a 2002 project that made polio virus from scratch, which led to cancelled public funding for research on DNA synthesis).

5 Roadmapping and increasing jobs in a national context are not unique to the American context. Marris and Calvert (Citation2020) also note how a roadmapping exercise in the UK used the justification that ‘other countries would take the lead’ to speed up the roadmap's production timeline (48).

6 The benefits of using synthetic biotechnology were often narrated in terms of power, as in ‘a powerful technology’ or ‘having the power to change society.’ Often these phrases would have ‘responsibility’ nearby, echoing the Peter Parker-esque axiom ‘with great power comes great responsibility’ which may point to a cultural artefact of the American psyche.

7 The Japanese idiom ‘kamiawanai’ (噛み合わない) comes to mind with this mention of mismatch and misalignment. The phrase literally refers to ‘not matching in bite’ as when the teeth of a top and bottom jaw do not align. As a figure of speech, the phrase is used to describe when conversations are cross-talking, when people in dialogue are not on the same page. It can also be used to describe when a gear's teeth do not sync up with another's.

8 Although focusing on the Norwegian context, Myskja, Nydal, and Myhr (Citation2014, 15) have critiqued the post-ELSI move altogether, arguing that to separate post-ELSI from ELSI research can be delimiting and obscures the integrative and collaborative ‘side-by-side’ work on non-technical concerns.

9 Notable exceptions exist, such as project-specific collaborations involving entire centers and institutes with teams dedicated to studying the social, political, and legal dimensions of biotechnology (e.g. Institute for Practical Ethics, Kavli Center for Ethics, Science, and the Public). However, some social scientists note how their roles are predetermined by the project scope, the funding agency, or the way employment contracts are written, distributed, or (sometimes not) renewed. To keep these work arrangements from collapsing into a perfunctory or performative relationship, natural and social scientists would do well to examine the infrastructural, intersectional, and political forces that underwrite all collaborations. 

10 There is a whole B-side to Haraway's article focusing on diethylstilbestrol (DES), a synthetic drug used to treat incontinence (in dogs until the early 2000s) and advanced cancers (in humans until the 1990s). It was also used as a growth promoter in cattle in the 1950s and 1960s. As with the Premarin® example, Haraway knots together the implicated lives, both human and more-than-human, across spaces that range from clinics to farms to homes.

References

  • Ankeny, Rachel, and Sabina Leonelli. 2020. Model Organisms. Cambridge: Cambridge University Press.
  • Balmer, Andrew S., and Kate J. Bulpin. 2013. “Left to Their Own Devices: Post-ELSI, Ethical Equipment, and the International Genetically Engineered Machine (iGEM) Competition.” BioSocieties 8 (3): 311–335. https://doi.org/10.1057/biosoc.2013.13.
  • Balmer, Andrew S., Kate J. Bulpin, Jane Calvert, Matthew Kearnes, Adrian Mackenzie, Claire Marris, Paul Martin, et al. 2012. “Towards a Manifesto for Experimental Collaborations Between Social and Natural Scientists.” Accessed February 11, 2022. https://experimentalcollaborations.wordpress.com/2012/07/03/towards-a-manifesto-for-experimental-collaborations-between-social-and-natural-scientists/.
  • Balmer, Andrew S., Jane Calvert, Claire Marris, Susan Molyneux Hodgson, Emma Frow, Matthew Kearnes, Kate Bulpin, et al. 2015. “Taking Roles in Interdisciplinary Collaborations: Reflections on Working in Post-ELSI Spaces in the UK Synthetic Biology Community.” Science & Technology Studies 28 (3): 3–25. https://doi.org/10.23987/sts.55340.
  • Balmer, Andrew S., Claire Marris, Jane Calvert, Susan Molyneux-Hodgson, Matthew Kearnes, Kate J. Bulpin, Emma Frow, et al. 2016. “Five Rules of Thumb for Post-ELSI Interdisciplinary Collaborations.” Journal of Responsible Innovation 3 (1): 73–80. https://doi.org/10.1080/23299460.2016.1177867.
  • Bateson, Gregory, Don D. Jackson, Jay Haley, and John Weakland. 1956. “Toward a Theory of Schizophrenia.” Behavioral Science 1: 251–264. https://doi.org/10.1002/bs.3830010402.
  • Biden, Joseph. 2022. “Executive Order on Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy.” Accessed September 15, 2022. https://www.whitehouse.gov/briefing-room/presidential-actions/2022/09/12/executive-order-on-advancing-biotechnology-and-biomanufacturing-innovation-for-a-sustainable-safe-and-secure-american-bioeconomy/.
  • Boeke, Jef D., George Church, Andrew Hessel, Nancy J. Kelley, Adam Arkin, Yizhi Cai, Rob Carlson, et al. 2016. “The Genome Project-Write.” Science 353 (6295): 126–127. https://doi.org/10.1126/science.aaf6850.
  • Brand, Stewart. 1987. The Media Lab: Inventing the Future at MIT. New York: Viking Press.
  • Brister, Evelyn, J. Britt Holbrook, and Megan J. Palmer. 2021. “Conservation Science and the Ethos of Restraint.” Conservation Science and Practice 3 (e381): 1–9. https://doi.org/10.1111/csp2.381.
  • Calvert, Jane, and Paul Martin. 2009. “The Role of Social Scientists in Synthetic Biology.” EMBO Reports 10 (3): 201–204. https://doi.org/10.1038/embor.2009.15.
  • Calvert, Jane, and Erika A. Szymanski. 2020. “A Feeling for the (Micro)organism? Yeastiness, Organism Agnosticism and Whole Genome Synthesis.” New Genetics and Society 39 (4): 385–403. https://doi.org/10.1080/14636778.2020.1736537.
  • Carter, Sarah R., Michael Rodemeyer, Michele S. Garfinkel, and Robert M. Friedman. 2014. “Synthetic Biology and the U.S. Biotechnology Regulatory System: Challenges and Options.” https://www.jcvi.org/sites/default/files/assets/projects/synthetic-biology-and-the-us-regulatory-system/full-report.pdf/.
  • Chari, Raj, and George Church. 2017. “Beyond Editing to Writing Large Genomes.” Nature Reviews Genetics 18 (12): 749–760. https://doi.org/10.1038/nrg.2017.59.
  • Charmaz, Kathy. 2006. Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. London: Sage.
  • Dabars, William B., and Kevin T. Dwyer. 2022. “Toward Institutionalization of Responsible Innovation in the Contemporary Research University: Insights from Case Studies of Arizona State University.” Journal of Responsible Innovation 9 (1): 114–123. https://doi.org/10.1080/23299460.2022.2042983.
  • Dan-Cohen, Talia. 2021. A Simpler Life: Synthetic Biological Experiments. Ithaca, NY: Cornell University Press.
  • Delborne, Jason A., Adam E. Kokotovich, and Jeantine E. Lunshof. 2020. “Social License and Synthetic Biology: the Trouble with Mining Terms.” Journal of Responsible Innovation 7 (3): 280–297. https://doi.org/10.1080/23299460.2020.1738023.
  • Department of Defense. 2020. “DOD Approves $87 Million for Newest Bioindustrial Manufacturing Innovation Institute.” Press Release. Accessed March 10, 2022. https://www.defense.gov/News/Releases/Release/Article/2388087/dod-approves-87-million-for-newest-bioindustrial-manufacturing-innovation-insti/.
  • Department of Energy. 2022. “Department of Energy FY 2023 Congressional Budget Request.” Dept. of Energy, U.S.A. Accessed November 12, 2022. https://www.energy.gov/sites/default/files/2022-04/doe-fy2023-budget-volume-4.pdf.
  • Despret, Vinciane. 2004. “The Body We Care for: Figures of Anthropo-Zoo-Genesis.” Body & Society 10 (2-3): 111–134. https://doi.org/10.1177/1357034X04042938.
  • Despret, Vinciane. 2008. “The Becomings of Subjectivity in Animal Worlds.” Subjectivity 23 (1): 123–139. https://doi.org/10.1057/sub.2008.15.
  • Despret, Vinciane. 2013. “Responding Bodies and Partial Affinities in Human–Animal Worlds.” Theory, Culture & Society 30 (7-8): 51–76. https://doi.org/10.1177/0263276413496852.
  • Despret, Vinciane. 2016. What Would Animals Say If We Asked the Right Questions? Minneapolis: University of Minnesota Press.
  • Doezema, Tess, David Ludwig, Phil Macnaghten, Clare Shelley-Egan, and Ellen-Marie Forsberg. 2019. “Translation, transduction, and transformation: expanding practices of responsibility across borders.” Journal of Responsible Innovation 6 (3): 323–331. https://doi.org/10.1080/23299460.2019.1653155.
  • Dolan, Deanne Dunbar, Sandra Soo-Lin Lee, and Mildred K. Cho. 2022. “Three Decades of Ethical, Legal, and Social Implications Research: Looking Back to Chart a Path Forward.” Cell Genomics 2 (7): 100150. https://doi.org/10.1016/j.xgen.2022.100150.
  • Endy, Drew, and Laurie Zoloth. 2016. “Should We Synthesize a Human Genome?” Accessed September 15, 2021. https://dspace.mit.edu/bitstream/handle/1721.1/102449/ShouldWeGenome.pdf/.
  • Evans, Sam Weiss. 2022. “When All Research Is Dual Use.” Issues in Science and Technology 38 (3): 84–87. https://issues.org/wp-content/uploads/2022/05/84-87-Evans-Dual-Use-Spring-2022.pdf/.
  • Frow, Emma. 2017. “From ‘Experiments of Concern’ to ‘Groups of Concern’: Constructing and Containing Citizens in Synthetic Biology.” Science, Technology, & Human Values 45 (6): 1038–1064. https://doi.org/10.1177/0162243917735382.
  • Genome Writers Guild. n.d. Accessed October 3, 2021. https://www.genomewritersguild.org/.
  • Gibson, Daniel G., John I. Glass, Carole Lartigue, Vladimir N. Noskov, Ray-Yuan Chuang, Mikkel A. Algire, Gwynedd A. Benders, et al. 2010. “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome.” Science 329 (5987): 52–56. https://doi.org/10.1126/science.1190719.
  • Gilbert, Scott F. 2009. “The Adequacy of Model Systems for Evo-Devo: Modeling the Formation of Organisms/Modeling the Formation of Society.” In Mapping the Future of Biology, edited by Anouk Barberousse, Michel Morange, and Thomas Pradeu, Boston Studies in the Philosophy of Science, 57–68. https://doi.org/10.1007/978-1-4020-9636-5.
  • Glaser, Barney G. 1978. Theoretical Sensitivity: Advances in the Methodology of Grounded Theory. Mill Valley: Sociology Press.
  • Glerup, Cecilie, Sarah R. Davies, and Maja Horst. 2017. “‘Nothing Really Responsible Goes on Here’: Scientists’ Experience and Practice of Responsibility.” Journal of Responsible Innovation 4 (3): 319–336. https://doi.org/10.1080/23299460.2017.1378462.
  • Haraway, Donna J. 2008. When Species Meet. Minneapolis: University of Minnesota Press.
  • Haraway, Donna J. 2012. “Awash in Urine: DES and Premarin® in Multispecies Response-ability.” Women's Studies Quarterly 40 (1-2): 301–316. https://doi.org/10.1353/wsq.2012.0005.
  • Hayward, Eva. 2010. “FINGERYEYES: Impressions of Cup Corals.” Cultural Anthropology 25 (4): 577–599. https://doi.org/10.1111/j.1548-1360.2010.01070.x.
  • Hey, Maya, and Erika Szymanski. 2022. “Following the Organism to Map Synthetic Genomics.” Biotechnology Notes 3: 50–53. https://doi.org/10.1016/j.biotno.2022.07.001.
  • Hodgson, Andrea, Joe Alper, and Mary E. Maxon. 2022. The U.S. Bioeconomy: Charting a Course for a Resilient and Competitive Future. New York: Schmidt Futures. https://doi.org/10.55879/d2hrs7zwc.
  • Hurlbut, J. Benjamin. 2014. “Reimagining Responsibility in Synthetic Biology.” Workshop on the Research Agendas in the Societal Aspects of Synthetic Biology. http://cns.asu.edu/sites/default/files/hurlbutb_synbiopaper_2014.pdf/.
  • Hurlbut, J. Benjamin. 2015a. “Limits of Responsibility: Genome Editing, Asilomar, and the Politics of Deliberation.” Hastings Center Report 45 (5): 11–14. https://doi.org/10.1002/hast.484.
  • Hurlbut, J. Benjamin. 2015b. “Reimagining Responsibility in Synthetic Biology.” Journal of Responsible Innovation 2 (1): 113–116. https://doi.org/10.1080/23299460.2015.1010770.
  • Hurlbut, J. Benjamin, Krishanu Saha, and Sheila Jasanoff. 2015. “CRISPR Democracy: Gene Editing and the Need for Inclusive Deliberation.” Issues in Science and Technology 32 (1). https://issues.org/crispr-democracy-gene-editing-inclusive-deliberation/.
  • iGEM. 2022. “Navigating Towards a Responsible Future with Synthetic Biology.” Accessed March 10, 2022. https://responsibility.igem.org/.
  • Keller, Evelyn Fox. 1983. A Feeling for the Organism: The Life and Work of Barbara McClintock. San Francisco: W. H. Freeman and Company.
  • Kelley, Nancy J. and Associates. 2014. “SynBERC Sustainability Initiative Initial Findings & Recommendations.” Accessed March 10, 2022. https://nancyjkelley.com/wp-content/uploads/Final-Synberc-Sustainability-Report.pdf/.
  • Kleinman, Daniel Lee. 2003. Impure Cultures: University Biology and the World of Commerce. Madison: University of Wisconsin Press.
  • Kuhlmann, Stefan, and Arie Rip. 2018. “Next-Generation Innovation Policy and Grand Challenges.” Science and Public Policy 45 (4): 448–454. https://doi.org/10.1093/scipol/scy011.
  • Kuzma, Jennifer. 2021. “Procedurally Robust Risk Assessment Framework for Novel Genetically Engineered Organisms and Gene Drives.” Regulation and Governance 15 (4): 1144–1165. https://doi.org/10.1111/rego.12245.
  • Marris, Claire, and Jane Calvert. 2020. “Science and Technology Studies in Policy: The UK Synthetic Biology Roadmap.” Science, Technology, & Human Values 45 (1): 34–61. https://doi.org/10.1177/0162243919828107.
  • McNeil, Maureen. 2013. “Between a Rock and a Hard Place: The Deficit Model, the Diffusion Model and Publics in STS.” Science as Culture 22 (4): 589–608. https://doi.org/10.1080/14636778.2013.764068.
  • Myskja, Bjørn Kåre, Rune Nydal, and Anne Ingeborg Myhr. 2014. “We Have Never Been ELSI Researchers—There is No Need for a Post-ELSI Shift.” Life Sciences, Society and Policy 10 (9). https://doi.org/10.1186/s40504-014-0009-4.
  • National Academy of Sciences. 2013. “2, Synthetic Biology: Science and Technology for the New Millennium.” In Positioning Synthetic Biology to Meet the Challenges of the 21st Century: Summary Report of a Six Academies Symposium Series, edited by Committee on Science, Technology, and Law; Policy and Global Affairs; Board on Life Sciences; Division on Earth and Life Sciences; National Academy of Engineering; National Research Council, 7–16. Washington (DC): National Academies Press (US). https://www.ncbi.nlm.nih.gov/books/NBK202049/.
  • National Human Genome Research Institute. 2012a. “Ethical, Legal and Social Implications (ELSI) Research.” Accessed March 10, 2022. https://www.genome.gov/10002229/elsi-research-new-goals-for-the-next-5-years.
  • National Human Genome Research Institute. 2012b. “Review of the Ethical, Legal and Social Implications Research Program and Related Activities (1990–1995).” Accessed March 10, 2022. https://www.genome.gov/10001747/elsi-program-review-19901995.
  • Nerlich, Brigitte. 2017. “Digging for the Roots of the Deficit Model.” UoN Blogs: Making Science Public. Accessed May 20, 2022. https://blogs.nottingham.ac.uk/makingsciencepublic/2017/02/25/digging-for-the-deficit-model/.
  • Nisbet, Matthew C., and Dietram A. Scheufele. 2009. “What's Next for Science Communication? Promising Directions and Lingering Distractions.” American Journal of Botany 96 (10): 1767–1778. https://doi.org/10.3732/ajb.0900041.
  • Ostrov, Nili, Jacob Beal, Tom Ellis, D. Benjamin Gordon, Bogumil J. Karas, Henry H. Lee, Scott C. Lenaghan, et al. 2019. “Technological Challenges and Milestones for Writing Genomes: synthetic genomics requires improved technologies.” Science 366 (6463): 310–312. https://doi.org/10.1126/science.aay0339.
  • Ostrov, Nili. n.d. “Why Is It So Hard to Do Research in Non-Model Organisms?” Cultivarium. Accessed February 11, 2023. https://www.cultivarium.org/news/non-model-organisms/.
  • Palsson, Bernhard. 2000. “The Challenges of In silico Biology.” Nature Biotechnology 18 (11): 1147–1150. https://doi.org/10.1038/81125.
  • Powell, Kendall. 2018. “How Biologists are Creating Life-like Cells from Scratch.” Nature 563 (7730): 172–175. https://doi.org/10.1038/d41586-018-07289-x.
  • Presidential Commission for the Study of Bioethical Issues. 2010. New Directions: The Ethics of Synthetic Biology and Emerging Technologies. https://www.genome.gov/27542921/the-ethics-of-synthetic-biology-and-emerging-technologies/.
  • Puig de la Bellacasa, María. 2010. “Matters of Care in Technoscience: Assembling Neglected Things.” Social Studies of Science 41 (1): 85–106. https://doi.org/10.1177/0306312710380301.
  • Rabinow, Paul, and Gaymon Bennett. 2012. Designing Human Practices: An Experiment with Synthetic Biology. Chicago: University of Chicago Press.
  • Randles, Sally, Philippe Laredo, Allison Loconto, Bart Walhout, Ralf Lindner, et al. 2016. “Framings and Frameworks: Six Grand Narratives of de facto rri.” In Navigating Towards Shared Responsibility, edited by Ralf Lindner, Stefan Kuhlmann, Sally Randles, Bjørn Bedsted, Guido Gorgoni, Erich Griessler, and Allison Loconto, 31–36. Res-AGorA. https://pure.au.dk/portal/files/98634660/RES_AGorA_ebook.pdf/.
  • Robinson, Douglas K. R., Angela Simone, and Marzia Mazzonetto. 2021. “RRI Legacies: Co-creation for Responsible, Equitable and Fair Innovation in Horizon Europe.” Journal of Responsible Innovation 8 (2): 209–216. https://doi.org/10.1080/23299460.2020.1842633.
  • Roosth, Sophia. 2017. Synthetic: How Life Got Made. Chicago: University of Chicago Press.
  • Sc2.0 Statement of Ethics and Governance. 2013. Accessed October 3, 2021. https://www.mq.edu.au/__data/assets/pdf_file/0005/605966/Sc2_EthicsAndGovernanceAgreement_131124final.pdf.
  • Schairer, Cindy, James Najera, Anthony A. James, Omar S. Akbari, and Cinnamon S. Bloss. 2021. “Oxitec and MosquitoMate in the United States: lessons for the future of gene drive mosquito control.” Pathogens and Global Health 115 (6): 365–376. https://doi.org/10.1080/20477724.2021.1919378.
  • Schyfter, Pablo, and Jane Calvert. 2015. “Intentions, Expectations, and Institutions: Engineering the Future of Synthetic Biology in the USA and the UK.” Science as Culture 24 (4): 359–383. https://doi.org/10.1080/09505431.2015.1037827.
  • Shapiro, Beth. 2020. How to Clone a Mammoth: The Science of De-Extinction. Princeton: Princeton University Press.
  • Si, Tong, and Huimin Zhao. 2016. “A Brief Overview of Synthetic Biology Research Programs and Roadmap Studies in the United States.” Synthetic and Systems Biotechnology 1 (4): 258–264. https://doi.org/10.1016/j.synbio.2016.08.003.
  • Sigl, Lisa, Ulrike Felt, and Maximilian Fochler. 2020. “‘I am Primarily Paid for Publishing … ’: The Narrative Framing of Societal Responsibilities in Academic Life Science Research.” Science and Engineering Ethics 26 (3): 1569–1593. https://doi.org/10.1007/s11948-020-00191-8.
  • Stilgoe, Jack, Richard Owen, and Phil Macnaghten. 2013. “Developing a Framework for Responsible Innovation.” Research Policy 42 (9): 1568–1580. https://doi.org/10.1016/j.respol.2013.05.008.
  • Strickland, Eliza. 2021. “With This CAD for Genomes, You Can Design New Organisms: Forthcoming Software from the GP-Write Consortium Aims to Make Large-Scale Genome Editing and Design More Accessible.” IEEE Spectrum. Accessed March 10, 2022. https://spectrum.ieee.org/with-this-cad-for-genomes-you-can-design-new-organisms/.
  • Stross, Brian. 2006. “Maize in Word and Image in Southeastern Mesoamerica.” In Histories of Maize: Multidisciplinary Approaches to the Prehistory, Linguistics, Biogeography, Domestication, and Evolution of Maize, edited by John Staller, Robert Tykot, and Bruce Benz, 578–599. New York: Routledge. https://doi.org/10.4324/9781315427331.
  • Szymanski, Erika A., Robert D. J. Smith, and Jane Calvert. 2021. “Responsible Research and Innovation Meets Multispecies Studies: Why RRI Needs to be a More-Than-Human Exercise.” Journal of Responsible Innovation 8 (2): 261–266. https://doi.org/10.1080/23299460.2021.1906040.
  • Szymanski, Erika, Tarsh Bates, Elise Cachat, Jane Calvert, Oron Catts, Lenny J. Nelson, Susan J. Rosser, Robert D. J. Smith, and Ionat Zurr. 2020. “Crossing Kingdoms: How Can Art Open Up New Ways of Thinking About Science?” Frontiers in Bioengineering and Biotechnology 8 (715): 1–7. https://doi.org/10.3389/fbioe.2020.00715.
  • Välikangas, Anita. 2022. “The Uses of Grand Challenges in Research Policy and University Management: Something for Everyone.” Journal of Responsible Innovation 9 (1): 93–113. https://doi.org/10.1080/23299460.2022.2040870.
  • Wolfe, Amy K. 2015. “Societal Aspects of Synthetic Biology: Organisms and Applications Matter!” Journal of Responsible Innovation 2 (1): 121–123. https://doi.org/10.1080/23299460.2014.1001972.
  • Wynne, Brian. 2006. “Public Engagement as a Means of Restoring Public Trust in Science – Hitting the Notes, but Missing the Music?” Community Genetics 9 (3): 211–220. https://doi.org/10.1159/000092659.