Perspectives

What's the matter with biosecurity?

Pages 88–91 | Received 07 Nov 2014, Accepted 21 Dec 2014, Published online: 19 Jan 2015

Abstract

What constitutes a threat in synthetic biology depends on what we think needs to be secured. Research on biosecurity needs to focus more on how to value and govern competing definitions of security and society.

‘If you don't have security, you don't have society' might be a good way to describe the general thinking about the role that a perception of security plays in ensuring we can do all the other things that constitute society. Such a simple statement, however, misses two basic points about security: those who have the power to define the objects, subjects, and governance mechanisms of security are also fundamentally shaping the type of society in which we live; and most of the things we value in society flourish only when we do not need to be concerned with security. Who decides which conceptions of security dominate, and whether an issue is about security or something else, should be a matter of deliberation rather than assumed to fall to one group or another.

The biosecurity community is well versed in the need for security to be a central topic of discourse for states, and biosecurity professionals have argued for broadening the conception of security beyond traditional state concerns to include things like health, agriculture, and building design. But in doing so, the argument is usually for those other communities to take on a security discourse, rather than for the security community to promote its goals by taking on the discourses of health, agriculture, and so on (e.g. Bernard 2013). In the USA, such arguments rest on the assumption that security is the language Washington listens to, and that the places worried about security are the places with money for research. That these assumptions are often borne out speaks to the power the security discourse has in shaping (at least American) society.

But what type of security are these arguments talking about? Rabinow and Bennett (2012) argue strongly that the pervasive understanding of security is one based on a framing of the ‘dual-use' problem: that there are bad actors out there who must be prevented from using our knowledge and technology against us. While there are likely many cases where we can know the enemy, know the technology that might harm, and prevent the two from coming together with destructive results, there are just as many cases where the subjects, objects, and actions of security concern are not known. Focusing only on the former is akin to looking for your keys under the streetlamp because that is where the light is. Export controls, Institutional Biosafety Committees, even the newly minted US Government Dual-Use Research of Concern (USG DURC) policies fall into this camp.[1] So what would it mean to govern security concerns that are not yet known? In particular, how should we determine whether an area of synthetic biology research does or does not constitute a security concern, and who should have a role in that process?

These questions should be a central strand of research and action in the coming decade. Answering them means finding meaningful ways past the traditional framing of security as a dual-use concern where that framing is found wanting, such as when we cannot determine the pathogenicity of novel organisms. It means fully appreciating that potential security concerns may also be health concerns, or concerns about the environment, economy, or morality. We could say that the recent Ebola outbreak in West Africa is a security issue, but what would be gained, and more importantly lost, if we considered Ebola a security rather than a health issue? Crucially, this line of research should push against calls to think of everything as a potential security problem. This is not a call for securitization. Rather, it is a call to reassess the ways that we make the subjects and objects of security concern (Rappert 2014). For example, what assumptions about the innovation process, the role of science in society, and the relationship between science and security are we making when we build governance mechanisms that rely on scientists to raise potential security concerns about work they are undertaking? This is the structure of the USG DURC policy. How would those assumptions need to change if, for instance, we instituted broader systems of prior research approval?

Security should not be the trump card it is so often wielded as in the USA today, particularly when threats are contested, unknown, or ambiguous. Instead, it should be a discourse of equal standing with the many others that form our society. In the same vein as the opening statement, perhaps we could also say, ‘As war is politics by other means,[2] talking about security is talking about the economy, health, and environment by other means.' The trouble is, as with war, talking about security tends to prevent, or at least overshadow, these other ways of addressing an issue. Security framings reshape what constitutes other parts of society, just as Clausewitz argued that war reshapes politics (Strachan 2007, 176). We should therefore be careful when we use them, lest we lose what we are hoping to preserve.

Work within Science and Technology Studies (STS), particularly the responsible research and innovation literature highlighted in this Journal (see also Owen, Heintz, and Bessant 2013), has been promoting alternatives to many established ways of understanding, building, and governing innovation systems. But much work remains to be done, particularly in studying alternatives to traditional ways of framing and governing security concerns.

But this is not just a field of study. STS has a history of being quite poorly integrated into American political thought and governing institutions. The National Nanotechnology Initiative was a preliminary step towards changing that, and synthetic biology, and perhaps next geoengineering, are areas where STS scholars' insights have an opportunity to be influential in shaping future governance and thinking. The recent work of Ken Oye and the Wilson Center in the USA, and of several scholars in the UK, are examples of this reshaping at work (Kuiken et al. 2014; Jefferson, Lentzos, and Marris 2014). Oye, for instance, is encouraging early engagement from a broad range of people to establish a process for deciding whether and how to conduct further research on gene drives (Oye et al. 2014), though whether this will have any perceptible impact on policy remains to be seen.

The National Science Foundation and other funding bodies have a key role to play in reshaping our understanding of what it means to engage in biosecurity governance. If they continue to fund work on the societal aspects of emerging technology as an add-on to other research, as was the case with Synberc (Rabinow and Bennett 2012, 16), they can only expect to get back findings that show how the current system works (or, more often, doesn't). Strengthening the independent STS funding stream within the NSF is highly desirable, though specific work on governance should demonstrate a strong empirical grounding in both the science being studied and the policies and regulations being addressed. Finding ways to put together a new process whereby research and development proceed in step with the development of governance and public engagement would be an excellent use of funds.

How might synthetic biology research be different if we paid closer attention to how we decide what constitutes a security concern, and who decides? We would certainly spend more time thinking about the type of society we want to live in, and about the basic conditions for the existence of that society. We might also start searching for paths forward that represent not the best solution from any particular idea of what society should look like, but rather a common course of action that allows for different moral and epistemological justifications of that action (Verweij and Thompson 2006). ‘If you don't have society, you don't have security.'

Acknowledgement

The author is grateful to Megan Palmer, Emma Frow, and Ken Oye for discussions that helped form the ideas expressed here, and for the comments of two anonymous reviewers.

Funding

This work was supported by the UK ESRC/AHRC/DSTL Science and Security programme [grant number ES/K011308/1], and by the US NSF Synthetic Biology Engineering Research Center (Synberc).

Notes on contributor

Sam Weiss Evans studies the ways security concerns are identified, constructed, and governed in areas of emerging technology. For the last three years he has been working in the area of synthetic biology. His other areas of focus include export controls and cybersecurity.

Notes

1. For export controls, see Evans (2014, 44–48). For Institutional Biosafety Committees, see Tucker (2010, 41–48). The United States Government Policy for Institutional Oversight of Life Sciences Dual Use Research of Concern (September 2014) is available at http://www.phe.gov/s3/dualuse/Documents/durc-policy.pdf. For an analysis of DURC as a governance tool, see Rappert (2014).

2. A nod to Clausewitz (1989 [1832]).

References

  • Bernard, K. W. 2013. “Health and National Security: A Contemporary Collision of Cultures.” Biosecurity and Bioterrorism 11 (2): 157–162. doi: 10.1089/bsp.2013.8522
  • Clausewitz, C. von. 1989 [1832]. On War, Indexed Edition. Translated by M. E. Howard and P. Paret. Princeton, NJ: Princeton University Press.
  • Evans, S. A. W. 2014. Revising Export Control Lists. Brussels: Flemish Peace Institute.
  • Jefferson, C., F. Lentzos, and C. Marris. 2014. “Synthetic Biology and Biosecurity: Challenging the ‘Myths.'” Frontiers in Public Health 2, article no. 115.
  • Kuiken, T., G. Dana, K. Oye, and D. Rejeski. 2014. “Shaping Ecological Risk Research for Synthetic Biology.” Journal of Environmental Studies and Sciences 4: 191–199. doi: 10.1007/s13412-014-0171-2
  • Owen, R., M. Heintz, and J. R. Bessant, eds. 2013. Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. London: Wiley.
  • Oye, K. A., K. Esvelt, E. Appleton, F. Catteruccia, G. Church, T. Kuiken, and J. P. Collins. 2014. “Regulating Gene Drives.” Science 345 (6197): 626–628. doi: 10.1126/science.1254287
  • Rabinow, P., and G. Bennett. 2012. Designing Human Practices: An Experiment with Synthetic Biology. Chicago: University of Chicago Press.
  • Rappert, B. 2014. “Why Has Not There Been More Research of Concern?” Frontiers in Public Health 2, article no. 74. doi: 10.3389/fpubh.2014.00074
  • Strachan, H. 2007. Clausewitz's On War: A Biography. New York: Atlantic Monthly Press.
  • Tucker, J. B. 2010. Double-Edged Innovations: Preventing the Misuse of Emerging Biological/Chemical Technologies (No. ASCO 2010 018). Defense Threat Reduction Agency Advanced Systems and Concepts Office. pp. 44–49. http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA556984.
  • Verweij, M., and M. Thompson, eds. 2006. Clumsy Solutions for a Complex World. London: Palgrave Macmillan.
