‘Nothing really responsible goes on here’: scientists’ experience and practice of responsibility

Pages 319-336 | Received 29 Jun 2016, Accepted 07 Jul 2017, Published online: 10 Oct 2017

ABSTRACT

Scientists face increasing demands to integrate practices of ‘responsibility’ into their working lives. In this paper, we explore these developments by discussing findings from a research project that investigated how publicly funded scientists perceived and practiced responsibility. We show that, though the scientists in this study mostly viewed policy discourses such as Responsible Research and Innovation (RRI) as irrelevant to them, they articulated and practiced a range of ‘bottom-up’ responsibilities, including for producing sound science, taking care of employees, creating ‘impact’ and carrying out publicly legitimate science. The practice of these responsibilities was often shaped by wider dynamics in the governance of knowledge production, such as academic capitalism and the marketisation of universities. Based on these findings, we suggest that RRI scholarship should, first, work to develop a shared language of responsibility with scientists, and, second, more actively address the political context of contemporary scientific research.

1. Introduction

Scientists across the Western world have, over the last two decades, faced increasing calls to enhance the ‘responsibility’ of science and innovation. Broadly speaking, the hope expressed by science governance scholars is that through ‘responsibilisation’ the trajectories of emergent technologies can be nudged toward desired futures (Stilgoe 2013, 12). Such nudging might include scientists’ reflection on potential outcomes of their work, the integration of ethical assessment into science, policy responsiveness to emerging scientific uncertainties, and the inclusion of publics in decisions about the direction of particular research projects (Grunwald 2011; Owen, Macnaghten, and Stilgoe 2012). Discussions of responsibility in science have been particularly focused on new and emerging science and technology. In debates about nanotechnology, synthetic biology and geoengineering, science governance practitioners have proposed that these technologies offer an opportunity to move public discourse about new technologies from discussions of risk and safety to broader and more deliberative discussions of the direction of science and desirable outcomes (Wood, Jones, and Geldart 2007; RAE 2009; Yearley 2009).

Calls for responsibility have emerged in both policy and academic literature (Owen, Macnaghten, and Stilgoe 2012), and are visible in the use of terms such as ‘responsible development’ (Roco et al. 2011), the Responsible Care programme (Prakash 2000), ‘Responsible Innovation’ (Stilgoe, Owen, and Macnaghten 2013) or ‘Responsible Research and Innovation’ (RRI; von Schomberg 2013). It is the latter that we focus on in this paper. RRI is preoccupied with the establishment and enhancement of ‘responsibility’ through reflection, anticipation, engagement and responsiveness (Owen, Macnaghten, and Stilgoe 2012). It is centrally concerned with science and innovation as able to produce ‘socially desirable ends’ (Owen, Macnaghten, and Stilgoe 2012, 754), and views internal reflection and the ‘opening up’ of technoscience to public deliberation as key to this. As von Schomberg puts it, RRI is a:

… process by which societal actors and innovators become mutually responsive to each other with a view to the (ethical) acceptability, sustainability and societal desirability of the innovation process and its marketable products. (2013, 51)

At the international policy level, these concerns are visible in soft law initiatives by the EU and UN – such as the European Commission’s integration of RRI into its Horizon 2020 funding programme (de Saille 2015) – but are also expressed in national policies on scientific funding and development (Koops et al. 2015). Demands for responsibility are not articulated around a clear definition or precise problematique. However, several scholars suggest that demands for enhanced responsibility in science share an interest in combining economic growth with democratic accountability in the development of new knowledge and technologies (Swierstra and Rip 2007; Kearnes and Rip 2009).

Despite these calls for RRI, there has been rather less analysis of how it is being experienced and articulated in the daily activities of scientific practice, or of how scientists should transform RRI insights into specific forms of responsiveness. One important corpus of work examines a number of in vivo engagement exercises with scientists and engineers – projects which seek to integrate reflection on responsibility into daily laboratory practices by embedding scholars from the social sciences and humanities (Van der Burg and van Gorp 2005; Fisher 2007; de Jong et al. 2011; Schuurbiers 2011; Fisher et al. 2015; Balmer et al. 2016). Such projects shed light on the diverse social and ethical challenges that scientists face in daily work, and have had notable success in enhancing scientists’ reflection on the nature of that work (Schuurbiers 2011), but the literature about them tends to focus on the efforts of social scientists and humanities scholars to foster debate and reflection rather than on extant scientific practice. Recent empirical studies suggest that scientific engagement with ideas of social responsibility is often operationalised through concerns about implementation, sense-making and administration. McCarthy and Kelty (2010), for instance, found in a long-term ethnographic study that making responsibility ‘do-able’ was essentially a practical problem for scientists. How could they respond to calls for responsibility in a way that made sense to them, practically and professionally? More recently, Frankel (2015) has reported a survey of scientists, engineers and health professionals which showed a uniformly high commitment to different aspects of responsible scientific practice, but which revealed little about how scientists made sense of calls for enhanced responsibility. Similarly, Schuurbiers, Osseweijer, and Kinderlerer (2009) report, from the implementation of ethical codes of conduct in the Netherlands, that while informants were highly appreciative of the idea of research and work ethics in academia, they were not much in favour of written codes of conduct and were ambivalent about how to use them in practice.

This paper addresses the paucity of studies of scientists’ sense-making of calls for responsibility. It does so by describing the ways that scientists experience responsibility in practice and by discussing the relations between daily scientific life and contemporary political trends in science policy. It reports findings from a research project that investigated how publicly funded scientists working in fields of emerging technology, including synthetic biology and nanotechnology, perceived and practiced ‘responsibility’ in their daily work. On this basis we focus primarily on the ‘research’ aspect of RRI, addressing its ‘innovation’ aspect only to the degree that these publicly funded researchers considered innovation a relevant part of their work. We start by briefly introducing the project from which these results are drawn before discussing our key findings: we first describe how political discussions about RRI are considered irrelevant to the practice of science; second, suggest that, despite this, many forms of ‘bottom-up responsibility’ are immanent in the everyday work of scientists with regard to the process, organisation and outcomes of science; and third, reflect on the ways that top-down policies, and in particular the rise of ‘academic capitalism’, interact with these bottom-up responsibility practices. We then discuss what these findings mean for academic and policy thinking on responsibility and RRI.

2. Methods and context

This analysis draws on work carried out in the ‘Scientific Social Responsibility’ (SSR) research project. Funded by the Danish Free Research Council, and running between 2010 and 2015, the project explored the operationalisation of and meanings attributed to discourses of responsibility in emerging science and technology (for other results from this project, see Glerup and Horst 2014; Davies and Horst 2015a, 2015b; Glerup 2015).

The project explored meanings and practices of responsibility in three minority world national contexts: Denmark, the UK and the US. These countries were chosen with a concern for national diversity, combined with the intent to study countries often framed as ‘paradigmatic’ (Flyvbjerg 2006). While the three countries have different science governance cultures (Jasanoff 2005), they all have traditions relevant to the study of RRI practices. The US has a long tradition of large, strategic research programmes and informal debate about emergent technologies. In 1988 it formally introduced inclusion and outreach into the Human Genome Project in the form of ELSI (Ethical, Legal and Social Implications) programmes, a development that continued with funding for research on the social implications of nanotechnology under the National Nanotechnology Initiative in 2000, and with funding for engaging policy-makers, scientists and the public in the responsible development of synthetic biology through the multi-institutional initiative SynBERC in 2006. The UK has a tradition of technocratic governance of science and technology but has, after controversies such as the BSE outbreak in the mid-1990s and the conflict over genetically modified organisms (GMOs) at the turn of the millennium, initiated more deliberative and inclusive forms of science governance (Irwin 2006). Denmark, conversely, has traditionally had a focus on public participation in science governance, but since higher education reform in 2003 has downplayed these aspects in order to give room for science-industry partnerships and enhanced organisational competition between public research units (Mejlgaard 2009). Given that emerging technologies have been a focus for government initiatives that seek to foster ‘responsible’ developments (Fisher 2005), the research focused on scientists working in the fields of synthetic biology and nanotechnology. We expected these areas to be especially relevant cases for studying the meanings and practices attributed to responsibility (Flyvbjerg 2006).

A number of empirical engagements were used to study these meanings and practices. First, 29 interviews were conducted with research managers, PIs and group leaders working in nanotechnology and synthetic biology: 10 individuals in Denmark, 8 in the UK and 11 in the US. Interviews generally lasted about an hour, though some were longer (up to 150 minutes) and some shorter (45 minutes). A topic guide was used to structure the conversation: this included questions on participants’ research background and history, their relationships and responsibilities in their immediate work setting, and their awareness of, and responses to, broader policy discourses of responsibility. Further details can be found in Davies and Horst (2015b).

Second, an ethnography of laboratory work in public research laboratories was carried out in Denmark and the US (a suitable lab could not be recruited in the UK). The lab ethnography involved extended periods of participant observation: the Danish lab was studied over two periods in 2011 and 2012, for six and four weeks respectively, and the US lab through a 10-week period of engagement in 2012. Participant observation involved a researcher being embedded in the lab in a manner inspired by the Socio-Technical Integration Research protocol (Fisher 2007), alongside shadowing of different lab members and further interviews. Extensive field notes were taken throughout. Full details can be found in Glerup (2015).

Given that the question of what meanings and practices are attributed to responsibility by scientists drove all of the data collection, we treat this data as a single data set. This paper discusses the key themes that emerged across the data as we looked for patterns in the way scientists made sense of and practiced responsibility. These patterns were identified through a process of reading and re-reading field notes and interview transcripts, coding emergent themes and discussing the data within the project team (Silverman 2001). Themes discussed in this article are considered significant because they repeatedly emerged across the three different national contexts; they therefore suggest something about how ‘responsibility’ is being broadly conceptualised within international scientific practice. This qualitative analysis aims at interpreting patterns of discourse and practice from the point of view of how the scientists make sense of their own situation (Weick 1995). Our empirical ambition has been to understand what the world looks like to these scientists. Given this orientation, we have not focused on the number of articulations or observations of specific kinds; rather, working in the tradition of qualitative studies of meaning-making, we have identified recurring patterns of talk and action across sites.

Finally, it is important to note that all the empirical engagements took place in public laboratories, and that these operate under particular conditions, not least a demand for public legitimacy (Du Gay 2000). When we speak of generic qualities, we are, therefore, confined to talking about public research settings across the minority world contexts we have engaged with. Although RRI discourse explicitly references both research and innovation, and despite the fact that the lines between those practices are notoriously blurred (Nowotny 1999), most of our informants identified themselves and their responsibilities as relating to scientific research and to universities (rather than to innovation or industry). It is thus this aspect of RRI that is in focus.

3. Responsibility for the process of research

Our first key finding is that the majority of the scientists we spoke to, whether working at the bench as graduate students or technicians or as research managers and PIs, were not familiar with recent policy drives to inculcate ‘responsibility’ into science. Those who were aware of these developments tended not to foreground them in discussions of their roles, responsibilities and activities. Even when they were familiar with, for instance, the notion of RRI (as was particularly the case for synthetic biologists working in the UK, where the term has been integrated into research funding of the field; Lentzos 2009), this tended to come at the end of a rather long list of other responsibilities and aspects of their jobs as scientists. Ideas about RRI were, therefore, a low priority in the everyday practice of science.

This lack of awareness is not surprising. Other research has found that nanoscientists’ familiarity with the European Commission’s Code of Conduct for Responsible Nanosciences and Nanotechnologies Research was similarly low (Kjølberg and Strand 2011). Many of the scientists in this study simply did not see what appeared to be rather abstract policy discussion as relevant to them, particularly in a context where there were many other pressing and urgent concerns (getting results and writing the thesis for graduate students; securing funding for more senior scientists; managing professional relationships within and beyond the group for all). For instance, we asked one Danish nanoscientist, Niels, if he had heard about policy discussion of the social responsibility of science, mentioning the EC Code of Conduct as a nanotechnology-specific example. No, he said, he was not ‘involved in this policy making and to be honest, it’s nothing that interests me’. He was not surprised that he had not come across these discussions; maybe it had been sent to him, but he had not had time to engage with it:

I think that’s the big problem of these policies, I mean [often you receive them] as email attachment and then you have 100 pages to read but I, in this case I’m pretty sure I never had anything and maybe it’s uploaded somewhere but I never heard it before and it’s not something that is discussed.

Niels was not suggesting that ethics, responsible practice or codes of conduct were unimportant in general. He was, in fact, active in running courses for PhD students on the responsible conduct of research, and he noted that he was quite used to filling in the ethics section required by EU funding; this, he said, was ‘easy to fill out because it’s always no, so there are no ethical implications’. He thought that ethics training was important, but that it was simply not relevant to his own current research.

A similar sense of the irrelevance of responsibility came through in the ethnographic aspects of the study. It quickly became clear that it was counterproductive to talk specifically about ‘responsibility’ in conversation in the lab: informants were not familiar with the term’s use in the context of science policy, and introducing it confused them. ‘As I’m sure you will find out’, one scientist said in an early empirical engagement, ‘nothing really responsible goes on here’. Our interpretation of this remark is that he in fact meant the opposite – that their scientific projects were very responsible. He considered his own and his colleagues’ work of such quality that they simply did not need to talk about ‘responsibility’ and how to enhance it. The term was irrelevant because what they did was not problematic at all, and there was thus no need for the lab to engage with policy debates about responsibility, ethics or integrity. Terms such as ‘responsible development’ or ‘responsible research and innovation’ were thus alienating both because they were unfamiliar and because they appeared to be irrelevant. Their connotations were negative, and their introduction into conversation necessitated explanation and often resulted in defensiveness. Informants might explain that being ‘responsible’ demanded too much work and bureaucracy, and that they did not have time for that. As with Niels, there was a sense that these were rather meaningless external demands being applied to sites with existing cultures of sufficiently high ethical standards.

At the same time, we repeatedly observed or were told about practices that we would view as relating to responsibility, although the scientists would not describe them in these terms. What we observed was a set of ‘bottom-up’ practices of responsibility: practices that rarely used the language of responsibility or ethics but which nonetheless involved efforts to ensure the production of ‘good science’ – meaning science which was morally robust as well as technically excellent. For instance, all scientists in the study strongly identified with a vocational responsibility for doing what they considered to be good science. When asked to define what they meant by ‘good’, their descriptions resembled earlier accounts of the scientific vocation, as described, for instance, by Weber ([1917] 2004) and Merton (1973), or of the ‘role responsibility’ described by Douglas (2003). They spoke about virtues such as objectivity, criticism and collegial community – echoing Merton’s CUDOS norms of Communalism, Universalism, Disinterestedness and Organised Skepticism (1973).

We observed, for instance, that scientists critically scrutinised their own work and that of their colleagues in order to ensure that experiments lived up to their ideas of excellent science, with every step of experimental processes rigorously controlled. Significantly, they said that they did so in order to live up to public science’s reputation as a profession which delivers trustworthy results to the rest of society. As one scientist said concerning science’s responsibility in relation to other public bodies:

It’s the state’s responsibility to legislate in an area, and it’s the Council of Ethics’ responsibility to keep a check on the ethics and it’s the Ministry of Science’s responsibility to take care of the economic aspect of public science and then it’s my responsibility to actually do the science.

These scientists believed that their responsibility was to ‘actually do the science’, and to do it in meticulous ways that lived up to their professional standards. Vocational responsibility was thus closely connected to the professional scientific community as a whole, rather than relating only to the practices of a specific research organisation or unit. It was seen as a responsibility centred on the process of doing science well, and as relating to the things that a scientist is supposed to do in order to carry out ‘proper’ science. The logic seems to be that without great care in the research process, the profession cannot deliver trustworthy results to wider society, and so the profession’s role in society is lost. On this basis, we argue that a central ‘bottom-up’ practice of responsibility in science centres on ensuring a robust scientific process. This may not be surprising, given that Weber and other prominent sociologists have described this scientific ethos in detail (Weber [1917] 2004; Merton 1973). But we think it is an important point for us and our fellow RRI scholars. Given our interest in the outcomes generated by scientific innovation, we may tend to overlook the enormous amount of work that goes into securing robust scientific practices. This is work that scientists are proud of and consider their core professional responsibility, and it is important that we recognise it as an aspect of RRI.

4. Responsibility for the organisation of research

Responsibility for the process of doing science was not the only responsibility operating in the daily work life of scientists. We also observed a range of practices focused on the outcomes of scientific work at both the organisational and the societal level. It was clear, for instance, that scientists were deeply concerned with the maintenance and support of their organisation, at both the micro-level (their group or department) and the larger scale (their university or science itself). One example is the importance of ideas about ‘care’, ‘looking after’ and nurture that emerged from interviews with group leaders and PIs. These research managers argued that they had two key concerns in their work: looking after science, such that it is technically excellent, reliable and scientifically meaningful; and looking after people, which took the form of caring for the research group. Significantly, the former was dependent on the latter: well-cared-for groups would produce good science. A large part of the PI role was, therefore, oriented towards ensuring happy research groups, sustainable careers, and mature and independent scholars. As such, PIs were clear that a large part of their role involved a suite of affective and interpersonal skills, as well as practical concerns such as ensuring that their students or post-docs were funded or had good career options. As one of the PIs put it, they wanted ‘decent persons’ as an outcome of the lab’s PhD education.

There was also a sense of responsibility to the organisation, be it the research group, a department or the entire university. This bottom-up responsibility practice came through as a recurrent concern for carefully managing resources – people, machines, money – in order to secure the survival of the entire research unit over the long term. Through the ethnographic studies, it became clear that this was something everybody participated in, involving tasks ranging from the individual re-use of pipettes and plastic containers in order to save money to the ongoing collective work of writing grant applications. Managers, in particular, often reflected on how a good combination of ideas, people and machines would enable a strong organisation. One of the managers compared this responsibility to that of hosting a party:

You’ve got to hope that some of the people you have hired have the ability to make things work. I mean, for instance, if I’m throwing a party, then I also have to rely on that they [the guests] will make a good melting pot that works and that the party turns into a cosy place (…) then there’s also a bigger chance that your impact will be much greater.

This responsibility took up a considerable amount of time for scientists, and they were not uniformly keen on all aspects of it. While they reported finding it both satisfying and interesting to take care of junior researchers, and to think about how to make a group of people work together in the best possible way, they also told us that grant application writing and political lobbying for their field of research took up too much of their time (see also Davies and Horst 2015b; Glerup 2015). Such activities were distractions from their core responsibility, that of doing thorough science. But exactly because they cared about doing such science, and wanted to ensure that their colleagues were able to do it too, these more burdensome tasks were also responsibilities that they took upon themselves.

5. Responsibility for the public legitimacy of research

A final responsibility that scientists referred to was that of working within the limits of what (they saw as) publicly legitimate research. In discussion, scientists tended to acknowledge that the public has a democratic right to an opinion on research policy, and that these opinions should influence priority-setting in public research. As such, they generally agreed that they had, broadly, a duty to engage with the public in order to demonstrate public accountability, but also to negotiate what legitimate science is and should be. Many of the scientists we interacted with had, in one way or another, participated in public debate about developments related to their own discipline or technological area. The American lab studied in the ethnographic work, for instance, was involved in research that aimed to develop new diagnostic devices, and had experimented with direct public participation in this research. The scientists in this lab had invited test subjects to comment on the design and general idea of their technology, and had adjusted their work accordingly. One of the scientists commented on what they had learned as follows:

We had this lady, who told us that she was so afraid of needles … And that was when we started thinking about doing saliva tests instead. And it turns out that blood and saliva actually have a lot of the same antibodies, so we could use saliva for the device.

While the scientist here makes this adjustment – a form of responsiveness – sound quite easy, it actually took a great deal of work to create a saliva test that worked as well as a blood test. Nevertheless, based on feedback from a member of the public, the scientists undertook such work. This kind of direct public involvement was, however, rare. Scientists usually engaged with members of the public in more well-established ways: they might take part in panel debates about the future of their research area, invite the public to ‘public laboratory events’, or write debate pieces in the newspapers. In our view, such contributions should be understood as genuine attempts to perform accountability to wider society. Scientists described to us how, as a result of participating in these engagements, they tried to listen and adjust to public views, even if they found some of them ignorant or ill-founded. One research manager described how his lab voluntarily grew GM crops in closed tanks, even though it was legal to grow them in open fields. He did not think this a necessary environmental precaution, but he accepted that the public felt otherwise and acted accordingly. Scientists viewed such public influence on scientific practice as legitimate because universities are publicly funded and, therefore, have to be accountable to tax-payers. As Linda, a Danish senior scientist, put it:

So, I’m actually saying that, in one way or another – both in education and in research – society is an important actor for us. And if we, as a university, say, ‘we have nothing to do with you people on the other side’, then we have failed completely […] because it is tax money that sustains us.

The data thus demonstrate practices and concerns which indicate responsiveness (a central aspect of RRI; Owen, Macnaghten, and Stilgoe 2012), or what Schuurbiers (2011) has called ‘second-order reflections’ on the system of science. While the researchers we engaged with might insist, at one moment, that ‘responsibility’ was not relevant to their work, they would almost in the same breath describe aspects of their activities that seemed highly socially engaged. They participated in various ‘outreach’ activities for public audiences; were concerned about global problems such as hunger and infectious diseases, and tried to find solutions to these through their research; and embarked on informal political projects that criticised big business and current government. One leading scientist in the study had, for instance, sworn never to patent his work. He collaborated with a local hacker space and wrote debate pieces about synthetic biology – one with the headline ‘Builders of the World, Unite’, echoing the Communist Manifesto. Another preferred to build his own start-ups rather than collaborate with the pharmaceutical industry, because he was afraid that otherwise his inventions would never reach the public. Without using the terms of RRI discourse, our participants were continuously demonstrating reflexivity (or reflection ‘on underlying purposes, motivations and potential impacts’; Owen, Macnaghten, and Stilgoe 2012, 755); a concern with the social dimensions of their work (and the sense that it should be directed ‘towards socially desirable ends’; Owen, Macnaghten, and Stilgoe 2012, 754); and an openness to public engagement and debate (or ‘inclusive deliberation concerning the direction of travel for science and innovation’; Owen, Macnaghten, and Stilgoe 2012, 754) – all characteristics, as the quotes indicate, encouraged by frameworks for RRI. A key finding is thus that many practices of responsiveness and accountability are described within science, even if they are not usually recognised as such by the scientists themselves. Further, there are significant similarities between these practices and the concepts (if not the terminology) of the RRI literature.

6. Academic capitalism and the practice of responsibility

What we have suggested thus far is that while scientists are largely unaware of the language of RRI and other soft law means of enforcing ‘responsibility’, they mobilise a range of bottom-up practices of responsibility – from a commitment to meticulous science to an interest in engaging with public stakeholders and users of research – that in some instances resemble central dimensions of RRI. In this final empirical section, we want to develop the story further by suggesting that while scientists and research managers may claim ignorance of current research policy – they might, like the scientist quoted in Section 3, say ‘it’s nothing that interests me’ – they simultaneously provide numerous examples of how, in practice, their activities are shaped by such policy. Research policy (though not perhaps RRI) implicitly constitutes the ways in which ‘bottom-up’ responsibilities may be performed.

One important contemporary policy trend is the increased expectation that universities and other research organisations should operate in a wider marketplace of research, teaching and knowledge production. In different ways, multiple STS and research policy scholars have argued that the political economy of scientific research is shifting to become more entwined with business (Etzkowitz et al. 2000) and increasingly globalised (Slaughter and Leslie 1997) and marketised (Meek 2000). These developments have been summarised as the rise of ‘academic capitalism’ (Hackett 2014). Capital, writes Ed Hackett,

 … constructs and equips laboratories, supports research and researchers (sometimes directly with personal or corporate funds), bestows ‘gifts’ that smuggle donors’ demands and desires into programs of education and research, and enriches university endowments. More subtly, capital shapes research agendas by raising alarm about gathering storms of offshore competitors and by concentrating research on topics that enhance economic performance and national security, while steering funds away from those that deepen fundamental knowledge or enhance human capabilities and well-being. (2014, 636)

These dynamics were apparent in the experiences of our interviewees and informants. We have already noted, for instance, the constant pressure that many research managers and PIs felt themselves to be under to secure further funding for themselves, their groups and their research. This manifested in descriptions of their roles as largely involving work outside of the lab: liaising with funders, managing human resources, writing and negotiating funding applications. It was also apparent in the way in which they at times spoke about tensions between what they saw as robust science and the intensely competitive environment they were operating in. ‘You have to be careful’, one PI said, to strike ‘a balance’, so that you are actively ‘selling’ your science while at the same time making sure that ‘it’s not just wild ideas that would sound sexy to somebody who would give me money’. Care is required exactly because there are others out there, competing for the same funds, who might not be so concerned about their ‘wild ideas that … sound sexy’ being viable. Part of the craftwork of research leadership, then, was portrayed as the ability to operate in an environment of academic capitalism whilst not compromising important vocational responsibilities, such as those described in Section 3.

More specifically, we might note the effects of a widespread turn towards ‘impact’ on the part of governments and other research funders. Hackett gestures towards this, in the extract above, by noting that research funds are increasingly focused on areas ‘that enhance economic performance and national security’; more specifically, this has been articulated through demands that research should have clearly identifiable (and often quantifiable) scientific and societal impacts (Donovan 2011; Bornmann 2013).Footnote7 Overall, this focus on impact was one policy development that was extremely visible to scientists, perhaps because it concretely affected how they wrote their grant proposals. This, for instance, is the US researcher Rita talking about how she develops NSF projects:

… a portion of the proposal that is judged is the broader impacts of the work, and so you have to be able to describe and show how you’re going to […] educate, how you’re going to impact society in a positive way, how it’s going to benefit society. And it can’t just be, we’re going to conduct the research, and therefore create new knowledge and we’re going to publish papers. That’s not enough.

For Rita and other scientists involved in developing research proposals, the need for impact beyond scientific excellence was a tangible force in shaping their research. They knew about impact as a particular kind of terminology, and were able to mobilise this language as they designed their projects. The degree of cynicism with which they did this varied. Some – generally those whose work could easily be understood as ‘impactful’ – were enthusiastic about calls for impact, viewing this as a means of ensuring the forms of accountability and public legitimacy described in Section 5. Others saw it as another request to make their research ‘sexy’ in a particular kind of way. One highly successful UK group leader, Victor, was explicit that use of this language was, for him, simply a particular kind of ‘spin’. He was, he told us, happy to use the word impact ‘100 times in a grant proposal’ if that was what was required; ultimately, though, these demands were ‘just fads and fashions’. ‘I can play the game’, he said, ‘but my honest opinion is it’s stupid’. A similar dynamic is seen in the tension, portrayed to us, between ‘basic’ and ‘applied’ research. Many scientists argued for the value of basic research. But the arguments they used in defence of this research were often about as-yet-unknown applications and future technologies, thereby reinforcing the impression that basic science indeed needed a defence: that it would, one day, be applicable and lead to concrete innovations.

In all of these ways, the responsibility practices of scientific research are potentially being shaped by particular policy decisions and developments. It is therefore not correct to understand the comment of Niels – the Danish nanoscientist quoted in Section 3 – that ‘this policy making’ is of ‘no interest’ to him as meaning that it does not concern him. He might not find it interesting, but it still has consequences for him. In a similar fashion, we would argue that although Victor (quoted above) thinks it is ‘stupid’, he is reinforcing ‘the game’ by playing it. Scientific practice is being shaped by political dynamics; and policy decisions, whether at the level of international actors, national governments or specific research funders, are having an effect on the everyday life of responsibility practices in science. But programmes such as RRI, responsible development, or other forms of technology assessment do not necessarily mediate the dynamics between the political and organisational levels. Rather, it seems to be the trends of marketisation and academic capitalism that play the biggest role in how scientists experience and are able to perform their responsibilities (cf. Martin 2011).

7. Discussion

As discussed in the methodology section, our results are based on the experiences of publically funded scientists in specific countries. It is not clear what they will mean for other contexts. However, our findings identify ‘bottom-up’ responsibilities that have tended to be overlooked or under-reported by other RRI scholars – perhaps because most RRI studies are case studies in which RRI or similar tools have been used actively as a way of enhancing social responsibility (e.g. Schuurbiers 2011; Owen, Macnaghten, and Stilgoe 2012; Koops et al. 2015). We can also testify that, despite scientists’ lack of awareness of research policy, their practices are evidently shaped by trends towards the marketisation of, and heightened competition within, academic work. At least two key issues stem from these findings.

The first concerns the wider relation between policy articulations of RRI and ‘bottom-up’ descriptions of the responsibilities of scientific life. In Sections 3–5, we suggested that we can identify a range of different forms of ‘bottom-up’ responsibilities – practices of caring for data, for organisations and individuals, and for the legitimacy of science – that were rarely framed as ways of performing ‘responsibility’ but which were nevertheless seen as essential to ‘good’, morally robust science. We also noted that many of these practices are similar to formal articulations of RRI or related terms (e.g. Owen, Macnaghten, and Stilgoe 2012; von Schomberg 2013; Guston et al. 2014, 3). We might say, then, that many scientists in our study are de facto participants in the RRI agenda. The key challenge to this view is that at the same time scientists were generally ignorant of formal discussions of RRI, and at times actively hostile to what they saw as a potentially alienating policy discourse. Talk of ‘responsibility’ was often read as an accusation, rather than as an extension of existing, rather mundane, practices of everyday scientific life. To these scientists, RRI and related policy agendas appeared to be imposed by actors outside of science who do not necessarily understand the demands of scientific practice. And yet, as we showed in Section 6, our findings suggest that research policy (even if this does not yet extend to RRI in particular) shapes scientific practices of responsibility in multiple, and perhaps even some of the most salient, ways. That said, RRI also risks being subsumed under the general agenda of ‘impact’, meaning that it is likely to be viewed as merely another bureaucratic layer of difficulty impeding scientists’ ability to secure the necessary resources and permissions to do science.

Accordingly, as scholars of RRI, one project for our own research and reflection might be to assess how the language of ‘bottom-up’ practices of responsibility, and indeed those practices themselves, might be better integrated with our work and with policy promoting RRI. We need to ensure that the promotion of responsibility does not alienate those working in science and innovation, but rather has traction on their lived experiences, concerns and terminologies. We perhaps also need to ask some hard questions about what we are demanding of those scientists, technologists and engineers. Spruit, Hoople, and Rolfe (2015) have demonstrated that the agency of those working in science is experienced as severely limited, and that they have little control over wider pathways of innovation. To talk of individual responsibility, they argue, is largely meaningless. How meaningful is public deliberation in the context of basic science, and how relevant is RRI’s focus on socially robust outcomes in the context of mundane scientific practice? What can an emphasis on the immediacy of the research environment – as in ‘midstream modulation’ (Fisher, Mahajan, and Mitcham 2006) – do about the wider structural conditions under which science now operates? Historically, it is far from clear that ‘top-down’ efforts at ‘responsibilisation’ are straightforwardly accepted by communities of practice (Mitcham 2003). It is therefore important that we reflect not only on the exact characteristics of RRI, but on the very terms on which it is framed and promoted.

We have three concrete suggestions for further developing shared interpretations and practices of responsibility among RRI scholars and scientists. First, programmes for and discussions of RRI should take as their point of departure existing practices of responsibility within the laboratories and research and innovation practices they are aimed at. If scientists are to find RRI meaningful, then discussion of it must be grounded in (though not limited by) their language and experiences. One recent example of such an approach uses the concept of life cycle analysis (LCA) as a means of operationalising RRI within industry (Wender et al. 2014). Second, RRI needs to be anchored in the practical organisation of mundane scientific work. If RRI is not to be seen as an additional burden or imposition, it has to be acknowledged that any kind of scientific reflection or wider engagement takes planning, time and skills. RRI thus needs to be embedded within systems through which the labour it entails can be acknowledged and valued. At the moment, academic value systems do not reward participation in activities such as public engagement (Watermeyer 2015; Felt et al. 2016): policy promotion of RRI needs to find ways of overcoming these structural limitations. Finally, it seems clear that the most fruitful approaches to developing and ensuring widespread use of RRI methods occur through close collaborations between natural and social scientists, and over extended periods of engagement (as with the STIR project: Fisher and Schuurbiers 2013). Projects such as that described in Spruit, Hoople, and Rolfe (2015), in which a detailed analysis of what ‘responsibility’ should mean for scientific practice was developed through long-term collaboration and reflection between an ethicist and a group of engineers, offer an important way forward for ensuring broader uptake of RRI.

The second key issue concerns RRI and its relation to other trends in the governance of universities and knowledge production. As discussed in Section 6, practices directly relating to RRI are not integrated into science to the same extent as work tasks relating to what we have summarised as academic capitalism. Indeed, we would suggest that the structures of academic capitalism might to a large degree structure or overtake practices of responsibility. A lab’s survival is increasingly dependent on external funding, competition and marketable results: fitting in the task of ensuring democratic accountability and responsiveness, something that takes time and preparation without bringing tangible rewards in the form of citations or funding, is unlikely to be a priority. RRI may itself be reduced to a tool that enables commercialisation by using strategies such as public participation (cf. Kearnes and Rip 2009). Consider, for instance, the story in Section 5 in which a lab changed a diagnostic device from being based on blood to being based on saliva after public feedback. While the scientists were proud to present this as an example of responsiveness to public views, it could also be read as an early usability test in preparing a product for a market. In such contexts, practices of responsibility may easily be co-opted into dominant agendas of commercialisation. Both academic capitalism and RRI are oriented towards outcomes, and towards the purposeful direction of research and innovation to particular ends (broadly, economic productivity or democratically agreed priorities); they may, then, have more in common than may at first appear to be the case.

We therefore believe that scholars of RRI need to reflect more on its interaction with the structures of contemporary knowledge production. Suggesting that scientists take on additional responsibility, or that they require more oversight, without considering these wider dynamics may have adverse effects. Our central recommendation is that RRI research should develop a stronger stance vis-à-vis academic capitalism and, in particular, the ways in which RRI policy interacts with pressures for innovation, economic benefit and commercialisation. It is often argued that RRI aims for research that results in innovations that are both economically productive and socially robust (see de Saille 2015). What happens when those aims conflict, how such conflict is experienced at the level of bench research, and what other forces are pushing for one or the other all deserve serious and ongoing consideration from scientists, social researchers and policy-makers.

8. Conclusion

In this paper, we have argued that a range of ‘bottom-up’ responsibilities are articulated and practiced among scientists working on new and emerging technologies. While scientists mostly viewed policy discourses such as RRI as irrelevant or meaningless in relation to their daily work, this was not necessarily because they saw these ideas as contradicting the classic scientific ethos. Rather, they framed themselves as having so many daily responsibilities that taking on more was problematic. But while they at one moment underlined that they did not have time for ‘extra’ responsibility, they would almost in the same sentence describe projects that seemed highly socially engaged. We showed that scientists described responsibilities for producing sound science, taking care of employees, handling the lab’s financial situation, creating ‘impact’, and producing science that is legitimate in the eyes of the public. While some of these responsibilities resonate well with RRI discourse, the practice of science is also being shaped by academic capitalism and the marketisation of universities. Based on these findings we have made two recommendations: that RRI scholars such as ourselves become better at developing a shared language of responsibility with scientists; and that RRI scholars more actively consider the contemporary political situation at modern universities, where time for outreach and reflection competes with time for making industrial partnerships and patents. As it is, the responsibilities that stem from academic capitalism currently command much more of scientists’ attention than those pertaining to socially robust research and innovation.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes on contributors

Cecilie Glerup holds a PhD in Sociology of Science from Copenhagen Business School. Her research interests primarily lie in the organisational effects of public sector reforms – especially in public science. She is currently employed as a postdoc at the Department of Food and Resource Economics, where she researches public and expert perceptions of synthetic biology vaccines for farm animals.

Sarah R. Davies is Associate Professor at the Department of Media, Cognition and Communication at the University of Copenhagen. Her research focuses on the theory and practice of science communication, public engagement and deliberation on science, and the governance of new and emerging technologies.

Maja Horst is Professor and Head of the Department of Media, Cognition and Communication at the University of Copenhagen. She is broadly interested in the complex interplay between science and wider society and has published extensively in the areas of science communication, public participation, governance of emergent technologies and research management.

Additional information

Funding

This research was funded by the Independent Research Fund Denmark - Social Sciences [grant number 0602-01038B].

Notes

1. There is, of course, a long history of promoting social responsibility in science, from post-war efforts to turn science to social use to 1970s radical science (see, e.g. Kleinman 1995; Martin 1993). But it seems clear that there has been a recent intensification of (external) calls for responsibilisation (de Saille 2015).

2. Such a use of mixed methods has a long tradition within STS. We are particularly inspired by Law (1994), Knorr Cetina (1999) and Hackett (2005) in mixing ethnography with interviews.

3. In order to increase transparency, we have made references to previous publications where a particular aspect of this analysis is unfolded in more detail.

4. All names have been changed.

5. Of course, just because informants report no ethical issues does not mean that nothing that others – including RRI scholars – would class as an ethical issue is present. As described in Section 2, our priority is to understand the lifeworlds of our informants rather than to arbitrate as to whether what they say is ‘real’ or not (cf. Weick 1995).

6. A concern that has also been reported elsewhere; see, for instance, Fisher (2007) and Flipse, van der Sanden, and Osseweijer (2012).

7. In the UK, for instance, applications to Research Councils UK (the central public funding body for academic research) must include a statement relating to the research’s ‘Pathways to Impact’. In the US, the NSF is similarly concerned with examples of the ‘Broader Impacts’ of funded research (Holbrook Citation2012). The language of ‘impact’ is not present in the Danish funding system in the same unified way, though there are similar dynamics around demands for research that is economically and socially beneficial as in the UK and the US. This is visible in the growing focus on science’s potential for ‘innovation’ and for creating jobs. Some public funding has become conditional on ‘partnerships’ with industry – for instance in the newly established Innovation Fund Denmark.

References

  • Balmer, Andrew S., Jane Calvert, Claire Marris, Susan Molyneux-Hodgson, Emma Frow, Matthew Kearnes, Kate Bulpin, Pablo Schyfter, Adrian Mackenzie, and Paul Martin. 2016. “Five Rules of Thumb for Post-ELSI Interdisciplinary Collaborations.” Journal of Responsible Innovation 3: 73–80. doi: 10.1080/23299460.2016.1177867
  • Bornmann, Lutz. 2013. “What Is Societal Impact of Research and How Can It Be Assessed? A Literature Survey.” Journal of the American Society for Information Science and Technology 64 (2): 217–233. doi:10.1002/asi.22803.
  • Davies, S. R., and M. Horst. 2015a. “Crafting the Group: Care in Research Management.” Social Studies of Science 45 (3): 371–393. doi: 10.1177/0306312715585820
  • Davies, S. R., and M. Horst. 2015b. “Responsible Innovation in the US, UK and Denmark: Governance Landscapes.” In Responsible Innovation 2, edited by B.-J. Koops, I. Oosterlaken, H. Romijn, T. Swierstra, and J. van den Hoven, 37–56. Springer International Publishing. Accessed 6 May 2015. http://link.springer.com/chapter/10.1007/978-3-319-17308-5_3.
  • Donovan, Claire. 2011. “State of the Art in Assessing Research Impact: Introduction to a Special Issue.” Research Evaluation 20 (3): 175–179. http://rev.oxfordjournals.org/content/20/3/175.short. doi: 10.3152/095820211X13118583635918
  • Douglas, H. E. 2003. “The Moral Responsibilities of Scientists (Tensions Between Autonomy and Responsibility).” American Philosophical Quarterly. http://www.jstor.org/stable/20010097.
  • Du Gay, P. 2000. In Praise of Bureaucracy. London: Sage.
  • Etzkowitz, H., A. Webster, C. Gebhardt, and B. R. C. Terra. 2000. “The Future of the University and the University of the Future: Evolution of Ivory Tower to Entrepreneurial Paradigm.” Research Policy 29 (2): 313–330. http://www.sciencedirect.com/science/article/pii/S0048733399000694. doi: 10.1016/S0048-7333(99)00069-4
  • Felt, Ulrike, Judith Igelsböck, Andrea Schikowitz, and Thomas Völker. 2016. “Transdisciplinary Sustainability Research in Practice Between Imaginaries of Collective Experimentation and Entrenched Academic Value Orders.” Science, Technology & Human Values. doi:10.1177/0162243915626989.
  • Fisher, Erik. 2005. “Lessons Learned from the Ethical, Legal and Social Implications Program (ELSI): Planning Societal Implications Research for the National Nanotechnology Program.” Technology in Society 27 (3): 321–328. doi: 10.1016/j.techsoc.2005.04.006
  • Fisher, Erik. 2007. “Ethnographic Invention: Probing the Capacity of Laboratory Decisions.” NanoEthics 1 (2): 155–165. doi: 10.1007/s11569-007-0016-5
  • Fisher, E., R. L. Mahajan, and C. Mitcham. 2006. “Midstream Modulation of Technology: Governance from Within.” Bulletin of Science, Technology and Society 26 (6): 485–496. doi: 10.1177/0270467606295402
  • Fisher, Erik, Michael O’Rourke, Robert Evans, Eric B. Kennedy, Michael E. Gorman, and Thomas P. Seager. 2015. “Mapping the Integrative Field: Taking Stock of Socio-technical Collaborations.” Journal of Responsible Innovation 2 (1): 39–61. doi: 10.1080/23299460.2014.1001671
  • Fisher, Erik, and Daan Schuurbiers. 2013. “Socio-technical Integration Research: Collaborative Inquiry at the Midstream of Research and Development.” In Early Engagement and New Technologies: Opening up the Laboratory, 97–110. Dordrecht: Springer.
  • Flipse, S. M., M. C. A. van der Sanden, and P. Osseweijer. 2012. “Midstream Modulation in Biotechnology Industry: Redefining What is ‘Part of the Job’ of Researchers in Industry.” Science and Engineering Ethics 19 (3): 1141–1164. doi: 10.1007/s11948-012-9411-6
  • Flyvbjerg, Bent. 2006. “Five Misunderstandings about Case-study Research.” Qualitative Inquiry 12 (2): 219–245. doi: 10.1177/1077800405284363
  • Frankel, Mark S. 2015. “An Empirical Exploration of Scientists’ Social Responsibilities.” Journal of Responsible Innovation 2 (3): 301–310. doi:10.1080/23299460.2015.1096737.
  • Glerup, Cecilie. 2015. “Organizing Science in Society: The Conduct and Justifications of Responsible Research.” (PhD Dissertation). Copenhagen Business School, Copenhagen.
  • Glerup, Cecilie, and Maja Horst. 2014. “Mapping ‘Social Responsibility’ in Science.” Journal of Responsible Innovation 1 (1): 31–50. doi: 10.1080/23299460.2014.882077
  • Grunwald, Armin. 2011. “Responsible Innovation: Bringing Together Technology Assessment, Applied Ethics, and STS Research.” Enterprise and Work Innovation Studies 7 (Jan.): 9–31. http://run.unl.pt/handle/10362/7944.
  • Guston, David H, Erik Fisher, Armin Grunwald, Richard Owen, Tsjalling Swierstra, and Simone van der Burg. 2014. “Responsible Innovation: Motivations for a New Journal.” Journal of Responsible Innovation 1 (1): 1–8. doi:10.1080/23299460.2014.885175.
  • Hackett, E. J. 2005. “Essential Tensions: Identity, Control, and Risk in Research.” Social Studies of Science 35 (5): 787–826. doi: 10.1177/0306312705056045
  • Hackett, E. J. 2014. “Academic Capitalism.” Science, Technology & Human Values 39 (5): 635–638. doi: 10.1177/0162243914540219
  • Holbrook, J. Britt. 2012. “Re-assessing the Science – Society Relation: The Case of the US National Science Foundation’s Broader Impacts Merit Review Criterion (1997–2011).” Technology in Society 27: 437–451. doi:10.1016/j.techsoc.2005.08.001.
  • Irwin, A. 2006. “The Politics of Talk: Coming to Terms with the ‘New’ Scientific Governance.” Social Studies of Science 36 (2): 299–320. http://sss.sagepub.com/content/36/2/299.short.
  • Jasanoff, S. 2005. Designs on Nature: Science and Democracy in Europe and the United States. Princeton, NJ: Princeton University Press.
  • de Jong S., K. Barker, D. Cox, D. Person, and P. van den Besselaar. 2011. Societal Impact of Enabling Research Fields: ICT Research: A Dutch and a UK Case. Proceedings of European Network of Indicator Designers Conference, Rome.
  • Kearnes, Matthew, and Arie Rip. 2009. “The Emerging Governance Landscape of Nanotechnology.” In Jenseits Von Regulierung: Zum Politischen Umgang Mit Der Nanotechnologie, 97–121. Berlin: Akademische Verlagsgesellschaft.
  • Kjølberg, Kamilla Lein, and Roger Strand. 2011. “Conversations about Responsible Nanoresearch.” NanoEthics 5 (1): 99–113. doi:10.1007/s11569-011-0114-2.
  • Kleinman, D. L. 1995. Politics on the Endless Frontier: Postwar Research Policy in the United States. Durham: Duke University Press.
  • Knorr Cetina, K. 1999. Epistemic Cultures: How the Sciences Make Knowledge. Cambridge, MA: Harvard University Press.
  • Koops, Bert-Jaap, Ilse Oosterlaken, Henny Romijn, Tsjalling Swierstra, and Jeroen van den Hoven. 2015. Responsible Innovation 2: Concepts, Approaches, and Applications. Springer. https://books.google.dk/books?id=ZkjmCAAAQBAJ.
  • Law, John. 1994. Organizing Modernity. Oxford: Blackwell.
  • Lentzos, Filippa. 2009. “Synthetic Biology in the Social Context: The UK Debate to Date.” Biosocieties 4 (2–3): 303–315. doi:10.1017/S1745855209990172.
  • Martin, B. 1993. “The Critique of Science Becomes Academic.” Science, Technology, & Human Values 18 (2): 247–259. doi: 10.1177/016224399301800208
  • Martin, B. R. 2011. “The Research Excellence Framework and the ‘Impact Agenda’: Are We Creating a Frankenstein Monster?” Research Evaluation 20 (3): 247–254. doi: 10.3152/095820211X13118583635693
  • McCarthy, E., and C. Kelty. 2010. “Responsibility and Nanotechnology.” Social Studies of Science 40 (3): 405–432. doi:10.1177/0306312709351762.
  • Meek, V. Lynn. 2000. “Diversity and Marketisation of Higher Education: Incompatible Concepts?” Higher Education Policy 13 (1): 23–39. doi: 10.1016/S0952-8733(99)00030-6
  • Mejlgaard, Niels. 2009. “The Trajectory of Scientific Citizenship in Denmark: Changing Balances Between Public Competence and Public Participation.” Science and Public Policy 36 (6): 483–496. doi: 10.3152/030234209X460962
  • Merton, Robert K. 1973. The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press.
  • Mitcham, C. 2003. “Co-responsibility for Research Integrity.” Science and Engineering Ethics 9 (2): 273–290. doi: 10.1007/s11948-003-0014-0
  • Nowotny, Helga. 1999. “The Place of People in our Knowledge.” European Review 7 (2): 247–262. doi: 10.1017/S1062798700004026
  • Owen, Richard, Phil Macnaghten, and Jack Stilgoe. 2012. “Responsible Research and Innovation: From Science in Society to Science for Society, with Society.” Science and Public Policy 39 (6): 751–760. doi:10.1093/scipol/scs093.
  • Prakash, Aseem. 2000. “Responsible Care: An Assessment.” Business & Society 39 (2): 183–209. doi:10.1177/000765030003900204.
  • Roco, Mihail C, Barbara Harthorn, David Guston, and Philip Shapira. 2011. “Innovative and Responsible Governance of Nanotechnology for Societal Development.” Journal of Nanoparticle Research 13 (Sept.): 3557–3590. doi:10.1007/s11051-011-0454-4.
  • The Royal Academy of Engineering. 2009. Synthetic Biology: Scope, Applications and Implications. London: The Royal Academy of Engineering. http://www.raeng.org.uk/news/publications/list/reports/Synthetic_biology.pdf.
  • de Saille, Stevienna. 2015. “Innovating Innovation Policy: The Emergence of ‘Responsible Research and Innovation’.” Journal of Responsible Innovation 2 (2): 152–168. doi: 10.1080/23299460.2015.1045280
  • Schuurbiers, Daan. 2011. “What Happens in the Lab: Applying Midstream Modulation to Enhance Critical Reflection in the Laboratory.” Science and Engineering Ethics 17 (4): 769–788. doi: 10.1007/s11948-011-9317-8
  • Schuurbiers, D., P. Osseweijer, and J. Kinderlerer. 2009. “Implementing the Netherlands Code of Conduct for Scientific Practice – A Case Study.” Science and Engineering Ethics 15 (2): 213–231. doi: 10.1007/s11948-009-9114-9
  • Silverman, D. 2001. Interpreting Qualitative Data: Methods for Analysing Talk, Text and Interaction. London: Sage.
  • Slaughter, Sheila, and Larry L. Leslie. 1997. Academic Capitalism: Politics, Policies, and the Entrepreneurial University. Baltimore, MD: The Johns Hopkins University Press. http://eric.ed.gov/?id=ED409816.
  • Spruit, Shannon L., Gordon D. Hoople, and David A. Rolfe. 2015. “Just a Cog in the Machine? The Individual Responsibility of Researchers in Nanotechnology Is a Duty to Collectivize.” Science and Engineering Ethics (May): 1–17. doi:10.1007/s11948-015-9718-1.
  • Stilgoe, J. 2013. “Foreword: Why Responsible Innovation.” In Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society, edited by R. Owen, J. Bessant, and M. Heintz, 11–17. Chichester: Wiley.
  • Stilgoe, J., R. Owen, and P. Macnaghten. 2013. “Developing a Framework for Responsible Innovation.” Research Policy 42 (9): 1568–1580. doi: 10.1016/j.respol.2013.05.008
  • Swierstra, T., and A. Rip. 2007. “Nano-ethics as NEST-ethics: Patterns of Moral Argumentation About New and Emerging Science and Technology.” NanoEthics 1 (1): 3–20. doi: 10.1007/s11569-007-0005-8
  • Van der Burg, S., and A. van Gorp. 2005. “Understanding Moral Responsibility in the Design of Trailers.” Science and Engineering Ethics 11 (2): 235–256. doi: 10.1007/s11948-005-0044-x
  • von Schomberg, Rene. 2013. “A Vision of Responsible Innovation.” In Responsible Innovation, edited by Richard Owen, M. Heintz and J. Bessant, 51–74. London: John Wiley.
  • Watermeyer, Richard. 2015. “Lost in the ‘Third Space’: The Impact of Public Engagement in Higher Education on Academic Identity, Research Practice and Career Progression.” European Journal of Higher Education 5 (3): 331–347. doi:10.1080/21568235.2015.1044546.
  • Weber, Max. (1917) 2004. The Vocation Lectures. Indianapolis: Hackett.
  • Weick, K. E. 1995. Sensemaking in Organizations. Thousand Oaks: Sage.
  • Wender, Ben A, Rider W Foley, Troy A Hottle, Jathan Sadowski, Valentina Prado-Lopez, Daniel A Eisenberg, Lise Laurin, and Thomas P Seager. 2014. “Anticipatory Life-cycle Assessment for Responsible Research and Innovation.” Journal of Responsible Innovation 1 (2): 200–207. doi:10.1080/23299460.2014.920121.
  • Wood, Stephen, Richard Jones, and Alison Geldart. 2007. Nanotechnology: From the Science to the Social: The Social, Ethical and Economic Aspects of the Debate. Swindon: Economic and Social Research Council. http://www.esrc.ac.uk/ESRCInfoCentre/Images/ESRC_Nano07_tcm6-18918.pdf.
  • Yearley, S. 2009. “The Ethical Landscape: Identifying the Right Way to Think About the Ethical and Societal Aspects of Synthetic Biology Research and Products.” Journal of The Royal Society Interface 6 (Suppl. 4): S559–S564. http://rsif.royalsocietypublishing.org/content/6/Suppl_4/S559.short. doi: 10.1098/rsif.2009.0055.focus
