
Mapping ‘social responsibility’ in science

Cecilie Glerup & Maja Horst
Pages 31-50 | Received 09 Oct 2013, Accepted 26 Dec 2013, Published online: 17 Feb 2014

Abstract

This article employs the Foucauldian notion of ‘political rationality’ to map discussions and ideals about the responsibility of science toward society. By constructing and analyzing an archive of 263 journal papers, we identified four political rationalities: the Demarcation rationality, which aims to exclude the social from scientific production in order to make it objective and thereby responsible; the Reflexivity rationality, which sees it as science's responsibility to let itself be guided by problems in society in its choice of research focus and methods; the Contribution rationality, which insists that responsible science should live up to public demands for innovation and democracy; and the Integration rationality, which advocates that science should be co-constructed with societal actors in order to be socially responsible. While each rationality is distinct, the article argues that all of them address the issue of a boundary (or integration) between science and society. Hence, it is not possible for scientists to avoid ‘a relationship’ with society. The political question is how this relationship is to be defined and regulated.

One of the characteristics of the ‘new governance of science’ (Guston and Sarewitz 2002; Irwin 2006) is that the ability of science to govern itself in a responsible way has been fundamentally problematized (Braun et al. 2010; Jasanoff 2011). Along with this problematization has come a new set of sensibilities and demands for more deliberative forms of governance (Irwin 2006). The theme of social responsibility of science, or even ‘Responsible Research and Innovation’ (von Schomberg 2011), has gained momentum within both policy and academic discourse (Guston and Sarewitz 2002; Fisher and Mahajan 2006; Owen et al. 2009; Sutcliffe 2011). In particular, notions of the social responsibility of science are evolving in soft law and international settings, yet there is no unifying definition of what this term means (Davies and Horst 2012). Rather, the notion of social responsibility of science can be seen as political: it is open for contestation about how it should be defined and interpreted, and each of these interpretations has consequences for the governance of science, i.e. the way science is regulated and practiced (Foucault 2003a).

The notion that science has a responsibility toward society, however, is not new. Scientists have a long tradition of discussing their responsibilities as a balance between their professional autonomy and their general moral responsibility as human beings (Douglas 2003). Nevertheless, there are no clear connections between scientists' own discussions and those of policy-makers and scholars of the new scientific governance, although it should be recognized that the literature on social responsibility ‘in practice’ is growing (McCarthy and Kelty 2010; Phelps and Fisher 2011; Stilgoe, Owen, and Macnaghten 2013; Davies, Glerup, and Horst, in press). This paper contributes to the understanding of the relationship between scientists' discussions and those of policy-makers and other actors by mapping the overall landscape of ideas about social responsibility in science as it can be found in academic journals. It takes as its point of departure the position that both scientists and scholars of new scientific governance are addressing the same ‘problematization’ of science, namely how it should be governed – or govern itself – in a responsible way. In doing so, all voices addressing this issue are viewed as ‘political’ because they contribute to the shaping of ideals about how science is to be performed and regulated. The paper, therefore, makes no a priori distinctions among different types of voices. Rather, it deliberately treats all voices equally in order to investigate the governance effects of these discussions.

Mapping discussions on social responsibility in academic journals, we have employed a Foucauldian analysis of governance to understand how a particular conceptualization of responsibility implies a political rationality, i.e. a particular form of governance of science. The analysis identifies four different political rationalities. They differ according to whether they advocate internal or external regulation of science and whether they are focused on regulation of the process or the outcomes of science. They all imply, however, that a particular relationship between science and society is necessary in order for science to be responsible and also that scientists need to conduct their science within the structure of this relationship in order for their practice to be legitimate and proper.

Responsibility as a political rationality

Debates about the social responsibility of science are far from new (Shapin 2008). While many events important to these debates have occurred, we restrict our brief discussion to two. First, the Manhattan Project – the development of the atomic bomb in the USA during World War II – led a number of scientists to discuss the purpose of their occupation (Rhodes 2012). Shapin (2008, 65) describes how scientists in this context moved their discussions into the public sphere by debating their moral obligations in relation to the development and use of nuclear bombs. These discussions continue in present debates about dual use (McLeish and Nightingale 2007), but also more broadly in the discussion of scientists' responsibilities for the use of the outcomes of their science. Second, in 1973 and 1975, the Asilomar Conferences on recombinant DNA brought together scientists – and importantly some non-scientists – who gathered to discuss potential hazards in connection with the discovery of restriction enzymes and the incipient field of gene technology. These discussions inspired new governmental organizations such as ethics committees in a number of countries and instigated more general discussions of scientists' abilities to govern themselves responsibly (Braun et al. 2010). The case of Asilomar demonstrates the contested nature of such activities: is it a success story about the will to self-governance, a discussion led by a small elite of scientists, or a story of how scientists were forced into further regulation by outside actors (Barinaga 2000)? Regardless of how one interprets the event, Asilomar epitomizes the question of science's ability to govern itself.

While the discussions about the relation between the scientific profession and society are thus not new, the debate following the Asilomar conferences marked a shift in the character of the problems discussed and the kind of actors who might have a legitimate say about them. In a Foucauldian perspective, this shift can be seen as a ‘problematization’ of the hitherto internal governance of science through professional norms. A problematization should be understood as the emergence of intensified discussions and propositions about the steering of a specific area of the social, because the preceding ideas on how to govern have come under scrutiny (Foucault 2003a, 229–230). In this instance, what had come under renewed scrutiny in the second part of the twentieth century were science's ability to govern itself responsibly, its strong connections with the state and its weak links to the public, and its traditional morality as a basis for state regulation of new technologies (Braun et al. 2010, 512). In more recent years, discussions of science and social responsibility have also been connected to more general, theoretical debates about governance. Some consider the demands for responsibility an instance of a growing market-embedded morality spurred by neoliberalism (Shamir 2008), whereas others worry that questions of ethics and responsibility are neglected in favor of market considerations (Hellström 2003). The discussions of how responsibility in science relates to broader societal developments further underscore its more prominent place on the agenda as a governance problem.

Our use of the term ‘governance’ follows Foucault's argument that the exercise of modern government is characterized by the structuring of possible fields of action (2003b, 138). Fields of action shape the freedom of actors to act by rendering some choices of behavior and thinking (rather than others) legitimate and right – what Foucault (2003b, 138) has referred to as ‘the conduct of conduct’. This understanding has two implications. First, it implies a move away from a state-centered view of ‘government’ as something that is conducted by specific individuals or classes of individuals with a pre-defined set of interests. Rather, governance should be understood as attempts to shape individual and collective behavior performed not by individuals or classes of individuals, but by rationalities or discourses (Foucault 1976/1998, 94; Foucault 2003b, 128). Hence, modern governance is understood as complex multitudes of language, agencies, and technologies that seek to administer the lives of others (Foucault 2003a, 237). Second, it implies that studies of modern governance should focus on how these multitudes make different fields of action possible by turning our attention to the study of ‘political rationalities’:

the changing discursive fields within which the exercise of power is conceptualised, the moral justifications for particular ways of exercising power by diverse authorities, notions of the appropriate forms, objects and limits of politics, and conceptions of the proper distribution of such tasks among secular, spiritual, military and familial sectors. (Rose and Miller 1992, 175, our emphasis)

In order to study modern governance, we thus have to study how different rationalities of responsibility in science are conceptualized, how they are justified, and to whom the practice of responsibility is distributed.

We pursue this agenda by studying articulations about the social responsibilities of science as they appear in academic journals. As a medium for professional and normative discussions about the role of science, such journals are an important venue for discussions about this issue, and due to the intensity and breadth of viewpoints they present, the analysis can serve as an indicator of the current, more general perspectives on the social responsibility of science. Using empirical material from academic journals accentuates the fact that the governance of science is, to a large degree, structured by the profession's own norms and standards. Merton emphasized this point when he described the norms of communalism, universalism, disinterestedness, and organized skepticism based on interviews with various scientists (Merton 1973). Sociologists of science have subsequently criticized these CUDOS norms for not capturing what goes on in the daily work-life of scientists (Lynch 1997). In our view, such criticism should not be taken to imply that the CUDOS norms are irrelevant, but rather that they are insufficient as an account of the practices of science.

Given our focus on the conduct of conduct, norms – among them norms similar to those Merton described – have an important influence on ideas about the objectives of science and the purpose of the scientific profession. Rather than studying scientific norms of professional conduct as a good or bad description of scientific practice, we are studying their performative effects for the governance of science. In this way, the discussions of social responsibility of science as they take place in academic journals are a governance technology, part of ‘the complex of mundane programs, calculations, techniques, apparatuses, documents and procedures through which authorities seek to embody and give effect to governmental ambitions’ (Rose and Miller 1992, 175). The journals are the specific, material technology (albeit not very ‘high-tech’) that allows arguments about responsibility in science to be circulated and read by those who are considered the objects of governance, namely, scientists, science scholars, and science policy actors. It is thus a specific ‘apparatus’ that shapes conduct and ways of thinking toward a more responsible practice (Miller and Rose 1990, 8).

This is not to say that political rationalities on social responsibility in science are a straightforward phenomenon to study. As Foucault points out, modern governance can have a plurality of specific aims, and these aims can be ambiguous and even contradictory and yet still work on the same object of steering (Foucault 2003a, 237). There can, in other words, be different political rationalities at play at the same time. They overlap, contradict, and supplement each other continuously. Furthermore, arguments about enhancing the responsibility of science are pervasive, and they flow in many directions, often in circular forms. No a priori distinction between arguments coming from ‘within science’ or from ‘outside science’ can be made, as this distinction is itself an effect of political struggles.

On this basis, our task in this paper is to map the different political rationalities of social responsibility in science as they appear in contributions to academic journals. We have done that by studying how the ‘problem’ of social responsibility in science is articulated in various journal papers by asking of a sample of such papers the following three questions:

  • How is the specific problem (or problems) about lack of responsibility in science articulated?

  • What are the central aspects of science (or its relation to society) that need to be changed according to each articulation?

  • What kinds of solutions to the problems are imagined in these articulations and how are these solutions supposed to be put into place?

Through these questions, we want to carve out how spaces of action are constructed as legitimate for scientists and how these spaces differ from each other and overlap. In the next section, we describe in more detail how we constructed an archive that made it possible to map political rationalities about social responsibility and investigate their implied effects on the governance of science.

Building an archive of political rationalities on responsibility

The first step of a Foucauldian analysis is to define and delineate the archive in which to study political rationalities (Foucault 1972/2004). Based on initial readings of a large number of papers, we identified 13 keywords central to the responsibility of science:

  • Science policy

  • Publishing

  • Public participation

  • Research misconduct

  • Social responsibility

  • Responsibility

  • Upstream public engagement

  • Risk management

  • Ethics

  • Environmental impact assessment

  • Science

  • Moral and ethical aspects

  • Public opinion

We combined the keywords into different search strings, and the resulting searches yielded approximately 1000 articles.[1] From this collection we then selected all papers that explicitly stated normative ideals about the governance and responsibility of science, i.e. statements about the purpose of science and directions for how science should be regulated or steered toward this purpose.[2] For each of the remaining 263 papers, we summarized the answers to the three research questions above and extracted illustrative quotations.
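For readers interested in the mechanics of the query construction, the minimal sketch below (in Python) illustrates how keywords of this kind can be combined into Boolean search strings and how retrieved records can be screened. It is an illustration only: the pairwise AND combination, the record format, and the 'normative' flag are hypothetical stand-ins, and the actual selection of the 263 papers was made by close reading rather than by code.

from itertools import combinations

# The 13 keywords identified in the initial readings (listed above).
KEYWORDS = [
    "science policy", "publishing", "public participation",
    "research misconduct", "social responsibility", "responsibility",
    "upstream public engagement", "risk management", "ethics",
    "environmental impact assessment", "science",
    "moral and ethical aspects", "public opinion",
]

def build_search_strings(keywords):
    # Combine keywords pairwise into Boolean search strings. The pairwise
    # AND combination is a hypothetical example of such a strategy; the
    # actual strings used in Scopus, EBSCO Host, and SAGE varied.
    return [f'"{a}" AND "{b}"' for a, b in combinations(keywords, 2)]

def screen_normative(records):
    # Keep only records coded (through reading) as stating normative ideals
    # about the governance and responsibility of science. The 'normative'
    # flag stands in for a manual annotation, not an automated judgment.
    return [r for r in records if r.get("normative")]

queries = build_search_strings(KEYWORDS)
print(len(queries), "candidate search strings; first:", queries[0])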

The papers in our sample are diverse and range from editorials addressing problems such as fraud in laboratory research to journal articles on how science can become more innovative. The sample also includes critical papers on science's damaging side effects and how they can be avoided, as well as debate pieces on how the direction of science could be subject to more democratic decision-making. Some papers are focused solely on the development of their own discipline, while others are concerned with the entire scientific community. Some papers are written by scientists and others by scholars in the humanities and social sciences with opinions about the development of science. In the analysis, we do not distinguish a priori between different speech positions, and so we have not differentiated between papers written by scientists and non-scientists. Rather, we have treated all voices equally in order to understand the total sum of possible articulations of the responsibility of scientists. The archive is therefore quite diverse and the positions plentiful. Nevertheless, all the papers offer answers to our three research questions and, in this way, they all point to a political rationality with implications for how science is governed.

Four kinds of responsibility

Based upon our close readings of the texts, we have identified four different political rationalities, which can be described using a 2 × 2 matrix. The first dimension describes whether regulation of science should be internal or external; the second dimension describes whether issues of responsibility relate to the process or the outcomes of science. Figure 1 illustrates the four rationalities and their relationship to each other. The Reflexivity and Demarcation rationalities both advocate internal regulation of science but, while the Reflexivity rationality insists that the responsibility of science is to strive for outcomes that can work as solutions to society's problems, the Demarcation rationality aims for a total separation between science and society in order to prevent social norms and values from biasing the otherwise objective production of knowledge. The Contribution and Integration rationalities both point to a need for the external regulation of science, but they also differ in their focus on whether responsibility relates to the outcome of science, as the Contribution rationality proposes, or its process, as the Integration rationality suggests.

Figure 1. An overview of the four political rationalities. They are arranged in relation to whether they advocate internal or external regulation of science and if they propose the process of science or the outcome of science as their object of steering. In schematic form:

                        Internal regulation    External regulation
Process of science      Demarcation            Integration
Outcome of science      Reflexivity            Contribution

In the following sections, we will describe each of these political rationalities in more detail using illustrative quotations from the papers in our archive. We have chosen the quotations as poignant examples of the patterns found in the analysis, but they should not be understood as ‘evidence’ in their own right outside of the analytical context. Within the Foucauldian analysis, it is not the individual texts but rather the patterns found in specific articulations – in this case the arguments – presented in the texts that matter. When including references to similar arguments in other papers, however, we have used the conventional way of referencing the entire paper rather than trying to reference the specific argument. It should also be noted that these four rationalities are found throughout the archive. As such, the findings are robust in describing a recurrent pattern. However, as Foucault has also pointed out, different rationalities overlap and intersect even though they can seem contradictory. Our description of these four rationalities should therefore be understood as an analytical construct that points to some patterns in the archive, while not doing justice to the full complexity that is also at play.

The Demarcation rationality

The Demarcation rationality often commences from a pride in science's endeavors, considering the job ‘a noble and exciting calling; those who take part in it are fortunate’ (Danforth and Schoenhoff 1992, 351). Basically, science is an honorable profession, but unfortunately it is increasingly tormented by fraud and misconduct, which threaten its ability to do good for ‘the people’ (Danforth and Schoenhoff 1992, 355). This rationality articulates two interrelated problems as the reason for increasing fraud and misconduct in science. First, it worries about the increasing pressure for results and publications demanded of scientists, for instance: ‘The number of papers to consider is increasing rapidly, but the space [in journals] available for publication is not keeping pace’ (Brice and Bligh 2005, 84). These pressing conditions make fraud and misconduct more tempting for scientists:

Academic advancement, “publish or perish”, as well as prestige, are other important driving forces [for publishing scientific papers]. Finally, there are many financial benefits (direct and indirect) in publishing such as promotion and further research funding. Many of these forces can lead to ethical lapses. (Anderson and Boden 2008, 155; see also Caelleigh 1991; Evered and Lazar 1995; Anderson and Shultz 2003; Editorial 2006; PLoS Medicine Editors 2010)

Second, it expresses the fear that increasing incidents of fraud and misconduct will lead to an increasing public mistrust in the capability of science to contribute positively to society. This fear of declining public trust is expressed in Claxton (2005a, 27), for example, when he states that ‘all scientists must be aware of the potential for fraud so that science can continue to pursue truth and serve mankind’ (see also Schmaus 1983; Danforth and Schoenhoff 1992; Whitbeck 1995; Frankel 2000; Caveman 2002; Fleischmann 2008; Illes et al. 2010). Public trust is often described as important for maintaining science as the institution that drives ‘mankind’ forward, and instances of fraud and misconduct fostered by increasing pressure to produce results are viewed as threats to this trust.

According to this rationality, the solution, i.e. the way to secure the responsibility of science, is to instill ‘a moral code that the vast majority of scientists embrace’ (Caelleigh 2003, 225), where only strict scientific methods are seen as legitimate (see also Schmaus 1983; Smith 1998; Heitman and Bulger 2005; Wolpe 2006; Wager 2011; Perlis and Shannon 2012). The reinforcement of such a ‘code’ is exemplified in the ‘Responsible Conduct of Research’ education program for young researchers, which should cover ‘almost every domain of scientific activity: data management, conflicts of interest, authorship, publication, peer-review, collaboration, mentoring and misconduct’ (Roland 2007, 707). A number of scientific actors and structures are enrolled as responsible for fostering this strict culture. The most frequently mentioned are: the scientific advisor, who has the duty to transfer these professional norms to the next generation of scientists (Brice and Bligh 2005; Roland 2007; Seiler et al. 2011); the review system and the journal editor, which have the final responsibility for declining manuscripts due to fraud (Cain 1999; Pittenger 2003; Slesser and Qureshi 2009); and ethical guidelines, which prescribe good scientific practice and authorship in both disciplinary societies and journals (Caelleigh 2003; Davidoff and Batalden 2005; Claxton 2005b; Poff 2009).

These actors and structures are all articulated as legitimate means of creating a moral culture that can enforce strict scientific methods. Their legitimacy is described as being rooted in the fact that scientists are all part of the same profession: ‘Members of a scientific discipline, like other professional groups, are bound together by similar aspirations, values and training and enter into a community of common purpose’ (Frankel 2000, 216). This membership is contrasted to a whole range of outsiders to science who should refrain from instilling or enforcing proper conduct: ‘Regulations imposed from outside science cannot promote the kind of atmosphere necessary to ensure ethical practices. An ethical climate must be fostered from within the scientific community’ (Frankel 2000, 216; see also Rothman and Poole 1985; Martinson, Anderson, and DeVries 2005; Wolpert 2005; Marchant and Pope 2009).

In this way, the Demarcation rationality articulates science as a profession that should have a high level of autonomy from other actors: outsiders to the scientific realm should not interfere with the discussions about scientists' responsibility and how to achieve it. But the profession itself ought to employ a number of techniques to instill in every single individual scientist a specific kind of responsibility to be honest and objective. So the profession's freedom from interference from external actors is articulated as dependent on the internal establishment of a strong professional culture. This internal control system should constantly monitor the members of the scientific profession by scrutinizing methods and results and by socializing aspiring scientists into the system. Only by assuring that each individual scientist is rigorous, honest, transparent, and not influenced by society's interest in her work, is it possible to maintain proper responsibility within science (Edsall 1975; Bulger and Heitman 2007; Vollmer 2007; Evans 2010; Wager 2011).

Interestingly, the demarcation between science and society also points to decisions that scientists should not feel responsible for:

There is a real danger in asking scientists to be more socially responsible – the history of eugenics alone should show at least some of the dangers. For, by asking scientists to be socially responsible, in terms other than the obligations already discussed [to be rigorous and honest in relation to your results], would be to give power to a group who are neither trained nor competent to exert it. It was not for scientists to decide whether or not to build a[n atomic] bomb. Nor will it be for scientists and doctors alone to decide whether or not to introduce genes into the germ lines. (Wolpert 1989, 943; see also Rothman and Poole 1985; Fisher 2003; Nüsslin and Hendee 2008; Stieb 2008)

In such articulations, the effort to construct a strong internal commitment to truth simultaneously exempts the scientific profession from taking responsibility for decisions on the direction of society, even though the scientists' own insights or inventions are used in these decisions.

The Reflexivity rationality

Where the Demarcation rationality clearly praises science as an honorable profession, the Reflexivity rationality is more ambiguous in its appraisal. It acknowledges that science has solved many big problems for society, but it also worries that scientists do not assume responsibility for the wrongs that modern science has produced: ‘Scientists can no longer get away with accepting credit for the glorious achievements of science, but must also respect some of the responsibility for the misapplications of science’ (Brouwer 1994, 193; see also Vessuri 2002; Schuurbiers, Osseweijer, and Kinderlerer 2009; Waelbers 2009). The ‘no longer’ in the above quotation indicates that the Reflexivity rationality describes a change in the scientific profession. Perhaps it was possible in some distant past to be disinterested and not care for the consequences of scientific developments, but: ‘a transformation … has taken place in science since World War II whereby [science] can be said to be a social institution and not something engaged in by disinterested seekers after the truth’ (Forge 2000, 348; see also Ziman 1998; Liska 2004; Waelbers 2009; Schuurbiers, Osseweijer, and Kinderlerer 2009).

Whether scientists appreciate this development or not, they need to assume the responsibility that comes with being a ‘social institution’. But, the Reflexivity rationality claims, scientists have not done so to a proper extent, and they need to realize this challenge in order to be responsible: ‘Attention needs to be paid to recent changes in the research context: the principles of good scientific conduct themselves may need to be revisited and the capacity to address moral issues within research cultures should be addressed’ (Schuurbiers, Osseweijer, and Kinderlerer 2009, 230). The strong commitment to the use of rigorous scientific methods, praised by the Demarcation rationality, is not seen to be nearly enough to assure that scientists have acted responsibly. In fact, quite the opposite is seen as true, since the Reflexivity rationality considers science to be a vital part of many societal catastrophes. New technologies are not only creating progress, but they also cause bad side effects, which are largely ignored by science (Cournand 1977; Sassower 1996; Koepsell 2010). What scientists are missing in order to be socially responsible is articulated as a kind of self-awareness, an ability to foresee the consequences of their own practice. For example, Sweeney (2006, 458) considers the lack of reflexive thinking with regard to nanotechnology: ‘What appears to be missing at the present time is a clearly articulated prognosis of the potential global social benefits and harms that may develop from further scientific and technological advances.’ As long as scientists are not considering how their own practices are affecting society, science cannot be understood as socially responsible (see also Watson 1974; Studer and Chubin 1977; Nicholas 1999; Strydom 1999; Forge 2000; Beckwith and Huang 2005).

The solution to this lack of attention is to improve scientists' self-awareness and their ability to incorporate such considerations in their research: ‘as previously stated, “To act responsibly,” means “to act in an inquiring and reflective way” … One has to try to understand how, in the specific situation, the human actors and the technologies mediate with each other, influencing the eventual outcome’ (Waelbers 2009, 62; see also Smith 1992; Rotblat 1999; Wing 2003). Scientists thus need to make an effort to foresee – or at least discuss – how their research affects their surroundings. They need to be able to look at their own role as part of a bigger society, where actions have consequences at other places and other times. In other words, scientists need to increase their reflexivity, even though it might not be an easy task: ‘While I agree that it is difficult to predict all the consequences of new scientific discoveries, I also have to stress that we scientists are seldom asked to reflect on the long-range effects of our work’ (Brouwer 1994, 193).

The Reflexivity rationality describes various techniques intended to help scientists get better at reflecting on their own practices. These techniques are focused on making scientists aware of their own values and motivations, as well as making them reflect on the possible outcomes of their scientific inquiries. This rationality often articulates faith in education as a means to make scientists more reflexive:

What science education now requires is “metascience”, a discipline that extends beyond conventional philosophy and ethics to include the social and humanistic aspects of the scientific enterprise. For example, students need to learn about the … societal responsibilities of research scientists, and to rehearse in advance some of the moral dilemmas that they are likely to meet. (Ziman 2001, 165; see also Kirman 1992; Ernst 2003; Peiffer, Hugenschmidt, and Laurienti 2011)

It also points to other methods whereby scientists should adopt a specific attitude toward their work, such as ‘co-responsibility’ (Strydom 1999), ‘social role responsibility’ (Waelbers 2009), or ‘strong objectivity’ (Wing 2003) – terms that all cover the development of greater awareness of values, interests, and the consequences for society. But there are seldom any specific guidelines as to how scientists can adopt these attitudes in practice.

Compared to the Demarcation rationality, the Reflexivity rationality adds responsibility to the practice of individual scientists: scientists not only need to perform science according to the highest standards of quality, but they should also be able to oversee and reflect on the consequences of their own practice. In this way, reflexivity appears to become a sort of add-on: ‘Researchers still have a responsibility to produce “good science” – in two senses: science that is as “truthful” as the semantic, philosophical and ideological confusions surrounding that word allow; and science that is “socially responsible”’ (Scott 2003, 84; see also Pimple 2002). So not only do scientists have a responsibility for finding truths about the world, but they also have a responsibility for assessing whether their science is good or does good in society. This rationality indicates a purpose for scientists beyond the one proposed in the Demarcation rationality. The Reflexivity rationality sees it as part of scientists' task to be attentive to society and its problems – not the least of which are problems that science may itself have caused.

The Contribution rationality

The Contribution rationality articulates science as a societal institution akin to the healthcare and education systems. It is part of society and serves certain societal goals: ‘Clearly, the aims of science, particularly in the case of the biomedical sciences, are closely linked to certain ethical, social, or political goals’ (De Melo-Martín 2008, 39; see also Sandler 2007; Allyse 2010). In this rationality, a particular vision of what is good for society is inherent in the specific goals that science pursues. According to the Contribution rationality, it is therefore paramount that society has a decisive role in shaping these visions and goals, and that scientists see themselves as working to produce a valuable contribution to society.

The arguments of this rationality center on two societal prescriptions that should guide scientists. The first is that science should be innovative and contribute knowledge and technologies in order to improve national and regional growth. For instance, Heitor (2008, 611) advocates stronger European universities in order to improve competitiveness: ‘There are … challenges that still remain in this reform movement to adapt higher education in Europe to the global landscape and to improve funding for R and D.’ He suggests that universities adapt models from the financial sector and become more ‘responsive’ to societal needs, as this responsiveness is a precondition for adaptation to a global knowledge economy (Heitor 2008, 609, 612; see also Etzkowitz et al. 2000; Beesley 2003; Weed and Mckeown 2003; Nature Editors 2009).

The second societal demand articulated by this rationality is that of democracy, here understood to mean that scientists' activities should be in line with expressed public preferences, and that experts' conduct should be subject to public scrutiny: ‘[I]n a democracy a skeptical and questioning attitude towards experts of all kinds is a thoroughly healthy thing’ (Durant 1999, 317; see also Abraham and Davis 2005; Bubela et al. 2009; Cho and Relman 2010).

Independent of whether the rationality addresses the demand of innovation or that of democracy – or both at the same time – the rationality is concerned with the ‘goals’ (De Melo-Martín 2008; Taylor 2009) or ‘purposes’ (Rappert 2003, 467) of science. The Contribution rationality measures science's ability to be innovative and democratic by looking at what comes out of the laboratories – the results and applications. If the knowledge and technologies are out of line with society's preferences and do not create growth, science has not lived up to its responsibilities. Therefore, following this rationality, scientists can be perceived as a sort of public servant working to materialize the objectives of society in their knowledge production: ‘Scientists should conceive of themselves as artisans working for the public good, whose efforts are directed toward an ideal of well-ordered science; and this ideal of well-ordered science should be understood in a global and democratic fashion’ (Kitcher 2004, 331).

According to the Contribution rationality, the current problem with responsibility in science is that scientists do not see themselves as these ‘artisans working for the public good’. Instead, they consider themselves and their work as separated from society and pursue irrelevant and perhaps dangerous paths that are not beneficial for anyone: ‘Curiosity-based research may provide new knowledge, but what can one now do with that knowledge – of what use is it?’ (Beesley 2003, 1529). According to this rationality, this question – of what use is this knowledge? – should be at the forefront of all scientists' work, but it is not. Rather, scientists tend to be ‘retreating to the safety of the ivory tower’ (Drenth 2006, 15; see also Taylor 2009; Werner-Felmayer 2010). Scientists have been allowed to be cut off from criticism and public inquiry, but this needs to change: ‘Knowledge, as Francis Bacon famously observed, is power. If today's enormous scientific-knowledge-that-is-also-enormous-power is to be harnessed democratically, it is essential that it should be subjected to close and careful public scrutiny’ (Durant 1999, 317; see also Abraham 2003; Abraham and Davis 2005; Drenth 2006; de Melo-Martín 2008). Scientists' results are thus too important to be left to the scientists, but scientists are, seemingly, unwilling to be under public control.

To remedy this problem, the Contribution rationality proposes to enhance outside (societal) control over scientists. According to this rationality, scientists do not have the ability to become more responsible by themselves (von Hippel 1978; Redman and Caplan 2005; Sandler 2007; Underwood 2009; Bates et al. 2010). Therefore, someone outside the system needs to intervene. In line with the Demarcation rationality, the Contribution rationality argues for more control within science to avoid fraud and misconduct. Unlike the former, however, the Contribution rationality articulates a strong need for external control. As with other public servants, such as doctors and teachers, the misuse of professional status should not be tolerated and, in fact, needs to be punished: ‘Lack of criminal sanctions for scientific misconduct appears to create an elite class of persons who are exempt from punishment for cheating, stealing and outright lying’ (Redman and Caplan 2005, 248; see also Lancet Editors 1996; Andersen 1999; Riis 2001; Bosch 2010; Miller 2011). Various kinds of external control of results must therefore form part of scientists' work-life.

The Contribution rationality also articulates a more general need for public scrutiny of the directions of scientific inquiries. The rationality calls for improved governance in a range of areas such as patent possibilities, industrial potential, environmental harm, and (as in the following quote) the risk of ‘dual use’:

A clear normative articulation of acceptable and unacceptable behavior would therefore contribute towards improved governance. Currently there is a lack of international criminalization of individual activity in relation to biological weapons production that might allow actors to rationalize their choices. (McLeish and Nightingale 2007, 1649)

Thus, the rationality demands public control from different kinds of external bodies in order to assure that scientists are more innovative, that they collaborate more with industry, and that in general they aim to fulfill the public's stated preferences when aiming for various outcomes of science (see also von Hippel 1978; Baylis and Robert 2006; Drenth 2006; Bubela et al. 2009; Taylor 2009; Underwood 2009).

In the Contribution rationality, scientists have a responsibility to deliver results that are needed by society. Science should not just pursue knowledge for the sake of curiosity, but it needs to contribute to society. The purpose of science is to be at society's service and scientists need to be focused on this. Since scientists cannot, however, be expected to do so of their own accord, their conduct needs to be overseen by non-scientific actors, who are perceived to be more able to sustain responsibility.

The Integration rationality

The fourth of our identified rationalities is similar to the Contribution rationality in so far as it articulates that science is supposed to be firmly rooted within society, but it does so in a different way. The Integration rationality is centered on the vision that actors from science and society need to work together as equal partners in order to produce better results: ‘The exposure of citizens, public interest representatives and scientific experts to each others’ perspectives might contribute to a transformation of how these different participants define their interests or take account of others' interests' (Abraham and Sheppard 1997, 163). Here, scientists are conceived of as actors with special experiences who, in collaboration with other actors, can develop solutions for society (see also Ball 2002; Nowotny 2005; Roco 2006; Vogt, Baird, and Robinson 2007; Elshtain 2008; Horton 2010). It is the collaboration between different actors that is crucial in this rationality. While the Contribution rationality is focused on the outcome (that science lives up to the objectives of innovation and democracy), the Integration rationality does not seem to have a fixed societal objective. Rather, the goals for science and society should come out of a process in which scientific and societal actors agree on the preferred objectives together. While democracy is also a strong value in this rationality, it is the process of discussing – rather than public oversight and control – that is seen as crucial.

In line with the previous two rationalities, the arguments of the Integration rationality take as their point of departure the negative side effects of science when describing the main problem in current scientific conduct. Science is articulated as producing outcomes that are unsustainable and controversial, because scientists work without a thought for the wider societal implications of the knowledge and technologies they create:

Given the environment in science, scientists, on the whole, are unlikely to participate in soul-searching over the consequences of their work. Those who argue for the social implications of their findings may do so without any framework for thinking about the consequences. (Beckwith and Geller 1997, 147; see also Beggins 1978; Cohen and Gotlieb 1989; Thorpe 2004; Woollard 2006; Schuurbiers et al. 2009)

According to this rationality, scientists do not themselves engage with other actors and thus do not experience ‘a transformation’ (Abraham and Sheppard 1997, 163) that could lead scientific developments in more substantially democratic directions. Rather than pointing to a lack of internal considerations (Reflexivity) or a lack of external control (Contribution) as the key reason for these problems, this rationality points to the lack of integration between science and other actors in society as the main problem to be corrected.

In the Integration rationality, the solution is to enhance the dialogue between scientists and other actors in order to develop a new kind of ‘integrative’ responsibility that can transcend today's very specialized society:

Genuine responsibility is not to be found in the compartmentalized roles of the professional, expert, scientist, government official or career politician. It is, rather, in the cracks between such specialized roles that the basis for an integrative sense of human responsibility is to be found. (Thorpe 2004, 79; see also Steckler 1973; Ritterbush 1977; Schrag et al. 2003; Rip 2009; Mikulak 2011; Roco et al. 2011)

Within this rationality, responsibility is thus an outcome of a process in which several different actors meet and together learn and change based on common deliberation. Consequently, science needs other actors – lots of other actors – to become responsible. Scientists need to be challenged continuously by different viewpoints so they can integrate them into the development of knowledge and technologies. Within this rationality, several techniques are proposed to expose scientists to society's values: scientists need training in philosophy and social science during their education in order to be more aware of the importance and workings of norms and values; scientists need to engage in various public-participation exercises, in which emerging technologies are discussed with citizens (Miah 2005); scientists should be better at communicating with public media (Evans 1999; Fischer 1999); and scientists need to develop their methods in cooperation with those whom they study, so they can develop knowledge together (MacKenzie, McDowell, and Pittaway 2007; Hugman, Pittaway, and Bartolomei 2011). Scientists need to engage with these various other perspectives to jointly and collaboratively find a way forward.

The Integration rationality articulates as a crucial point that these dialogues should be on-going and happen ‘before-the-fact’, that is, before knowledge and technologies are finalized and implemented in society, at which time it becomes more difficult to change their properties and the way they affect their surroundings. Instead, it is important that various perspectives on emerging technologies are surfaced while they are still in the making, so they can be integrated while development is on-going:

Public engagement processes should be established early so that stakeholders who will bear the risks and benefits of synthetic biology have the opportunity for meaningful input into the trajectory of this field. To be meaningful, public engagement must recognize that some avenues of research will not be acceptable and some products may be prevented from reaching the market. (Bubela, Hagen, and Einsiedel 2012, 136; see also van der Burg and van Gorp 2005; Roco 2006)

According to the Integration rationality, the main responsibility of scientists is to develop knowledge that is aligned with society's norms and values, having first realized that these norms and values should be contextually identified. The rationality sees scientists as a special kind of citizen, one who possesses specialized knowledge that can be used to develop society in better directions. But this is only possible if science opens up and allows social concerns to form part of the scientific process, instead of focusing only on technical aspects. In this way, the Integration rationality diverges from the Contribution rationality by articulating that knowledge production is a collaboration among different actors and that responsibility is something that develops through this collaborative process.

Conclusion

Analyzing 263 contributions to academic journals, we have identified four different rationalities of the social responsibility of science. Each of these rationalities articulates a specific way of defining problems, as well as a specific way of legitimizing certain political steering mechanisms as solutions to these problems. The analysis illustrates a wide variety of views on how science and the scientific profession should be governed. The proposals range from strict self-governance and autonomy to a very integrative view in which a large group of different stakeholders from outside the scientific system should be involved in the conduct of science. The Demarcation and Reflexivity rationalities both articulate skepticism toward external involvement, whereas the Contribution and Integration rationalities embrace external regulation, albeit in different forms. In this way, the four rationalities can be seen to stretch across a continuum from an idea of no involvement at all from society, to an idea of radical involvement from society in which citizens, social scientists, and other actors literally enter the laboratory and co-develop knowledge.

A different pattern emerges if we focus on the object of steering. The Demarcation rationality focuses on optimizing the scientific process before-the-fact by instilling in scientists a strict moral code focused on honesty and accuracy in their work. In this way, it resembles the Integration rationality, which also has as its object of steering the scientific process before-the-fact. Even though the two rationalities are radically different with regard to their choice of internal or external regulation, they are both focused on the process of science as the crucial object of steering in order to make science responsible. In contrast, the Reflexivity and Contribution rationalities share a focus on the outcome of science as the important object of steering, although they differ in terms of the steering mechanisms they want to use. While the Reflexivity rationality describes how scientists themselves should use society's problems as an inspirational framework that guides their research, the Contribution rationality advocates firm external control and guidelines from society that scientists should be compelled to follow.

We can also reflect on the distribution of morality or ethics in each rationality. On the one hand, the Demarcation and the Reflexivity rationalities articulate science as a fundamentally ‘good’ institution that has an in-built capacity to know how to serve society best. On the other hand, the Contribution and the Integration rationalities point to society as a necessary source of moral knowledge about how to develop ‘a good society’ from which science needs to learn. This perspective strongly resonates with social studies of science and technology that advocate further involvement of philosophers and social scientists as teachers of responsibility in scientific processes (Macnaghten, Kearnes, and Wynne 2005; Fisher 2007; Flipse, van der Sanden, and Osseweijer 2012).

Following Foucault (2003a), we do not consider the four political rationalities as mutually exclusive or strictly separated. From a very general perspective, they all stress the need to regulate the relationship between science and society. Even in the Demarcation rationality, society plays a large part as that which has to be excluded. The definition of a boundary (or an integration) between science and society is therefore an overall shared problematization in the four rationalities. In this way, the analysis portrays the current situation of the profession of science as one in which it is impossible not to ‘have a relationship’ with society. Rather, the main political question is how to define and regulate this relationship.

The identification of these four rationalities serves as a map of contemporary ideas on science governance. Like any other map, it excludes a lot of shades and details, and we might have enlarged certain differences in order to make the map intelligible. However, the map is intended as a reference point for directions and the understanding of differences, and as such it is useful as an overview of directions in the governance of science. It is clear that there is no uniform agreement about what social responsibility of science is and should be. It is equally clear that the definition of this concept is inherently political – in the Foucauldian sense of structuring fields of action. Similar to Braun et al. (2010), our analysis demonstrates that a particular definition of responsibility also implies a particular understanding of the proper conduct of science, sustaining the description of some forms of practice as responsible and others as irresponsible. Following from this perspective is also the realization that an argument for more responsibility in science is not a way of dealing with the fact that science has political consequences – it is itself a political statement.

However, our map does not make us any wiser about the practice of science and the relationship between what scientists actually do in their laboratories and the normative and political statements about proper conduct of science investigated in this paper. As seminal laboratory ethnographies (Bloor 1976; Latour and Woolgar 1979; Knorr-Cetina 1999) have shown, scientific norms and scientific practice are at times worlds apart. We therefore propose further studies of this relationship – in particular of the ways in which scientists are influenced, if at all, by these political rationalities in their daily practices of making facts in the laboratory and making organizations when doing research management. It would be interesting to investigate whether there are more or different rationalities at play in these forms of practice. It is also pertinent to explore how these proposals of responsible conduct of science play out in connection with such mundane organizational concerns as next year's budget, a debate piece in yesterday's paper, a failed experiment, or the need for a new coffee machine in the canteen.

Acknowledgments

We would like to thank Emil Husted for great assistance with the building of the archive and development of the method. Furthermore, we want to thank Alan Irwin, Erik Fisher, and Paul du Gay for reading earlier drafts and providing useful comments.

Notes on contributors

Cecilie Glerup is a Ph.D. fellow at the Department of Organization, Copenhagen Business School. Her main interests are in the development of the scientific profession in relation to ‘the new governance of science’ and demands for ‘responsible innovation’.

Maja Horst is Head of Department at the Department of Media, Cognition and Communication at the University of Copenhagen. She has published extensively in the areas of ‘science communication’ and ‘public engagement with science’ and is currently interested in the concept of ‘responsibility’ in relation to scientific work.

Notes

1. The searches were conducted in three databases: Scopus, EBSCO Host, and SAGE. Based on the content of the databases, our searches cover the period from 1960 to 2011 and span all disciplinary areas. The selection of papers from the end of the 1990s onwards is notably larger than the number of earlier papers. There could be many reasons for this: a growing interest in the phenomenon, more publications in general, etc. We believe that this difference in size is also partly due to the fact that recent papers have increasingly been available online automatically, whereas older papers are not necessarily so.

2. Papers were excluded either because they did not address governance efforts aimed at science or scientists (but rather at the media or citizens), or because they were too descriptive. Many papers were descriptive empirical studies of scientists' engagement in society, for instance, studies of how scientists and citizens interact in certain public-participation exercises. In a more all-encompassing Foucauldian analysis of regimes of knowledge, these should also have been included, but for this limited study we chose to focus on the normative papers.

References

  • Abraham, J. 2003. “The Science and Politics of Medicines Control.” Drug Safety 26 (3): 135–143. doi: 10.2165/00002018-200326030-00001
  • Abraham, J., and C. Davis. 2005. “Risking Public Safety: Experts, the Medical Profession and ‘Acceptable’ Drug Injury.” Health, Risk and Society 7 (4): 379–395. doi: 10.1080/13698570500390473
  • Abraham, J., and J. Sheppard. 1997. “Democracy, Technocracy, and the Secret State of Medicines Control: Expert and Nonexpert Perspectives.” Science, Technology & Human Values 22 (2): 139–167. doi: 10.1177/016224399702200201
  • Allyse, Megan. 2010. “Embryos, Ethics and Expertise: The Emerging Model of the Research Ethics Regulator.” Science and Public Policy 37 (8): 597–609. doi: 10.3152/030234210X12767691861092
  • Andersen, D. 1999. “Guidelines for Good Scientific Practice.” Danish Medical Bulletin 46 (1): 60–61.
  • Anderson, P. A., and S. D. Boden. 2008. “Ethical Considerations of Authorship.” SAS Journal 2 (3): 155–158. doi: 10.1016/S1935-9810(08)70034-3
  • Anderson, M. S., and J. B. Shultz. 2003. “The Role of Scientific Associations in Promoting Research Integrity and Deterring Research Misconduct: Commentary on ‘Challenges in Studying the Effects of Scientific Societies on Research Integrity’ (Levine and Iutcovitch).” Science and Engineering Ethics 9 (2): 269–272. doi: 10.1007/s11948-003-0013-1
  • Ball, D. J. 2002. “Environmental Risk Assessment and the Intrusion of Bias.” Environment International 28 (6): 529–544. doi: 10.1016/S0160-4120(02)00061-2
  • Barinaga, M. 2000. “Asilomar Revisited: Lessons for Today?” Science 287 (5458): 1584–1585. doi: 10.1126/science.287.5458.1584
  • Bates, S. R., W. Faulkner, S. Parry, and S. Cunningham-Burley. 2010. “‘How Do We Know It's Not Been Done Yet?!’ Trust, Trust Building and Regulation in Stem Cell Research.” Science and Public Policy 37 (9): 703–718.
  • Baylis, Françoise, and Jason Scott Robert. 2006. “Human Embryonic Stem Cell Research: An Argument for National Research Review.” Accountability in Research 13 (3): 207–224. doi: 10.1080/03605300600848136
  • Beckwith, J., and L. N. Geller. 1997. “Commentary on ‘the Social Responsibilities of Biological Scientists’ (S.J. Reiser and R.E. Bulger).” Science and Engineering Ethics 3 (2): 145–148. doi: 10.1007/s11948-997-0005-7
  • Beckwith, J., and F. Huang. 2005. “Should We Make a Fuss? A Case for Social Responsibility in Science.” Nature Biotechnology 23 (12): 1479–1480. doi: 10.1038/nbt1205-1479
  • Beesley, Lisa G. A. 2003. “Science Policy in Changing Times: Are Governments Poised to Take Full Advantage of an Institution in Transition?” Research Policy 32 (8): 1519–1531. doi: 10.1016/S0048-7333(03)00023-4
  • Beggins, David. 1978. “Social Responsibility in Science.” Social Alternatives 1 (3): 54–60.
  • Bloor, D. 1976. Knowledge and Social Imagery. London: Routledge.
  • Bosch, X. 2010. “Safeguarding Good Scientific Practice in Europe.” EMBO Reports 11 (4): 252–257. doi: 10.1038/embor.2010.32
  • Braun, K., A. Moore, S. L. Herrmann, and Sabine Könninger. 2010. “Science Governance and the Politics of Proper Talk: Governmental Bioethics as a New Technology of Reflexive Government.” Economy and Society 39 (4): 510–533. doi: 10.1080/03085147.2010.510682
  • Brice, J., and J. Bligh. 2005. “Author Misconduct: Not Just the Editors’ Responsibility.” Medical Education 39 (1): 83–89. doi: 10.1111/j.1365-2929.2004.02027.x
  • Brouwer, W. 1994. “Taking Responsibility for the Implications of Science.” Bulletin of Science, Technology & Society 14 (4): 192–202. doi: 10.1177/027046769401400402
  • Bubela, T., G. Hagen, and E. Einsiedel. 2012. “Synthetic Biology Confronts Publics and Policy Makers: Challenges for Communication, Regulation and Commercialization.” Trends in Biotechnology 30 (3): 132–137. doi: 10.1016/j.tibtech.2011.10.003
  • Bubela, T., M. C. Nisbet, R. Borchelt, F. Brunger, C. Critchley, E. Einsiedel, G. Geller, et al. 2009. “Science Communication Reconsidered.” Nature Biotechnology 27 (6): 514–518. doi: 10.1038/nbt0609-514
  • Bulger, R. E., and E. Heitman. 2007. “Expanding Responsible Conduct of Research Instruction Across the University.” Academic Medicine 82 (9): 876–878. doi: 10.1097/ACM.0b013e31812f7909
  • Caelleigh, A. S. 1991. “Editorial: Credit and Responsibility in Authorship.” Academic Medicine 66 (11): 676–677. doi: 10.1097/00001888-199111000-00007
  • Caelleigh, A. S. 2003. “Roles for Scientific Societies in Promoting Integrity in Publication Ethics.” Science and Engineering Ethics 9 (2): 221–241. doi: 10.1007/s11948-003-0010-4
  • Cain, J. 1999. “Why Be My Colleague's Keeper? Moral Justifications for Peer Review.” Science and Engineering Ethics 5 (4): 531–540. doi: 10.1007/s11948-999-0053-2
  • Caveman, A. 2002. “Public Advocacy of Science.” Journal of Cell Science 115 (9): 1777–1778.
  • Cho, M. K., and D. A. Relman. 2010. “Synthetic ‘Life,’ Ethics, National Security, and Public Discourse.” Science 329 (5987): 38–39. doi: 10.1126/science.1193749
  • Claxton, L. D. 2005a. “Scientific Authorship: Part 1. A Window into Scientific Fraud?” Mutation Research – Reviews in Mutation Research 589 (1): 17–30. doi: 10.1016/j.mrrev.2004.07.003
  • Claxton, L. D. 2005b. “Scientific Authorship: Part 2. History, Recurring Issues, Practices, and Guidelines.” Mutation Research – Reviews in Mutation Research 589 (1): 31–45. doi: 10.1016/j.mrrev.2004.07.002
  • Cohen, Robin, and Calvin C. Gotlieb. 1989. “Educating Tomorrow's Professionals.” Journal of Business Ethics 8 (2): 193–199. doi: 10.1007/BF00382584
  • Cournand, A. 1977. “The Code of the Scientist and Its Relationship to Ethics.” Science 198 (4318): 699–705. doi: 10.1126/science.910153
  • Danforth, W. H., and D. M. Schoenhoff. 1992. “Fostering Integrity in Scientific Research.” Academic Medicine 67 (6): 351–356. doi: 10.1097/00001888-199206000-00001
  • Davidoff, F., and P. Batalden. 2005. “Toward Stronger Evidence on Quality Improvement. Draft Publication Guidelines: The Beginning of a Consensus Project.” Quality & Safety in Health Care 14 (5): 319–325. doi: 10.1136/qshc.2005.014787
  • Davies, S., and M. Horst. 2012. “Responsible Innovation in the US, UK and Denmark: Governance Landscapes.” Working Paper. Presented at the responsible innovation conference, Den Haag.
  • Davies, S., C. Glerup, and M. Horst. In press. “On Being Responsible. Multiplicity in Responsible Development.” In Responsibility in Nanotechnology Development, edited by S. Arnaldi, A. Ferrari, P. Magaudda, and F. Marin. London: Springer.
  • De Melo-Martín, I. 2008. “Ethics, Embryos, and Eggs: The Need for More than Epistemic Values.” American Journal of Bioethics 8 (12): 38–40. doi: 10.1080/15265160802559211
  • Douglas, H. E. 2003. “The Moral Responsibility of Scientists (Tensions Between Autonomy and Responsibility).” American Philosophical Quarterly 40 (1): 59–68.
  • Drenth, P. J. D. 2006. “Responsible Conduct in Research.” Science and Engineering Ethics 12 (1): 13–21. doi: 10.1007/s11948-006-0003-1
  • Durant, J. 1999. “Participatory Technology Assessment and the Democratic Model of the Public Understanding of Science.” Science and Public Policy 26 (5): 313–319. doi: 10.3152/147154399781782329
  • Editorial. 2006. “Beautification and Fraud.” Nature Cell Biology 8 (2): 101–102. doi: 10.1038/ncb0206-101
  • Edsall, J. T. 1975. “Scientific Freedom and Responsibility – Report of the AAAS Committee on Scientific Freedom and Responsibility.” Science 188 (4189): 687–693. doi: 10.1126/science.11643270
  • Elshtain, J. B. 2008. “Why Science Cannot Stand Alone.” Theoretical Medicine and Bioethics 29 (3): 161–169. doi: 10.1007/s11017-008-9074-0
  • Ernst, R. R. 2003. “The Responsibility of Scientists, a European View.” Angewandte Chemie (International Edition) 42 (37): 4434–4439. doi: 10.1002/anie.200330065
  • Etzkowitz, Henry, Andrew Webster, Christiane Gebhardt, and Branca R. C. Terra. 2000. “The Future of the University and the University of the Future: Evolution of Ivory Tower to Entrepreneurial Paradigm.” Research Policy 29 (2): 313–330. doi: 10.1016/S0048-7333(99)00069-4
  • Evans, M. 1999. “Bioethics and the Newspapers.” The Journal of Medicine and Philosophy 24 (2): 164–180. doi: 10.1076/jmep.24.2.164.2529
  • Evans, N. G. 2010. “Speak No Evil: Scientists, Responsibility, and the Public Understanding of Science.” NanoEthics 4 (3): 215–220. doi: 10.1007/s11569-010-0101-z
  • Evered, D., and P. Lazar. 1995. “Misconduct in Medical Research.” Lancet 345 (8958): 1161–1162. doi: 10.1016/S0140-6736(95)90984-2
  • Fischer, F. 1999. “Technological Deliberation in a Democratic Society: The Case for Participatory Inquiry.” Science and Public Policy 26 (5): 294–302. doi: 10.3152/147154399781782293
  • Fisher, C. B. 2003. “Developing a Code of Ethics for Academics. Commentary on ‘Ethics for All: Differences Across Scientific Society Codes’ (Bullock and Panicker).” Science and Engineering Ethics 9 (2): 171–179. doi: 10.1007/s11948-003-0004-2
  • Fisher, E. 2007. “The Convergence of Nanotechnology, Policy, and Ethics.” Advance in Computers 71 (2007): 273–296. doi: 10.1016/S0065-2458(06)71006-3
  • Fisher, E., and R. L. Mahajan. 2006. “Contradictory Intent? US Federal Legislation on Integrating Societal Concerns into Nanotechnology Research and Development.” Science and Public Policy 33 (1): 5–16. doi: 10.3152/147154306781779181
  • Fleischmann, Martin. 2008. “Reflections on the Sociology of Science and Social Responsibility in Science, in Relationship to Cold Fusion.” Accountability in Research 8 (1–2): 19–54.
  • Flipse, S. M., M. C. A. van der Sanden, and P. Osseveijer. 2012. “Midstream Modulation in Biotechnology Industry: Redefining What Is ‘Part of the Job’ of Researchers in Industry.” Science and Engineering Ethics 19 (3): 1141–1164. doi: 10.1007/s11948-012-9411-6
  • Forge, J. 2000. “Moral Responsibility and the ‘Ignorant Scientist’.” Science and Engineering Ethics 6 (3): 341–349. doi: 10.1007/s11948-000-0036-9
  • Foucault, M. 1972/2004. Vidensarkæologien [The Archaeology of Knowledge]. Aarhus: Forlaget Philosophia.
  • Foucault, M. 1976/1998. The Will to Knowledge. The History of Sexuality Volume 1. London: Penguin Books.
  • Foucault, M. 2003a. “Governmentality.” In The Essential Foucault. Selections from Essential Works of Foucault, 1954–1984, edited by P. Rabinow and N. Rose, 229–245. New York: The New Press.
  • Foucault, M. 2003b. “The Subject and Power.” In The Essential Foucault. Selections from Essential Works of Foucault, 1954–1984, edited by P. Rabinow and N. Rose, 126–145. New York: The New Press.
  • Frankel, M. S. 2000. “Scientific Societies as Sentinels of Responsible Research Conduct.” Proceedings of the Society for Experimental Biology and Medicine 224 (4): 216–219. doi: 10.1046/j.1525-1373.2000.22424.x
  • Guston, D. H., and D. Sarewitz. 2002. “Real-Time Technology Assessment.” Technology in Society 24 (2002): 93–109. doi: 10.1016/S0160-791X(01)00047-1
  • Heitman, E., and R. E. Bulger. 2005. “Assessing the Educational Literature in the Responsible Conduct of Research for Core Content.” Accountability in Research 12 (3): 207–224. doi: 10.1080/08989620500217420
  • Heitor, M. 2008. “A System Approach to Tertiary Education Institutions: Towards Knowledge Networks and Enhanced Societal Trust.” Science and Public Policy 35 (8): 607–617. doi: 10.3152/030234208X377371
  • Hellström, T. 2003. “Systemic Innovation and Risk: Technology Assessment and the Challenge of Responsible Innovation.” Technology in Society 25 (3): 369–384. doi: 10.1016/S0160-791X(03)00041-1
  • Horton, R. 2010. “Science Will Never Be the Same Again.” Lancet 376 (9736): 143–144. doi: 10.1016/S0140-6736(10)61091-4
  • Hugman, R., E. Pittaway, and L. Bartolomei. 2011. “When ‘Do No Harm’ Is Not Enough: The Ethics of Research with Refugees and Other Vulnerable Groups.” British Journal of Social Work 41 (7): 1271–1287. doi: 10.1093/bjsw/bcr013
  • Illes, J., M. A. Moser, J. B. McCormick, E. Racine, S. Blakeslee, A. Caplan, E. C. Hayden, et al. 2010. “Neurotalk: Improving the Communication of Neuroscience Research.” Nature Reviews Neuroscience 11 (1): 61–69. doi: 10.1038/nrn2773
  • Irwin, A. 2006. “The Politics of Talk: Coming to Terms with the ‘New’ Scientific Governance.” Social Studies of Science 36 (2): 299–320. doi: 10.1177/0306312706053350
  • Jasanoff, S. 2011. “Constitutional Moments in Governing Science and Technology.” Science and Engineering Ethics 17 (4): 621–638. doi: 10.1007/s11948-011-9302-2
  • Kirman, Joseph M. 1992. “Values, Technology, and Social Studies.” McGill Journal of Education 27 (1): 5–18.
  • Kitcher, P. 2004. “Responsible Biology.” Bioscience 54 (4): 331–336. doi: 10.1641/0006-3568(2004)054[0331:RB]2.0.CO;2
  • Knorr-Cetina, K. 1999. Epistemic Cultures. How the Sciences Make Knowledge. Cambridge: Harvard University Press.
  • Koepsell, D. 2010. “On Genies and Bottles: Scientists’ Moral Responsibility and Dangerous Technology R&D.” Science and Engineering Ethics 16 (1): 119–133. doi: 10.1007/s11948-009-9158-x
  • Lancet Editors. 1996. “Dealing with Deception.” Lancet 347 (9005): 843. doi: 10.1016/S0140-6736(96)91337-9
  • Latour, B., and S. Woolgar. 1979/1986. Laboratory Life. The Construction of Scientific Facts. Princeton: Princeton University Press.
  • Liska, A. J. 2004. “The Morality of Problem Selection in Proteomics.” Proteomics 4 (7): 1929–1931. doi: 10.1002/pmic.200300714
  • Lynch, M. 1997. Scientific Practice and Ordinary Action. Ethnomethodology and Social Studies of Science. Cambridge: Cambridge University Press.
  • MacKenzie, Catriona, Christopher McDowell, and Eileen Pittaway. 2007. “Beyond ‘Do No Harm’: The Challenge of Constructing Ethical Relationships in Refugee Research.” Journal of Refugee Studies 20 (2): 299–319. doi: 10.1093/jrs/fem008
  • Macnaghten, P., M. B. Kearnes, and B. Wynne. 2005. “Nanotechnology, Governance and Public Deliberation: What Role for the Social Sciences?” Science Communication 27 (2): 268–291. doi: 10.1177/1075547005281531
  • Marchant, G. E., and L. L. Pope. 2009. “The Problems with Forbidding Science.” Science and Engineering Ethics 15 (3): 375–394. doi: 10.1007/s11948-009-9130-9
  • Martinson, B. C., M. S. Anderson, and R. DeVries. 2005. “Scientists Behaving Badly.” Nature 435 (7043): 737–738. doi: 10.1038/435737a
  • McCarthy, E., and C. Kelty. 2010. “Responsibility and Nanotechnology.” Social Studies of Science 40 (3): 405–432. doi: 10.1177/0306312709351762
  • McLeish, Caitríona, and Paul Nightingale. 2007. “Biosecurity, Bioterrorism and the Governance of Science: The Increasing Convergence of Science and Security Policy.” Research Policy 36 (10): 1635–1654. doi: 10.1016/j.respol.2007.10.003
  • Merton, R. K. 1973. The Sociology of Science. Theoretical and Empirical Investigations. Chicago: University of Chicago Press.
  • Miah, A. 2005. “Genetics, Cyberspace and Bioethics: Why Not a Public Engagement with Ethics?” Public Understanding of Science 14 (4): 409–421. doi: 10.1177/0963662505056616
  • Mikulak, A. 2011. “Mismatches Between ‘Scientific’ and ‘Non-Scientific’ Ways of Knowing and Their Contributions to Public Understanding of Science.” Integrative Psychological and Behavioral Science 45 (2): 201–215. doi: 10.1007/s12124-011-9157-8
  • Miller, N. R. 2011. “Checking for Plagiarism, Duplicate Publication, and Text Recycling.” Lancet 377 (9775): 1403. doi: 10.1016/S0140-6736(11)60565-5
  • Miller, P., and N. Rose. 1990. “Governing Economic Life.” Economy and Society 19 (1): 1–31. doi: 10.1080/03085149000000001
  • Nature Editors. 2009. “A Responsibility Index.” Nature 457. Nature Publishing Group.
  • Nicholas, B. 1999. “Molecular Geneticists and Moral Responsibility: ‘Maybe If We Were Working on the Atom Bomb I Would Have a Different Argument’.” Science and Engineering Ethics 5 (4): 515–530. doi: 10.1007/s11948-999-0052-3
  • Nowotny, H. 2005. “High- and Low-Cost Realities for Science and Society.” Science 308 (5725): 1117–1118. doi: 10.1126/science.1113825
  • Nüsslin, Fridtjof, and William Hendee. 2008. “A Statement of the Rights of Scientists and Engineers.” Physica Medica 24 (3): 127–128. doi: 10.1016/j.ejmp.2008.08.001
  • Owen, R., D. Baxter, T. Maynard, and M. Depledge. 2009. “Beyond Regulation: Risk Pricing and Responsible Innovation.” Environmental Science & Technology 43 (18): 6902–6906. doi: 10.1021/es803332u
  • Peiffer, A. M., C. E. Hugenschmidt, and P. J. Laurienti. 2011. “Ethics in 15 Min Per Week.” Science and Engineering Ethics 17 (2): 289–297. doi: 10.1007/s11948-010-9197-3
  • Perlis, C., and N. Shannon. 2012. “Role of Professional Organizations in Setting and Enforcing Ethical Norms.” Clinics in Dermatology 30 (2): 156–159. doi: 10.1016/j.clindermatol.2011.06.002
  • Phelps, R., and E. Fisher. 2011. “Legislating the Laboratory? Promotion and Precaution in a Nanomaterials Company.” Biomedical Nanotechnology 726 (2011): 339–358.
  • Pimple, K. D. 2002. “Six Domains of Research Ethics: A Heuristic Framework for the Responsible Conduct of Research.” Science and Engineering Ethics 8 (2): 191–205. doi: 10.1007/s11948-002-0018-1
  • Pittenger, D. J. 2003. “Intellectual Freedom and Editorial Responsibilities Within the Context of Controversial Research.” Ethics & Behavior 13 (2): 105–125. doi: 10.1207/S15327019EB1302_01
  • PLoS Medicine Editors. 2010. “Increased Responsibility and Transparency in an Era of Increased Visibility.” PLoS Medicine 7 (10): 1–2.
  • Poff, D. 2009. “Reflections on Ethics in Journal Publications.” Journal of Academic Ethics 7 (1): 51–55. doi: 10.1007/s10805-009-9090-3
  • Rappert, B. 2003. “Coding Ethical Behaviour. The Challenges of Biological Weapons.” Science and Engineering Ethics 9 (4): 453–470. doi: 10.1007/s11948-003-0044-7
  • Redman, B. K., and A. L. Caplan. 2005. “Off with Their Heads: The Need to Criminalize Some Forms of Scientific Misconduct.” The Journal of Law, Medicine & Ethics 33 (2): 345–348.
  • Rhodes, R. 2012. The Making of the Atomic Bomb. 25th Anniversary ed. New York: Simon & Schuster.
  • Riis, P. 2001. “Scientific Dishonesty: European Reflections.” Journal of Clinical Pathology 54 (1): 4–6. doi: 10.1136/jcp.54.1.4
  • Rip, A. 2009. “Futures of ELSA: Science Society Series on Convergence Research.” EMBO Reports 10 (7): 666–670. doi: 10.1038/embor.2009.149
  • Ritterbush, Philip C. 1977. “The Public Side of Science.” Change 9 (9): 26–33. doi: 10.1080/00091383.1977.10569232
  • Roco, M. C. 2006. Progress in Governance of Converging Technologies Integrated from the Nanoscale. Vol. 1093. New York: New York Academy of Sciences.
  • Roco, M. C., B. Harthorn, D. Guston, and P. Shapira. 2011. “Innovative and Responsible Governance of Nanotechnology for Societal Development.” Journal of Nanoparticle Research 13 (9): 3557–3590. doi: 10.1007/s11051-011-0454-4
  • Roland, M.-C. 2007. “Who Is Responsible?” EMBO Reports 8 (8): 706–711. doi: 10.1038/sj.embor.7401035
  • Rose, N., and P. Miller. 1992. “Political Power Beyond the State. Problematics of Government.” The British Journal of Sociology 43 (2): 173–205. doi: 10.2307/591464
  • Rotblat, J. 1999. “A Hippocratic Oath for Scientists.” Science 286 (5444): 1475. doi: 10.1126/science.286.5444.1475
  • Rothman, K. J., and C. Poole. 1985. “Science and Policy Making.” American Journal of Public Health 75 (4): 340–341. doi: 10.2105/AJPH.75.4.340
  • Sandler, Ronald. 2007. “Nanotechnology and Social Context.” Bulletin of Science, Technology & Society 27 (6): 446–454. doi: 10.1177/0270467607308288
  • Sassower, R. 1996. “Responsible Technoscience: The Haunting Reality of Auschwitz and Hiroshima.” Science and Engineering Ethics 2 (3): 277–290. doi: 10.1007/BF02583914
  • Schmaus, W. 1983. “Fraud and the Norms of Science.” Science, Technology & Human Values 8 (4): 12–22.
  • Schrag, B., L. Love-Gregory, K. M. T. Muskavitch, and J. McCafferty. 2003. “Forbidden Knowledge: A Case Study with Commentaries Exploring Ethical Issues and Genetic Research.” Science and Engineering Ethics 9 (3): 409–416. doi: 10.1007/s11948-003-0037-6
  • Schuurbiers, D., P. Osseweijer, and J. Kinderlerer. 2009. “Implementing the Netherlands Code of Conduct for Scientific Practice – A Case Study.” Science and Engineering Ethics 15 (2): 213–231. doi: 10.1007/s11948-009-9114-9
  • Schuurbiers, D., S. Sleenhoff, J. F. Jacobs, and P. Osseweijer. 2009. “Multidisciplinary Engagement with Nanoethics Through Education – The Nanobio-RAISE Advanced Courses as a Case Study and Model.” NanoEthics 3 (3): 197–211. doi: 10.1007/s11569-009-0073-z
  • Scott, P. 2003. “The Ethical Implications of the New Research Paradigm.” Science and Engineering Ethics 9 (1): 73–84. doi: 10.1007/s11948-003-0021-1
  • Seiler, S. N., B. J. Brummel, K. L. Anderson, K. J. Kim, S. Wee, C. K. Gunsalus, and M. C. Loui. 2011. “Outcomes Assessment of Role-Play Scenarios for Teaching Responsible Conduct of Research.” Accountability in Research 18 (4): 217–246.
  • Shamir, R. 2008. “The Age of Responsibilization. On Market-Embedded Morality.” Economy and Society 37 (1): 1–19. doi: 10.1080/03085140701760833
  • Shapin, S. 2008. The Scientific Life. A Moral History of a Late Modern Vocation. Chicago: University of Chicago Press.
  • Slesser, A. A. P., and Y. A. Qureshi. 2009. “The Implications of Fraud in Medical and Scientific Research.” World Journal of Surgery 33 (11): 2355–2359. doi: 10.1007/s00268-009-0201-5
  • Smith, M. L. 1992. “On Being an Authentic Scientist.” Ethics and Human Research 14 (2): 1–4. doi: 10.2307/3564534
  • Smith, R. 1998. “Beyond Conflict of Interest.” British Medical Journal 317 (7154): 291–292. doi: 10.1136/bmj.317.7154.291
  • Steckler, Bernard M. 1973. “The World of Science and the World of People.” Journal of Chemical Education 50 (1): 46–49. doi: 10.1021/ed050p46
  • Stieb, J. A. 2008. “A Critique of Positive Responsibility in Computing.” Science and Engineering Ethics 14 (2): 219–233. doi: 10.1007/s11948-008-9067-4
  • Stilgoe, J., R. Owen, and P. Macnaghten. 2013. “Developing a Framework for Responsible Innovation.” Research Policy 42 (9): 1568–1580. doi: 10.1016/j.respol.2013.05.008
  • Strydom, Piet. 1999. “The Challenge of Responsibility for Sociology.” Current Sociology 47 (3): 65–82. doi: 10.1177/0011392199047003006
  • Studer, Kenneth E., and Daryl E. Chubin. 1977. “Ethics and the Unintended Consequences of Social Research: A Perspective from the Sociology of Science.” Policy Sciences 8 (2): 111–124. doi: 10.1007/BF01712288
  • Sutcliffe, H. 2011. “A Report on Responsible Research & Innovation.” Accessed January 17, 2013. http://www.matterforall.org/pdf/RRI-Report2.pdf
  • Sweeney, A. E. 2006. “Social and Ethical Dimensions of Nanoscale Science and Engineering Research.” Science and Engineering Ethics 12 (3): 435–464. doi: 10.1007/s11948-006-0044-5
  • Taylor, P. 2009. “Scientific Self-Regulation—So Good, How Can It Fail? Commentary on ‘The Problems with Forbidding Science’.” Science and Engineering Ethics 15 (3): 395–406. doi: 10.1007/s11948-009-9123-8
  • Thorpe, Charles. 2004. “Violence and the Scientific Vocation.” Theory, Culture & Society 21 (3): 59–84. doi: 10.1177/0263276404043620
  • Underwood, M. C. 2009. “Joseph Rotblat and the Moral Responsibilities of the Scientist.” Science and Engineering Ethics 15 (2): 129–134. doi: 10.1007/s11948-009-9117-6
  • Van der Burg, S., and A. van Gorp. 2005. “Understanding Moral Responsibility in the Design of Trailers.” Science and Engineering Ethics 11 (2): 235–256. doi: 10.1007/s11948-005-0044-x
  • Vessuri, H. 2002. “Ethical Challenges for the Social Sciences on the Threshold of the 21st Century.” Current Sociology 50 (1): 135–150. doi: 10.1177/0011392102050001010
  • Vogt, Tom, Davis Baird, and Chris Robinson. 2007. “Opportunities in the ‘Post-Academic’ World.” Nature Nanotechnology 2 (6): 329–332. doi: 10.1038/nnano.2007.164
  • Vollmer, W. M. 2007. “Responsibilities of Authorship.” Chest 132 (6): 2042–2045. doi: 10.1378/chest.07-2051
  • Von Hippel, Frank. 1978. “Professional Freedom and Responsibility: The Role of the Professional Society.” Science, Technology & Human Values 3 (1): 37–42. doi: 10.1177/016224397800300107
  • Von Schomberg, R. 2011. Towards Responsible Research and Innovation in the Information and Communication Technologies and Security Technologies Fields. European Commission Services, Directorate General for Research and Innovation. Luxembourg: Publication Office of the European Union.
  • Waelbers, K. 2009. “Technological Delegation: Responsibility for the Unintended.” Science and Engineering Ethics 15 (1): 51–68. doi: 10.1007/s11948-008-9098-x
  • Wager, E. 2011. “Coping with Scientific Misconduct.” British Medical Journal 343 (7831): 992–993.
  • Watson, Bernard C. 1974. “The Social Responsibility of the Social Scientist.” Paper presented at the American Educational Research Association Meeting, Chicago, April 15–19.
  • Weed, D. L., and R. E. Mckeown. 2003. “Science and Social Responsibility in Public Health.” Environmental Health Perspectives 111 (14): 1804–1808. doi: 10.1289/ehp.6198
  • Werner-Felmayer, G. 2010. “Rethinking the Meaning of Being a Scientist – The Role of Scientific Integrity Boards and Some Thoughts about Scientific Culture.” Medicine and Law 29 (3): 329–339.
  • Whitbeck, C. 1995. “Truth and Trustworthiness in Research.” Science and Engineering Ethics 1 (4): 403–416. doi: 10.1007/BF02583258
  • Wing, S. 2003. “Objectivity and Ethics in Environmental Health Science.” Environmental Health Perspectives 111 (14): 1809–1818. doi: 10.1289/ehp.6200
  • Wolpe, P. R. 2006. “Reasons Scientists Avoid Thinking about Ethics.” Cell 125 (6): 1023–1025. doi: 10.1016/j.cell.2006.06.001
  • Wolpert, L. 1989. “The Social Responsibility of Scientists: Moonshine and Morals.” British Medical Journal 298 (6678): 941–943. doi: 10.1136/bmj.298.6678.941
  • Wolpert, L. 2005. “The Medawar Lecture 1998: Is Science Dangerous?” Philosophical Transactions of the Royal Society B: Biological Sciences 360 (1458): 1253–1258. doi: 10.1098/rstb.2005.1659
  • Woollard, Robert F. 2006. “Caring for a Common Future: Medical Schools’ Social Accountability.” Medical Education 40 (4): 301–313. doi: 10.1111/j.1365-2929.2006.02416.x
  • Ziman, J. 1998. “Why Must Scientists Become More Ethically Sensitive than They Used to Be?” Science 282 (5395): 1813–1814. doi: 10.1126/science.282.5395.1813
  • Ziman, J. 2001. “Getting Scientists to Think about What They Are Doing.” Science and Engineering Ethics 7 (2): 165–176. doi: 10.1007/s11948-001-0038-2
