Accountability in Research
Ethics, Integrity and Policy
Volume 17, 2010 - Issue 2
Original Articles

A Theoretical Comparison of the Models of Prevention of Research Misconduct

Pages 51-66 | Published online: 19 Mar 2010

Abstract

The current methods of dealing with research misconduct involve detection and rectification after the incident has occurred. This way of monitoring scientific integrity exerts considerable negative effects on those concerned and wastes time and resources. The time has arrived for research administrators to focus seriously on the prevention of misconduct. In this article, preventive models suggested earlier by Weed and Reason are combined to arrive at six models of prevention. This is an effort to streamline thinking about misconduct prevention so that the advantages and disadvantages of each model can be weighed and the method most appropriate for an institution chosen.

INTRODUCTION

Eminent scientists have said that science is self-correcting and that occasional fraud does not affect the overall sanctity of the scientific literature. Self-correction does not mean that fraudulent research is eliminated from the system on its own. On the contrary, it involves a laborious process of detection and investigation of research misconduct. Friedman says that misconduct investigation is a “very treacherous business” that leaves a lot of unpleasantness in its trail (Taubes, 1993). It also happens after the damage has already been done, so the benefit of prevention is lost. Fraudulent articles may never be deleted and persist as pollutants of the scientific literature (Kochan and Budd, 1992; Pfeifer and Snodgrass, 1990; Sox and Rennie, 2006). Articles that are declared fit for retraction continue to be cited by other researchers (Neale et al., 2007). Conscientious scientists may be misled by the false representations of a few dishonest ones. Society has generally held scientists in high regard (LaFollette, 2000). As the research enterprise depends on taxpayers' money, it is important to retain their confidence. Increasing episodes of misconduct decrease the mutual trust between researchers (Fuchs and Westervelt, 1996). High-profile misconduct cases attract the media, which in turn attracts the attention of politicians. Political interference is often followed by legislation leading to increased external oversight and a decrease in the autonomy that is treasured by all scientists (LaFollette, 2000). Sometimes the research agenda itself may suffer due to the publicity generated by research misconduct (Check, 2005; Cyranoski, 2006).

Those dealing with research misconduct have not yet fully appreciated the superiority of prevention over cure. Science policing should not be limited to retrospective dissection of reported episodes of misconduct. A proactive method aimed at prevention should also become part of the process. Reason says that draining the swamps is more effective than swatting mosquitoes (Reason, 2000). Prevention of misconduct would also decrease the need for whistle blowing, with all its undesirable effects on honest researchers. Several models of prevention have been suggested over the last few years, but a comparative analysis of their advantages and disadvantages has not been done. The aim of this article is to provide readers a summary of the various models of prevention that have been proposed so far and to analyze the literature in support of each model. No definite recommendation can be made at present as to the superiority of any one particular model. It is up to the scientific community to do so following empirical research that compares the various models.

Preventive methods may be classified in two ways. According to the “stage of research” at which they are employed, prevention can be primary, secondary, or tertiary. Weed proposed this type of classification more than a decade ago (Weed, 1998). The definitions used in this article differ slightly from those proposed by Weed. Primary prevention refers to addressing the causative factors of misconduct so that misconduct does not occur at all. Secondary prevention refers to corrective action taken after misconduct has occurred but before the research is published or implemented in practice. Tertiary prevention refers to damage control instituted after flawed research has been published. According to the “intended targets,” prevention can be of the “person approach” type (aimed at the researchers) or the “systems approach” type (aimed at the environments in which they work) (Reason, 2000). Amalgamation of the Weed and Reason classifications leads to six different models of prevention, which are discussed below.

TERTIARY PREVENTION MODELS

“Person Approach” to Tertiary Prevention

At present, research misconduct investigations follow the “person approach” type of tertiary prevention, wherein an individual is held responsible for a lack of care or ethics. The “person approach” is actually a “person versus person” approach, which works on the basis of one or more persons (the accusers) pointing their fingers at others (the accused) on the issue of research misconduct. The institution tries to distance itself from the actions of individuals to escape blame and avoid censure. It also (perhaps mistakenly) assumes that removal of the person removes the errors from the system. Punishments such as barring funding of future research, nonacceptance of articles by journals, and occasionally the imposition of fines have been suggested as deterrents of misconduct, both for the offenders and for others in general (Kassirer, 1993). Some authors have called for criminalization of scientific misconduct and harsh penalization of the concerned researchers to increase the deterrence value of the sanctions; these include heavier fines, imprisonment, and revocation of research licenses (Sovacool, 2005). Legal experts have argued that criminal prosecution would help deter the perpetration of research fraud in the same way it has helped decrease crimes related to sexual harassment and hate crimes (Goldberg, 2003; Kuzma, 1992).

As of now, there is not much evidence that the system of shaming and banishing researchers has reduced the incidence of misconduct (despite the presence of the Office of Research Integrity for over 16 years now, and of its predecessors for more than a decade before its inception). Even the reported incidence has been considered only the tip of the iceberg (Wells, 2008). It is also important to consider the fate of researchers who are wrongly accused of research misconduct. Researchers who have been investigated and eventually exonerated of misconduct suffer significant negative physical and mental effects (RTI, 1996). With respect to the “accusing persons” (whistleblowers), it is well known that they are subjected to retribution and that their own interests may suffer in the process of exposing misconduct (Lennane, 1993; Lenzer, 2004; Martin, 1999; Lubalin and Matheson, 1999). Fear of retribution by the accused and the shame of being identified as a traitor may decrease a whistleblower's motivation to report misconduct (Bolsin, 2003). Rules and regulations exist in theory but do not guarantee protection to whistleblowers (Ruhlen, 2006). Accounts of the difficulties faced by individual whistleblowers are sufficient to deter even highly motivated potential whistleblowers.

“Systems Approach” to Tertiary Prevention

The systems approach to tertiary prevention is based on the conviction that misconduct investigation should not stop with the researchers alone. People with oversight authority should also be taken into account. External oversight of misconduct investigations has been considered essential by some. Penalization of oversight authorities in research departments is also said to prompt them to maintain stricter vigil over research conduct and to deter them from colluding with erring researchers to cover up misconduct (Rhodes and Strain, 2004). Fear of loss of prestige and of financial sponsors may prompt administrators to cover up misconduct or perform an unfair investigation. Experience in clinical misconduct investigation has prompted calls for external intervention to increase the objectivity of the process (Breen, 2003; Faunce and Bolsin, 2004; Wilmhurst, 2004). Federal regulation in the form of the Sarbanes–Oxley Act (SOX) emphasizes the importance given to the institutional leadership of business corporations (Sarbanes–Oxley Act, 2002). The act was passed following a series of business scandals, such as the “Enron” fiasco, that surprised corporate America. The Office of Research Integrity (ORI) demands institutional investigation of research misconduct, but there is no provision for making the top brass accountable for the failings of their subordinates. There is, perhaps, a place for such a provision in research institutions as well.

SECONDARY PREVENTION MODELS

“Person Approach” to Secondary Prevention

The method of exposing misconduct is the same as in tertiary prevention (whistle blowing). The type of response may differ because the research is not yet published and the scientific community outside the university may not be aware of the misconduct. The way misconduct in unpublished research is dealt with depends on the attitudes of the institutional leaders. At present, institutional investigation follows the principle of retributive justice, which some authors consider unfair to researchers (Mello and Brennan, 2003; Spece, Jr. and Marchalonis, 2001). If proved guilty, the researcher is barred from performing funded research for a defined period of time. Since the research is not yet published, there may be no external pressures on the research oversight authorities, and the quality of the response to misconduct allegations depends on the perceptions and values of those authorities. The attitude of the institutional leader is said to be the most important factor governing the culture of scientific integrity that prevails in universities (Gunsalus, 1993). In the absence of “conscience-keeping” leadership, there is a tendency for faculty to disown their own failures. Like the person and systems approaches to tertiary prevention, the person approach to secondary prevention based on retributive justice has not led to a decrease in research misconduct. Eighty-nine new allegations of misconduct were reported to the ORI through the annual reports of universities in 1994; this increased to 136 in 2003 and 266 in 2006 (ORI, 2009).

Should administrators adopt an attitude of “restorative justice” instead of “retributive justice” to minimize the blame game? Gunsalus felt that administrators' attitude towards research misconduct should be educative and reformative rather than punitive (Gunsalus, 1998). In the criminal justice system, ideas of restorative justice have evolved due to the failure of the retributive justice system to prevent the rising incidence of crime (Johnstone, 2002). Restorative justice aims to allow wrongdoers to adopt the path of “shame acknowledgement” rather than “shame displacement.” It focuses less on punishment of the guilty and more on repairing the damage done by the misconduct and ensuring its avoidance in the future. Early evidence seems to suggest that restorative justice is effective to a certain extent (Latimer et al., 2005; Sherman and Strang, 2007). It is probably worthy of experimentation in noncriminal offence situations such as research misconduct. Before the research is published, the concerned researcher can be given a chance to rectify wrongdoing and be put through a Responsible Conduct of Research (RCR) education program to ensure that any ignorance of ethics is addressed (Shamoo and Resnik, 2003).

Another person approach to secondary prevention involves “collective openness.” Anderson has suggested that a “collective openness” philosophy in laboratories may create an environment in which anyone, from the most senior to the most junior, can voice concerns about the ethics of any research conducted in the laboratory. Free and open discussion of concerns between members of the laboratory is permitted. In this way, each member of the organization becomes responsible for the ethical conduct of research, and the tension, secrecy, and martyrdom involved in channelized whistle blowing are avoided (Anderson, 2007).

“Systems Approach” to Secondary Prevention

Routine audit of all research, similar to the auditing done for accounting purposes, is one of the suggested options (Shamoo and Dunigan, 2000; Shamoo, 1988; Loeb and Shamoo, 1989). These auditors should be independent of academic institutions and government agencies. The goal of auditing would be best served if it were done before the research is published. It is felt that routine research audit is not cost effective; it could be cost effective if the incidence of fraud were significant (Glick, 1989). As evidence suggests a significantly high incidence of fraud in biomedical research, there is probably a case for running some trials of audit in this field (Broad and Wade, 1982; Goodstein, 2002; Abbott and Graf, 2003). Secondly, misconduct in biomedical research is significantly underreported, and auditing may increase the detection rate (Wells, 2008). Other reasons for routine auditing of biomedical research are the size of the public funding that goes into it and the possible impact of such research on public health. Shapiro and Charrow reported that the percentage of serious misconduct in the work of around 2,000 researchers decreased from 12% to 7% between 1985 and 1988 following the introduction of routine data auditing of investigational drug programs (Shapiro and Charrow, 1989). Even when the reported incidence of fraud is low, auditing is likely to be useful in minimizing errors in nonfraudulent research, including unintentional misrepresentation or inaccurate reporting of data, statistical errors, and noncompliance with institutional guidelines concerning human research subjects or hazardous materials. It could be questioned whether the cost of routine auditing is justifiable. Auditing experts feel that 25% of the research output of a university could be audited per year. Auditing of samples alone (not full data) may take around 30 hours and $6,500 for each project audited (Smith, 2009). Though the “cost factor” has been cited as an argument against routine audit, that argument may be an exaggeration not yet backed by empirical research. The cost of auditing is not supposed to exceed 1% of the study cost (Shamoo, 1989; Glick and Shamoo, 1991). It is felt that the resistance to audit from the scientific community is not really due to the cost factor; researchers' dislike of external oversight and fear of losing autonomy are more plausible reasons for the unpopularity of audit (Resnik et al., 2006).

Random auditing is a cost-effective possibility if routinely auditing all research projects is indeed cost ineffective. The fear of possible detection may act as a deterrent. Regarding the cost effectiveness of auditing, it is interesting to compare the cost of auditing with the cost of misconduct investigation by the ORI. At present, such a comparison is not possible, as the average cost per episode of misconduct investigation by the ORI has not been calculated (Dahlberg, 2009). However, individual universities can judge the cost per episode of institutional investigation of misconduct and may be in a better position to undertake comparative studies. If such studies can prove that the cost of investigation exceeds that of routine audit, then the latter option is justified. Even if the cost of auditing merely equals that of investigation, auditing is the better option, as it offers the benefit of prevention.

PRIMARY PREVENTION MODELS

“Person Approach” to Primary Prevention

Conventional wisdom suggests that primary prevention is better than secondary or tertiary prevention. Identifying researchers at high risk of misconduct and counseling them on ethics has been considered an ideal primary prevention method (Weed, 1998). Factors thought to interfere with RCR include publication pressure, competition, careerism, overconfidence of researchers in their own hypotheses, large-scale research with reduced chances for effective mentoring, mentors setting bad examples, financial gain, ego, and psychiatric illness (Mishkin, 1988; Petersdorf, 1986; Goodstein, 2002; Fleet et al., 2006; Anderson, Ronning, et al., 2007). However, at present no methods exist for identifying those at risk. The available empirical literature supports some of these factors, but more studies are necessary. Rose advocated a population strategy instead of a high-risk approach to prevention whenever the risk is widely dispersed in the population (Rose, 1985). He felt that minor changes in the right direction by most of the population are more effective than major changes in a few individuals. Competition, publication pressure, and the drive for fame and money are similar for all researchers, and it is difficult to identify “high-risk” groups for prevention. Population-based measures are therefore more sensible than individual-based models. The Rose model has been advocated for the prevention of scientific misconduct (Nylenna and Simonsen, 2006).

One way of implementing the Rose model type of “person approach” to primary prevention is ethics education for all researchers. Sustained efforts at educating all researchers about misconduct issues are thought to be essential by several authors (Taubes, 1993; Rhodes, 2002; Eisen and Berry, 2002; Bruhn et al., 2002). In this way, researchers learn from the mistakes of others and avoid repeating them. Education in research ethics should be introduced as soon as a researcher starts his or her career, and reinforcement of ethical aspects should continue at regular intervals throughout the career (McGee et al., 2008). Heitman et al. reported that fresh graduates entering biomedical research programs had insufficient knowledge of RCR, and that even those who had been instructed in RCR earlier displayed lacunae in their knowledge (Heitman et al., 2007). The ORI has issued guidelines for education in RCR.

Empirical research, however, has not shown much benefit from mandatory RCR education. Weed expressed skepticism about the effectiveness of ethics education more than a decade ago, and current research seems to support his views (Weed, 1998). Surveys among postdoctoral trainees have shown no significant effect of education in improving trainees' attitudes towards RCR (Elliott and Stern, 1996; Funk et al., 2007). The effectiveness of RCR education depends on the previous experiences, beliefs, and knowledge of the trainees; trainees with prior experience and knowledge may reject new ideas about research ethics (McGee et al., 2008). Any misconduct is said to have two elements: propensity and opportunity (Adams and Pimple, 2005). Propensity represents the individual psychological mechanisms that influence self-control and rational behavior. Opportunity includes the external factors that motivate and facilitate the manifestation of propensities. Education in ethics helps to modify propensity, but supporters of opportunity theories believe that, given the sheer number of researchers with differing mental attitudes, the educational pathway is not an economical use of time and energy.

Resnik says that research misconduct occurs because a few researchers have a mindset that allows them to compromise ethics (“bad apples”), because environmental pressures such as publication pressure force even otherwise ethical researchers to commit misconduct (“imperfect environments”), or because of a combination of the two. It is felt that bad apples do not respond well to education, but those who may commit misconduct due to environmental pressures are amenable to it (Resnik, 2007). Educational programs in RCR begin with the assumption that the various components of ethical decision making (ethical sensitivity, reasoning, motivation, character, and competence) are well developed in scientists by the time they begin their RCR education, and that all that is required is to teach the rules of fair play (Institute of Medicine Report, 2002). Such education may have little impact on those with poorly developed faculties of ethical decision making. There is a lack of consensus about the goals of education, as shown by a survey of instructors of RCR courses for trainees funded by NIH grants (Kalichman and Plemmons, 2007). Schmaling and Blume reported that instructional courses in RCR improved the knowledge of trainees but did not improve their moral reasoning capacity (Schmaling and Blume, 2009).

Mentoring is considered an important means of ethics education for trainees. Some authors have felt that mentors should display high ethical standards in their own conduct of research and set examples for trainees to follow. The behavior of mentors provides nonverbal cues that act as informal education of trainees (Pellegrino, 1992; Wocial, 1995; Wright et al., 2008). More empirical research is required to assess whether mentoring improves the ethical behavior of mentees. In the survey by Anderson, Horn, et al., mentoring in research ethics decreased the odds of researchers indulging in misconduct, but mentoring in financial issues and survival skills increased the odds (Anderson, Horn, et al., 2007).

“Systems Approach” to Primary Prevention

The “systems approach” type of primary prevention works on the premise that “to err is human” and errors are only to be expected. The important issue is not who committed the error but how the safeguards failed. The emphasis is on changing the conditions under which humans work rather than changing humans themselves. In the Swiss cheese model, adverse events are said to arise from a combination of two factors: active failures and latent conditions (Reason, 2000). Active failures are committed by the people directly handling a project, whereas latent conditions are the “resident pathogens” in the system, often under the control of people not directly involved in the project. Latent conditions are factors that increase the likelihood of errors. Defects in latent conditions increase the “holes” in the defense against error, and the combination of active failures and provocative latent conditions provides an opportunity for error to manifest. It is difficult to foresee active failures in all their different forms, but it is easier to identify latent conditions and rectify them. The Kohlbergian school of thought holds that ethical standards develop continually through a dynamic interaction between the individual and the environment (Kohlberg, 1973). The environment is created by policy makers, oversight authorities, political figures, and the distributors and consumers of research. In that case, the environment should take equal blame for the ethical lapses of individuals.

The systems approach has caught the attention of authorities overseeing research probity, who have already considered shifting their focus from misbehaving individuals to the environmental conditions in which such individuals function (Institute of Medicine Report, 2002). In criminology, arguments have been advanced to change the emphasis from individualistic models of criminal behavior to the situational factors that either promote or restrict it (opportunity theories). It is perhaps better and easier to institute measures that manipulate the environment to minimize the opportunities to engage in criminal activity. Competition, professional pressure to excel, and financial ambition can be perceived as “needs” that dictate a course of action (propensity) leading to misconduct. However, criminologists have observed that even when needs are absent and propensity is not fuelled by them, misconduct can still occur when the opportunity for it exists. Modifying the opportunities for misconduct at a particular place is deemed easier than decreasing propensity (Adams and Pimple, 2005).

The “Final Rule” of the Public Health Service policies on research misconduct states that universities must foster an environment that promotes RCR (Final Rule, 2005). One way of creating an environment less conducive to research misconduct is to be sensitive to the needs of researchers and their sense of organizational justice. In the survey by Martinson et al., self-reported scientific misconduct by scientists correlated positively with their perceptions of distributive and procedural injustice (Martinson et al., 2006). Another survey, by Keith-Spiegel et al., showed that researchers expect fairness and respect from Institutional Review Boards (IRBs) (Keith-Spiegel et al., 2006). Institutions are said to be “social actors responsible for the ethical or unethical behaviors of their employees” (Victor and Cullen, 1988). Scientists are sensitive to their identity or standing within a group, and when they perceive an unfair threat to their identity, they may respond with unfair means to guard it (Adams, 1965; Clay-Warner, 2001; Tyler and Blader, 2000). Since scientists with less well-established reputations have more fragile identities than their better-established counterparts and are more vulnerable to organizational injustice, higher chances of misconduct can be expected of them. The Gallup report to the ORI stated that research misconduct was highest among postdoctoral fellows and junior-level researchers (Wells, 2008). Deviant behavior is considered one of the defense mechanisms used by the affected individual to reduce the impact of an identity crisis. The other logical coping mechanism would be to distance oneself from the strain-producing situation, that is, to leave the research field and choose an alternative profession and identity. However, the investment of substantial amounts of time, money, and effort in their careers strengthens researchers' sense of identity and makes it difficult for them to assume alternative identities (Hackett, 1990). Scientists with high intrinsic drive are “overcommitted” to their field and find it difficult to retrace their steps (Siegrist, 1996, 2001). These factors increase the distress associated with threats to their identity and abet misconduct for survival. Oversight authorities in a research organization should be sensitive to the need for social identity among researchers (especially young and mid-career researchers) and adopt the principles of organizational justice in the distribution of grants and rewards. Reward systems that encourage self-interest maximization are associated with a higher incidence of misconduct (Kurland, 1996; Treviño et al., 1996).

Oversight authorities control many aspects of research that are beyond the control of the individual researcher, yet the individual alone is held responsible for any malpractice. For administrative reasons, as well as to avoid collective guilt, the community tends to dismiss misconduct issues as the result of the errors of a few individuals (Kreutzberg, 2004). The misconduct may be a response to the institution's priority of increasing research output or attracting increased funding by producing attractive research proposals (Institute of Medicine Report, 2002). Victor and Cullen introduced the concept of “ethical work climate” to denote the “prevailing perceptions of typical organizational practices and procedures that have ethical content.” In management studies, ethical work climate has been found to correlate significantly with job satisfaction, commitment to the organization, and ethical behavior (Babin et al., 2000; Schwepker, 2001; Bartels et al., 1998). A poor ethical climate at the workplace has been said to contribute to scientific misconduct in research institutions (Franzen et al., 2007; Pryor et al., 2007). If institutions bring increased pressure on researchers to raise productivity, they should share the blame for misconduct equally with the researchers; if they are unwilling to share the blame, they should reduce unrealistic pressures on researchers. That would be a good systems approach to primary prevention. Organizational culture has been shown to influence the moral decision making of individuals either positively or negatively, and the role of institutional leadership with a high level of ethical sensitivity is considered very important in this regard (Gibson et al., 2003; Verschoor, 2004). Under the Sarbanes–Oxley Act, if a company faces charges of unethical practice, the topmost level of management (the board of directors and chief executive) is responsible for providing evidence that it has instituted ethics programs compliant with SOX regulations and is directly monitoring such programs (Sarbanes–Oxley Act, 2002; Galla et al., 2007). Should such a regulation be extended to research corporations as well?

CONCLUSION

The person approach types of tertiary and secondary prevention have had insufficient impact in decreasing research misconduct and are unlikely to do better in the future. Until acceptable primary prevention models are available, the improvements suggested by various authors may improve the existing system marginally. The person approach to primary prevention has problems of its own in the present form of RCR education. Better modalities of education aimed at improving the ethical decision-making faculties of researchers, along with validated measures to assess the outcome of education, might make this approach more effective (Rest et al., 1999; Mumford et al., 2008). System-centered models have better theoretical appeal, but their actual effectiveness will be known only with further research. Models have been suggested time and again but have not been investigated with empirical research. Gary Taubes wrote in 1993 that dealing with scientific misconduct “is a field that is crying out for models” (Taubes, 1993). In 2009, plenty of models exist, but they are crying out for application and research. A criticism of the systems approach would be that it shifts some of the responsibility for research misconduct from researchers to administrators. Such criticism loses its sting when administrators understand that they are as much a part of the research establishment as the researchers themselves, and that it is they, not the researchers, who can change policies.

REFERENCES

  • Abbott , A. and Graf , P. 2003 . Survey reveals mixed feelings over scientific misconduct . Nature , 424 ( 6945 ) : 117
  • Adams , J. 1965 . “ Inequality in social exchange ” . In Advances in Experimental Social Psychology , Edited by: Berkowitz , L. Vol. 2 , 267 – 299 . New York : Academic Press .
  • Adams , D. and Pimple , K.D. 2005 . Research misconduct and crime lessons from criminal science on preventing misconduct and promoting integrity . Accountability in Research , 12 ( 3 ) : 225 – 240 .
  • Anderson , M.S. 2007 . Collective openness and other recommendations for the promotion of research integrity . Science and Engineering Ethics , 13 : 387 – 394 .
  • Anderson , M.S. , Ronning , E.A. , DeVries , R. and Martinson , B.C. 2007 . The perverse effects of competition on scientists' work and relationships . Science and Engineering Ethics , 13 ( 4 ) : 437 – 461 .
  • Anderson , M.S. , Horn , A.S. , Risbey , K.R. , Ronning , E.A. , DeVries , R. and Martinson , B.C. 2007 . What do mentoring and training in the responsible conduct of research have to do with scientists' misbehavior? Findings from a National Survey of NIH-funded scientists . Academic Medicine , 82 ( 9 ) : 853 – 860 .
  • Office of Research Integrity (ORI). Annual Reports http://ori.dhhs.gov/publications/annual_reports.shtml (Accessed: 11 December 2009 ).
  • Babin , B.J. , Boles , J.S. and Robin , D.P. 2000 . Representing the perceived ethical work climate among marketing employees . Journal of Academy of Marketing Science , 28 ( 3 ) : 345 – 358 .
  • Bartels , K.K. , Harrick , E. , Martell , K. and Strickland , D. 1998 . The relationship between ethical climate and ethical problems within human resource management . Journal of Business Ethics , 17 ( 7 ) : 799 – 804 .
  • Breen , K.J. 2003 . Misconduct in medical research: Whose responsibility? . Internal Medicine Journal , 33 ( 4 ) : 186 – 191 .
  • Broad , W.J. and Wade , N. 1982 . Betrayers of Truth: Fraud and Deceit in the Halls of Science , New York : Simon and Schuster .
  • Bruhn , J.G. , Zajac , G. , Al-Kazemi , A.A. and Prescott , L. 2002 . Moral positions and academic conduct: parameters of tolerance for ethics failure . The Journal of Higher Education , 73 ( 4 ) : 461 – 493 .
  • Check , E. 2005 . Where now for stem cell cloners? . Nature , 438 : 1058 – 1059 .
  • Clay-Warner , J. 2001 . Perceiving procedural injustice: The effects of group membership and status . Social Psychology Quarterly , 64 ( 3 ) : 224 – 238 .
  • Cyranoski , D. 2006 . Blow follows blow for stem cells work . Nature , 439 ( 7074 ) : 8
  • Dahlberg , J.E. 2009 . Personal communication, September 28, 2009, Office of The Director , Division of Investigative Oversight, ORI .
  • Eisen , A. and Berry , R.M. 2002 . The absent professor: Why we don't teach research ethics and what to do about it . The American Journal of Bioethics , 2 : 38 – 49 .
  • Elliott , D. and Stern , J.E. 1996 . Evaluating teaching and students' learning of academic research ethics . Science and Engineering Ethics , 2 ( 3 ) : 345 – 366 .
  • Faunce , T. and Bolsin , S.N.C. 2004 . Three Australian whistle blowing sagas: Lessons for internal and external regulation . Medical Journal of Australia , 181 ( 1 ) : 44 – 47 .
  • Final Rule; Public Health Service policies on research misconduct. (2005). Section 93.300 (General Responsibilities for Compliance) http://ori.dhhs.gov/documents/FR_Doc_05-9643.shtml (Accessed: 19 October 2009 ).
  • Fleet , C.M. , Rosser , M. F. N. , Zufall , R. A. , Pratt , M.C. , Feldman , T.S. and Lemons , P.P. 2006 . Hiring criteria in biology departments of academic institutions . Bioscience , 56 : 430 – 436 .
  • Franzen , M. , Rödder , S. and Weingart , P. 2007 . Fraud: Causes and culprits as perceived by science and the media. Institutional changes, rather than individual motivation, encourage misconduct . EMBO Reports , 8 ( 1 ) : 3 – 7 .
  • Fuchs , S. and Westervelt , S.D. 1996 . Fraud and trust in science . Perspectives in Biology and Medicine , 39 : 248 – 269 .
  • Funk , C.L. , Barrett , K.A. and Macrina , F.L. 2007 . Authorship and publication practices: Evaluation of the effect of responsible conduct of research instruction to postdoctoral trainees . Accountability in Research , 14 : 269 – 305 .
  • Galla , D. , Cavico , F. and Mujtaba , B. 2007 . Compliance with Sarbanes–Oxley requires formal ethics training: Are you doing it? . Journal of Business and Economic Research , 5 ( 9 ) : 15 – 18 .
  • Gibson , J. , Ivancevich , J. , Donnelly , J. and Konopaske , R. 2003 . Organizations: Behavior, Structure, Processes , 11th , New York : McGraw Hill Companies .
  • Glick , J.L. 1989 . On the potential cost effectiveness of scientific audits . Accountability in Research , 1 ( 1 ) : 77 – 83 .
  • Glick , J. and Shamoo , A. 1991 . Auditing biomedical research data: A case study . Accountability in Research , 1 : 223 – 243 .
  • Goldberg , D. 2003 . Research fraud: A sui generis problem demands a sui generis solution (plus a little due process) . Thomas M Cooley Law Review , 20 ( 47 ) : 50
  • Goodstein , D. 2002 . Scientific misconduct . Academe , 88 ( 1 )
  • Gunsalus , C.K. 1993 . Institutional structure to ensure research integrity . Academic Medicine , 68 ( Supplement ) : S33 – S38 .
  • Gunsalus , C.K. 1998 . Preventing the need for whistle blowing: Practical advice for university administrators . Science and Engineering Ethics , 4 ( 1 ) : 75 – 94 .
  • Hackett , E. 1990 . Science as a vocation in the 1990s . Journal of Higher Education , 61 ( 3 ) : 241 – 279 .
  • Heitman , E. , Olsen , C.H. , Anaestidou , L. and Bulger , R.E. 2007 . New graduate students' baseline knowledge of the responsible conduct of research . Academic Medicine , 82 ( 9 ) : 838 – 845 .
  • Institute of Medicine and National Research Council Committee on Assessing Integrity in Research Environment. (2002). Integrity in Scientific Research: Creating an Environment that Promotes Responsible Conduct. Washington, D.C.: The National Academy Press http://www.nap.edu/catalog.php?record_id=10430 (Accessed: 19 October 2009 ).
  • Johnstone , G. 2002 . Restorative Justice: Ideas, Values, Debates , Devon , , U.K : Willan Publishing .
  • Kalichman , M.W. and Plemmons , D.K. 2007 . Reported goals for Responsible Conduct of Research courses . Academic Medicine , 82 ( 9 ) : 846 – 852 .
  • Kassirer , J.P. 1993 . The frustrations of scientific misconduct . New England Journal of Medicine , 328 : 1634 – 1636 .
  • Keith-Spiegel , P. , Koocher , G.P. and Tabachnik , B. 2006 . What scientists want from their research ethics committees . Journal of Empirical Research on Human Research Ethics , 1 ( 1 ) : 67 – 82 .
  • Kochan , C.A. and Budd , J.M. 1992 . The persistence of fraud in literature: The Darsee case . Journal of American Society of Information Science , 43 : 488 – 493 .
  • Kohlberg , L. 1973 . The contribution of developmental psychology to education: Examples from moral education . Educational Psychologist , 10 ( 1 ) : 2 – 14 .
  • Kreutzberg , G.W. 2004 . The rules of good science . EMBO Reports , 5 : 330 – 332 .
  • Kurland , N. 1996 . Trust, accountability and sales agents' dueling loyalties . Business Ethics Quarterly , 6 : 281 – 310 .
  • Kuzma , S.M. 1992 . Criminal liability for misconduct in scientific research . University of Michigan Journal of Law , 25 : 357 – 401 .
  • LaFollette , M.C. 2000 . The evolution of the “scientific misconduct” issue: An historical overview . Proceedings of the Society for Experimental Biology and Medicine , 224 : 211 – 215 .
  • Latimer , J. , Dowden , C. and Muise , D. 2005 . The effectiveness of restorative justice practices: A meta-analysis . The Prison Journal , 85 ( 2 ) : 127 – 144 .
  • Lennane , K.J. 1993 . “Whistle blowing”: A health issue . British Medical Journal , 307 : 667 – 670 .
  • Lenzer , J. 2004 . Public interest group accuses FDA of trying to discredit whistle blower . British Medical Journal , 329 : 1255
  • Loeb , S.E. and Shamoo , A.E. 1989 . Data audit: Its place in auditing . Accountability in Research , 1 : 23 – 32 .
  • Lubalin , J.S. and Matheson , J.L. 1999 . The fallout: What happens to whistleblowers and those accused but exonerated of scientific misconduct? . Science and Engineering Ethics , 5 ( 2 ) : 229 – 250 .
  • Martin , B. 1999 . “ Suppression of dissent in science ” . In Research in Social Problems and Public Policy , Edited by: Freudenberg , W.R. and Youn , T.I.K. Vol. 7 , 105 – 135 . Stamford , CT : JAI Press .
  • Martinson , B.C. , Anderson , M.S. , Crain , A.L. and DeVries , R. 2006 . Scientist's perception of organizational justice and self-reported misbehaviors . Journal of Empirical Research on Human Research Ethics , 1 ( 1 ) : 51 – 66 .
  • McGee , R. , Almquist , J. , Keller , J.L. and Jacobsen , S.J. 2008 . Teaching and learning responsible conduct of research: Influences of prior experiences on acceptance of new ideas . Accountability in Research , 15 ( 1 ) : 30 – 62 .
  • Mello , M.M. and Brennan , T.A. 2003 . Due process in investigations of research misconduct . New England Journal of Medicine , 349 : 1280 – 1286 .
  • Mishkin , L. 1988 . Fraud is a symptom of a deeper flaw . Scientist , 9 : 12
  • Mumford , M.D. , Connelly , S. , Brown , R.P. , Murphy , S.T. , Hill , J.H. , Antes , A.L. , Waples , E.P. and Davenport , L.D. 2008 . A sense making approach to ethics training for scientists: Preliminary evidence of training effectiveness . Ethics and Behavior , 18 ( 4 ) : 315 – 339 .
  • Neale , A.V. , Northrup , J. , Dailey , R. , Marks , E. and Abrams , J. 2007 . Correction and use of biomedical literature affected by scientific misconduct . Science and Engineering Ethics , 13 : 5 – 24 .
  • Nylenna , M. and Simonsen , S. 2006 . Scientific misconduct: a new approach to prevention . The Lancet , 367 ( 9526 ) : 1882 – 1884 .
  • Pellegrino , E.D. 1992 . Character and the ethical conduct of research . Accountability in Research , 2 : 1 – 11 .
  • Petersdorf , R.G. 1986 . The pathogenesis of fraud in science . Annals of Internal Medicine , 104 : 252 – 254 .
  • Pfeifer , M.P. and Snodgrass , G.L. 1990 . The continued use of retracted, invalid scientific literature . JAMA , 263 : 1420 – 1423 .
  • Pryor , E.R. , Habermann , B. and Broome , M.E. 2007 . Science misconduct from the perspective of research coordinators: A national survey . Journal of Medical Ethics , 33 : 365 – 369 .
  • Reason , J. 2000 . Human error: Models and management . British Medical Journal , 320 : 768 – 770 .
  • Research Triangle Institute (RTI) . 1996 . Survey of accused but exonerated individuals in research misconduct cases: Final report , Washington , D.C.
  • Resnik, D.B. (2007). What is ethics in research and why is it important? NIEHS bioethics http://www.niehs.nih.gov/research/resources/bioethics/whatis.cfm (Accessed: 19 October 2009 ).
  • Resnik , D.B. , Shamoo , A. and Krimsky , S. 2006 . Fraudulent human embryonic stem cell research in South Korea: Lessons learned . Accountability in Research , 13 ( 1 ) : 101 – 109 .
  • Rest , J. , Narvaez , D.F. , Thoma , S.J. and Bebeau , M.J. 1999 . DIT 2: Devising and testing a revised instrument of moral judgment . Journal of Educational Psychology , 91 ( 4 ) : 644 – 659 .
  • Rhodes , R. 2002 . The pressing need for post doctoral research ethics education . American Journal of Bioethics , 2 ( 4 ) : 1
  • Rhodes , R. and Strain , J.J. 2004 . Whistle blowing in academic medicine . Journal of Medical Ethics , 30 : 35 – 39 .
  • Rose , G. 1985 . Sick individuals and sick populations . International Journal of Epidemiology , 14 : 32 – 38 .
  • Ruhlen , R.L. 2006 . What happens to whistleblowers? . Science , 314 : 251 – 252 .
  • Sarbanes–Oxley Act of 2002, H.R. 3763, 107th Cong. (2002) http://news.findlaw.com/hdocs/docs/gwbush/sarbanesoxley072302.pdf (Accessed: 19 October 2009 ).
  • Schmaling , K.B. and Blume , A.W. 2009 . Ethics instruction increases graduate students' Responsible Conduct of Research knowledge but not their moral reasoning . Accountability in Research , 16 ( 5 ) : 268 – 283 .
  • Schwepker , C.H. Jr. 2001 . Ethical climate's relationship to job satisfaction, organizational commitment, and turnover intention in the sales force . Journal of Business Research , 54 ( 1 ) : 39 – 52 .
  • Shamoo , A.E. 1989 . Principles of Research Data Audit , New York : Gordon and Breach .
  • Shamoo , A.E. 1988 . We need data audit . AAA Observer , 4 : 4
  • Shamoo , A.E. and Dunigan , C.D. 2000 . Ethics in research . Proceedings of the Society for Experimental Biology and Medicine , 224 : 205 – 210 .
  • Shamoo , A.E. and Resnik , D.B. 2003 . Responsible Conduct of Research , 2nd , New York : Oxford University Press .
  • Shapiro , M.F. and Charrow , R.P. 1989 . The role of data audits in detecting scientific misconduct: Results of the FDA program . JAMA , 261 : 2505 – 2511 .
  • Sherman, L.W., and Strang, H. (2007). Restorative justice: The evidence. London: Smith Institute http://www.smith-institute.org.uk/pdfs/RJ_full_report.pdf (Accessed: 19 October 2009 ).
  • Siegrist , J. 1996 . Adverse health effects of high effort/low reward conditions . Journal of Occupational Health Psychology , 1 ( 1 ) : 27 – 41 .
  • Siegrist , J. 2001 . “ A theory of occupational stress ” . In Stress in the Work Place , Edited by: Dunham , J. 52 – 66 . London , , Philadelphia : Whurr Publishers .
  • Smith , J. 2009 . Personal communication , Norton Audits Inc . September 28, 2009
  • Sovacool , B.K. 2005 . Using criminalization and due process to reduce scientific misconduct . American Journal of Bioethics , 5 ( 5 ) : W1 – 7 .
  • Sox , H.C. and Rennie , D. 2006 . Research misconduct, retraction and cleansing the medical literature: Lessons from the Poehlmann case . Annals of Internal Medicine , 144 : 609 – 613 .
  • Spece , R.G. Jr. and Marchalonis , J.J. 2001 . Fourth amendment restrictions on scientific misconduct proceedings at public universities . Health Matrix Clevel , 11 ( 2 ) : 571 – 626 .
  • Taubes , G. 1993 . Misconduct: views from the trenches . Science , 261 : 1108 – 1111 .
  • Treviño , L.K. , Butterfield , K.D. and McCabe , D.L. 1996 . The ethical context in organizations: Influences on employee attitudes and behaviors . Business Ethics Quarterly , 8 ( 3 ) : 447 – 476 .
  • Tyler , T.R. and Blader , S.L. 2000 . Cooperation in Groups: Procedural Justice, Social Identity and Behavioral Engagement , Philadelphia : Psychology Press .
  • Verschoor , C. 2004 . Toward a corporation with conscience . Strategic Finance , 4 ( 85 ) : 20
  • Victor , B. and Cullen , J.B. 1988 . The organizational bases of ethical work climates . Administrative Science Quarterly , 33 ( 1 ) : 101 – 125 .
  • Weed , D.L. 1998 . Preventing scientific misconduct . American Journal of Public Health , 88 ( 1 ) : 125 – 129 .
  • Wells, J.A. (2008). The Gallup Organization. Final report: Observing and reporting suspected misconduct in biomedical research. Submitted to The Office of Research Integrity http://ori.hhs.gov/research/intra/documents/gallup_finalreport.pdf (Accessed: 19 October 2009 ).
  • Wilmshurst , P. 2004 . External checks must be imposed to protect the public . British Medical Journal , 328 : 230
  • Wocial , L.D. 1995 . The role of mentors in promoting integrity and preventing scientific misconduct in nursing research . Journal of Professional Nursing , 11 ( 5 ) : 276 – 280 .
  • Wright , D.E. , Titus , S.L. and Cornelison , J.B. 2008 . Mentoring and research misconduct: An analysis of research mentoring in closed ORI cases . Science and Engineering Ethics , 14 ( 3 ) : 232 – 236 .
