Abstract
This pilot study evaluates the accessibility and usefulness of the research misconduct (RM) policies at the top-25 universities as ranked by NIH and NSF grant awards. Measuring accessibility demonstrates how readily available policies are to the people they affect. Evaluating the range of policy content indicates whether policies and procedures on research misconduct are “useful” as opposed to merely “minimal” (Rhoades, 2003). On average, it took five clicks to get from a university's home page to its RM policies. Only nine policies were accessed within three or fewer clicks. Policy information was coded into categories comprising a total of 20 topic areas, which were then grouped into five content domains. The policies reveal a broad range of usefulness. Some provide relevant details on almost every topic area, while others leave most questions unanswered. Three of the 20 topic areas are almost universally covered in the policies analyzed. In contrast, five other topic areas average less than half of the information that could have been included. These policies, from elite U.S. research universities, may serve as role models; as such they should perhaps be held to the highest standards. If the message sent by a policy lacks clarity and precision, it should be revised to include an appropriate level of detail.
The author is grateful to Angela Lawson, Kristine Schoo, and Loren Booker for their assistance in obtaining the policies and coding. Lawson was a graduate student and Schoo and Booker were undergraduate students in the Department of Communication at the University of Illinois at Chicago when the research was conducted.
Notes
1. This research uses the term “research misconduct/RM” for consistency; in practice, the policies have a variety of titles.
2. To discourage speculation about the identity of individual institutions, ID numbers reflect neither alphabetical nor ranked funding order. Instead, ID numbers were assigned based on the sum of the schools’ five content domain scores.