Editorial

Tackling the reproducibility problem to empower translation of preclinical academic drug discovery: is there an answer?

Pages 595-600 | Received 08 Nov 2020, Accepted 18 Feb 2021, Published online: 04 Mar 2021

1. Introduction

‘If I have seen further, it is by standing upon the shoulders of giants.’ This metaphor from one of the most influential scientists in history, Sir Isaac Newton (1642–1727), well encapsulates the postulate that scientific research advances human knowledge by building upon prior art, cumulatively and progressively. The pharmacologist and Nobelist Sir James Black (1924–2010) reduced Newton’s proposition to drug-discovery practice by advocating: ‘The most fruitful basis for the discovery of a new drug is to start with an old drug’ [Citation1]. Accurate, factual dissemination of reproducible [Note 1] research results is thus paramount to new-drug invention, which is fundamentally a data-driven, knowledge-based, multiparameter, multiobjective optimization enterprise that strategically exploits the accrued wealth of research findings published in professional journals from the academy. A 2005 analysis by John P.A. Ioannidis [Citation2] and subsequent reports of the low success rate encountered by pharmaceutical companies in their attempts to reproduce seminal findings published by university laboratories [Citation3–5] have intensified vociferous concerns from the public at large, patient-advocacy groups, healthcare providers, and industry drug hunters about preclinical research integrity and, specifically, the reliability of early-stage research results from the academy for inventing better medicines that improve human life- and healthspans [Citation5–7]. Accentuating this problem are reported failures of university laboratories to reproduce therapeutically relevant preclinical results published from others’ [e.g. Citation8,Citation9] and their own [Citation10] academic research groups. This infodemic of irreproducible published data has been estimated to encompass 50% of all published preclinical life-science research and to waste around USD 28 billion annually in the United States alone [Citation11].

The reproducibility problem lingers in a time of seismic tension within the pharmaceutical industry. Myriad computational and technological advances allow unprecedented manipulation and interrogation of living systems and disease processes across all levels of biological organization, and the structural novelty of small-molecule and peptide new chemical entities (NCEs) is more diversified than ever [Citation12,Citation13]. Yet steady increases in both research and development (R&D) costs for bringing novel medicines to market and discovery-project cycle times attend falling returns on R&D investment, unacceptably high rates of candidate failure in clinical trials, and erosion of public trust in the pharmaceutical industry [Citation14–16]. This unsustainable innovation and productivity deficit in new-medicine invention is being addressed by a seemingly perpetual transformation of the pharmaceutical industry aimed at enhancing the quantity, quality, efficiency, and economy with which drug candidates are identified and translated into medical practice. A component of this pharma-industry evolution encompasses increased reliance upon university research for method development, insights into disease mechanisms, target identification, and the generation and preclinical pharmacological profiling of NCEs as bioprobes, if not potential drugs [Citation17,Citation18]. The alliance between private-sector pharmaceutical and biotechnology companies and research-intensive universities has thus emerged as a key activity for de-risking drug discovery and enhancing its clinical and commercial success [Citation19,Citation20]. Pre- and postdoctoral scientists-in-training are front-line in this evolution, since they conduct the majority of wet-lab academic research, and the pharmaceutical industry relies upon the academy to produce well-educated, wide-bandwidth graduates ready to contribute to pharma commercial success as independent critical thinkers with the highest standards of research acumen and quality [Citation21,Citation22]. These considerations make the reproducibility problem in published academic research particularly noisome to drug discovery, especially in its potential to undermine the translational value of academic research findings and its negative implications for pharma workforce competence. Unfortunately, ambiguities in published research and their negative repercussions do not inevitably self-correct through follow-on findings or retractions, and remediation may take considerable time, a most valuable commodity in any discovery campaign [Citation23,Citation24].

The data reproducibility problem in science is complex, and many potential causes and consequences have been identified, discussed, and debated in several recent overviews [Citation25–27], as have its implications for disease treatment and medical practice [Citation28–30]. The purpose of this editorial is to focus specifically on educational gaps in the academy that may be encountered by scientists-in-training as contributors to the reproducibility problem and their undermining effects on drug discovery from the standpoints of both therapeutic invention itself and the private sector’s ever-present need for capable, wide-bandwidth drug hunters educated in the university setting.

2. Mainsprings of research irreproducibility within the academic construct

Diverse stakeholders, including universities themselves, government agencies, manuscript peer-reviewers, and journal publishers and editors, have proposed assorted best-practice policies and oversight mechanisms aimed at enhancing overall academic research integrity and publication standards [Citation31–33]. Intentional scientific misconduct involving various forms of overt data manipulation/falsification/plagiarism is one of the most widely recognized contributors to the reproducibility problem, as evidenced by publication retractions arising therefrom [Citation34,Citation35]. Perverse incentive for publishing intentionally biased, fabricated, or plagiarized data has been ascribed mainly to negative structural and competitive forces exerted by the individualistic, investigator- and turf-focused mind-set of both science itself and the academic career-progression and reward system, compounded by the dearth of available tenure-track faculty positions in the face of dilatory senior-faculty turnover and constrained university budgets [Citation36–38]. In the research-intensive academy, publications based upon high-impact experimental results serve as prima facie validation of professional acumen for promotion/tenure and coin of the realm for generating funded grants and investigator/university prestige. Resulting pressures to fulfill associated demands, along with myriad coursework and administrative commitments, can promote a problematic, hostile work environment that helps sanctify outright publication fraud as a seemingly viable – perhaps inevitable – proposition for garnering academic reward and realizing faculty personal agendas [Citation39]. The perpetual scramble to generate publications that support a continual flow of grant applications competing for limited public funds [Citation40,Citation41] renders it difficult for senior laboratory heads in academia to have much direct involvement in evolving wet-lab research other than in a soft-governance capacity, a principal investigator’s precarious balancing act often akin to that of a tightrope performer. Consequently, postdoctoral fellows – themselves trainees with their own research projects and career demands and limited experience as teachers/mentors – routinely assume great top-down responsibility for educating doctoral students in research practice [Citation21,Citation22,Citation42,Citation43].

Although relevant quantitative studies are not as numerous or comprehensive as may be wished, most incidences of data irreproducibility in academic research publications do not appear to reflect purposeful fraud or intentional malfeasance [Citation44–46], deeply concerning as those actions are as a prime cause of publication retractions [Citation34]. As detailed elsewhere by the author and others, other root causes have emerged that can lead to published preclinical university research being so flawed that it cannot be reproduced: unintentional carelessness; inadvertent technical error; weak experiment or protocol design; specious reasoning; inadequate recordkeeping; improper methodological approaches for problem-solving and data analysis/reporting; deficient experiment repetition in the original laboratory; inappropriate statistical analyses; poor-quality or unvalidated reagents, reference materials, or experimental models; unrecognized technical or analytical variables; inherent biological variation within living test systems; insufficient heterogenization among living systems employed as study objects; and observation, gender, or confirmation bias [Citation10,Citation26,Citation27,Citation29]. Despite their varied nature, these contributors to the reproducibility problem converge at the nexus of inadequate research training and practice.

3. Redefine academy cultural norms and reward and professionalization architecture?

The foregoing has led some to advocate that systemic changes in the prevalent culture of university-based professionalization – specifically its reward structure and the role of scientific research and research-based publications therein – will definitively allay the reproducibility problem [Citation47–49]. In broad terms, this rethink involves a professional and organizational resetting of the research-intensive university by reducing the classical academic promotion/tenure emphasis on publishing high-impact papers and maximizing research funding, in favor of greater professional reward for research, publication, and teaching quality. Publications, journal impact factors, and grant monies, however, are readily quantifiable as faculty promotion and tenure criteria for all involved in the business of education, whereas research and teaching quality are more shadowy and subjective. While appreciating the positive impact such realignment could have, this editorialist’s experience teaches that such modification of the prevalent academic research and reward culture requires an unrealistic level of widespread commitment to change from within. Incentives and rewards for academic laboratories to undertake extensive experimentation for reproducing their own or others’ published results are lacking, as it is difficult to get confirmatory studies published or funded. Since junior-faculty career decisions depend greatly upon evaluations from senior faculty within a highly political, bureaucratic power structure, the academic reward system as linked to the process of academic professionalization is operationally regulated from within, a characteristic that recalls the old adage about foxes minding the henhouse. Indeed, this system has been analogized to a pyramid scheme in which relatively few senior investigators oversee (and may manipulate for self-serving career purposes) a far more extensive population of pre- and postdoctoral scientists-in-training whose job opportunities, let alone chances of career progression, within the academy are slim [Citation50]. The possibility of significant retooling from within the typical academic reward and professionalization structure thus runs counter to behavioral theory, which shows that, for humans to change the way they do things, need and opportunity count for little without motivation, change itself reflecting the collective force of individuals’ voluntary actions [Citation51,Citation52].

4. Variegate and deepen doctoral-level education and training

To help mitigate the negative effects of academic research irreproducibility on therapeutic invention, the author emphasizes instructional initiatives that enrich and expand the bandwidth of doctoral scientists-in-training beyond specific technical skills for performing experiments to include experiential education on good research practice, data analysis/communication, and the contemporary drug-discovery process. This view is predicated upon the proposition that the way scientists design, conduct, manage, analyze, interpret, record, and report research is inseparable from reproducibility, such that proper experimenter training is fundamental to addressing the reproducibility problem.

Specific didactic initiatives toward combating the reproducibility problem could be envisioned to encompass both classroom and laboratory instruction for doctoral students, with appropriate institutional sanction and support, the fruits of which would be ripened in postdoctoral training. Exemplary semester-length courses covering the theory and application (e.g., problem-based learning with illustrative casework and active discussion) of research practice, ethics, and experimental design would provide a formal basis for scientists-in-training to develop sensitivity to and awareness of the reproducibility problem, codes of experimenter conduct, procedures and practices for quality assurance in research, and means of responsible data generation that support research integrity. Coursework in the fine art and precise science of drug discovery that includes historical, regulatory, business, and entrepreneurial aspects would help students realize the importance of sound experimental data from the academic sector to such a far-reaching commercial enterprise. The foundational knowledge gained should help equip scientists-in-training with powers of scrutiny and discretion and a critical stance for interrogating the robustness and significance of their and others’ research output well beyond its intramural value as fodder for publications, grants, and faculty/university profile. It should also spur the self-initiative and continuous, independent intellectual growth required of a productive scientist in the dynamic drug-discovery and applied-technology landscapes. By transcending bench-level experimenter skills, such practical and contextual classroom instruction invites appreciation of the risks and high stakes associated with translating preclinical research into clinically useful therapeutic breakthroughs and helps inculcate in scientists a critical awareness of the enormous financial and opportunity costs the reproducibility problem brings to applying academic research output for public-health benefit.

As active participants in their professional development, scientists-in-training should be encouraged to leverage peers and senior in-laboratory scientists for developing and applying principles of ethics and good research practice when recording, analyzing, and presenting/publishing their own research results. Although ‘laboratory meetings’ serve as a classic route for this activity, at-the-bench opportunities for consultation between researchers should not be neglected as a means of fostering a sustained culture of shared responsibility for nurturing responsible researchers. In this regard, academia might learn from the professional and organizational settings common to the pharma/biotech industry, in which team-based research predominates, involving cross-disciplinary monitoring, accountability, scholarly exchange, and avenues for conflict awareness and resolution. This thinking gains appeal from the variegated career paths for scientists well beyond academia and the collaborative thrust of contemporary university research, as reflected in, for example, the increased number of authors and institutions per academic research publication and the trend toward faculty ‘cluster hires’ for promulgating interdisciplinary projects with translational reach [Citation53,Citation54].

Incorporation of experiential learning modalities into scientist education also appears useful for helping combat the reproducibility problem. Examples of this approach at the didactic level include critically evaluating published research papers as to their quality and implications for therapeutic invention in a team-based, journal-club format, and involving advanced predoctoral students and postdoctoral scientists in the peer-review process under the guidance of experienced faculty. These exercises can prove valuable for building skills such as interpreting and constructing a judicious argument from a body of data and supporting or refuting a proposition based on experimental results. More far-reaching embodiments could take the form of cooperative public-private initiatives, e.g., including industry-seasoned discovery scientists in graduate-level teaching/mentoring and offering doctoral students the opportunity for an extramural practicum in a pharmaceutical or biotech company actively pursuing therapeutic invention. Experiential activities such as these should help develop in pre- and postdoctoral trainees broader transferable skills (communication, project management, leadership) relevant to all researchers and instill practical awareness of the considerable rigor and robustness required in project evaluation, experimentation, and data scrutiny that characterize commercial discovery research within the context of bringing a drug to market. This tenor of experiential education should bring value even to those scientists-in-training who plan to remain in academia, since government funding of basic university research invites practical (e.g., therapeutic) application of the academy’s basic biomedical research output [Citation55].

Since pre- and postdoctoral scientists carry out most of the published university research [Citation21,Citation22,Citation42,Citation43], the reproducibility problem necessitates applied, graduate-level training in data analysis, recordkeeping, and reporting. Formal coursework in statistics would involve analytical and practical instruction on topics such as checking data for normality, identifying and applying appropriate methods of data analysis, assessing the significance of treatment effects, and powering experiments in line with the conclusions to be drawn. Critical analysis of data description, presentation, and probability evaluation should be an integral component of course-related assignments (e.g., term papers, examinations) and classroom discussions of published research papers. Unfortunately, the rush to publish in academia may sideline Socratic dialog between faculty members experienced in scientific communication and individual students on best practices in publishing and on constructing arguments and conclusions from a data set. More personalized instruction of this type is particularly important for students whose native language is not English, since university programs for remedial English-language instruction are typically generalist and do not focus on scientific research/data communication per se.
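To make the preceding point concrete, the following is a minimal illustrative sketch (not drawn from the original editorial) of the kind of applied statistical exercise such coursework might include: checking a small sample for approximate normality and prospectively powering a two-group comparison. The data, effect size, and thresholds are hypothetical placeholders, and the example assumes the freely available Python libraries NumPy, SciPy, and statsmodels.

```python
# Minimal illustrative sketch (hypothetical data and thresholds) of two routine
# checks of the kind such statistics coursework might cover:
#   (1) testing a sample for approximate normality before choosing a test, and
#   (2) estimating the per-group sample size needed for adequate statistical power.
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestIndPower

rng = np.random.default_rng(seed=1)

# Simulated control-group measurements standing in for real assay data.
control = rng.normal(loc=100.0, scale=15.0, size=12)

# Shapiro-Wilk test: a small p-value suggests departure from normality,
# arguing for a non-parametric alternative (e.g., the Mann-Whitney U test).
w_stat, p_normal = stats.shapiro(control)
print(f"Shapiro-Wilk W = {w_stat:.3f}, p = {p_normal:.3f}")

# Prospective power analysis for a two-sample t-test: how many samples per group
# are needed to detect a 'large' effect (Cohen's d = 0.8) at alpha = 0.05
# with 80% power?
n_per_group = TTestIndPower().solve_power(effect_size=0.8, alpha=0.05, power=0.80)
print(f"Required sample size per group: {int(np.ceil(n_per_group))}")
```

Even a brief exercise of this kind makes tangible how underpowered experiments and unexamined distributional assumptions, both cited above among the root causes of irreproducibility, enter routine laboratory practice.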

5. Conclusion

The complex, multidimensional nature of the reproducibility problem precludes simple remediation, involving as it does multiple interrelated factors and participants beyond university scientists, including the academic reward and professionalization system, funding agencies, publishers, peer reviewers of grants and manuscripts, and institutional prerogatives. No initiatives established by stakeholders in drug discovery have yet solved the reproducibility problem. Directly responsible for carrying out most of the discovery-related experimentation in the academy, pre- and postdoctoral researchers are in a prime position to address the reproducibility problem and enhance the value of academic publications to drug discovery if provided with applied education on standards of experimental practice ranging from good experiment design and recordkeeping through data analysis and communication. When supported locally by senior researchers and by academic institutions themselves, the higher-education experience should equip scientists-in-training with the analytical and self-critical abilities to function as independent researchers keenly aware of the potential implications and translational reach of their published data within the worldwide academic and commercial innovation ecosystem. The educational process should also instill situational awareness that working within a dynamic knowledge landscape involving complex experimental systems invites the potential for – but can never excuse – inadvertent errors in professional communication.

6. Expert opinion

Sources of information apart from academic preclinical research publications, arising at different points along the road to new-drug introduction into the clinic, are certainly critical for informing therapeutic invention and healthcare (e.g., patient data on drug effects as obtained through digital medicine [Citation30]). Nevertheless, the author’s engagement in the research-intensive university, government, and big-pharma/biotech sectors over many decades has afforded innumerable examples of how published preclinical research from academic laboratories vitally serves drug discovery and how, in turn, the reproducibility problem may undermine the knowledge-intensive discovery ecosystem. It therefore seems prudent to sustain awareness of the problem and reflect upon positive initiatives to address it at the most fundamental levels, thereby helping empower the translatability of preclinical academic research to drug discovery and clinical practice. Unfortunately, one conspicuous contributor to unreliable published research, outright falsification of results by malfeasant researchers, establishes a negative association that can marginalize the relevance of the reproducibility problem to scientists-in-training and the academy in which they are educated. The association of the reproducibility problem with deceptive practice may make it tempting to cast a jaundiced eye over most published academic research and adopt the seventeenth-century motto of the Royal Society, ‘Nullius in verba’ (‘Take no one’s word for it.’) [Citation56], thereby inviting missed opportunities to make practical applications of research findings. Indeed, in consultancy practice, the author is routinely asked by pharma/biotech companies and investment concerns evaluating university-industry discovery collaborations or alliances with academic spin-outs to address questions such as ‘What “killer experiments” would challenge the published conclusions from the university?’ and ‘What are the key results from the prospective university collaborator(s) that need to be reproduced?’

Such experience supports the proposition that education about reproducibility and its determinants in the academic setting is an important element of risk management in therapeutic invention. Unreliable data from academic publications may pose problems for drug discovery, for example, by undermining insights into disease processes and modeling, therapeutic targets, pathways of drug action, and probe design and synthesis. Yet it is unrealistic to lay pharma-industry productivity and novelty shortfalls at the feet of the university educational system, its laboratory heads/principal investigators, and/or its students. As a major consumer of basic knowledge derived from academic laboratories, the private-sector pharma/biotech industry plays the dominant role in translating preclinical findings to the patient, especially with respect to clinical testing, commercialization, and post-approval surveillance, and has its own operational and sector shortcomings that can contribute to the failure of discovery campaigns [Citation12–15,Citation19,Citation29,Citation57]. Likewise, although senior principal investigators bear primary responsibility for ensuring that students who wish to pursue careers as discovery practitioners have the theoretical and experiential foundations to develop independent abilities to design, conduct, analyze, evaluate, and communicate quality research in a (self-)critical manner with the highest ethical standards, the graduates themselves ultimately retain personal agency over how their education is applied, independently of their teachers and mentors. Nevertheless, both academia and private-sector drug-discovery concerns have a vested interest in recognizing and remediating university-related educational gaps that might constitute risks to drug discovery by compromising trust in published information from the academy. The healthcare ramifications of educational efforts toward data reproducibility are made increasingly pervasive by the fact that some newer enabling technologies in drug discovery (e.g., artificial intelligence, machine learning) routinely mine and curate published experimental findings to build and contextualize the requisite big-data training and validation sets [Citation58,Citation59].

The editorialist has advocated several embodiments of purposeful education on good research practice, leavened with practical initiatives beyond the classroom, as essential to an actionable framework for addressing the reproducibility problem. Implicit in this view is that both senior researchers and universities are not only aware of the problem and its widespread implications but are also committed to supporting, developing, and implementing educational initiatives with the potential to address it. Expanding the scope of pre- and postdoctoral scientist education beyond the silos of individual research projects and laboratories to include team-based and experiential learning appears key to addressing the reproducibility problem and improving the translational value of academic preclinical research findings for therapeutic invention. Appreciation of methodological and test-system limitations and advances; good experimental design; data analysis and reporting; statistical (mis)application; and the operational and entrepreneurial workings of drug invention and commercialization is advanced as essential to a strategic plan for addressing the reproducibility problem through improved graduate-level education. Another essential component is a professorate incentivized to instill, in its own research and didactic activities and in the overall educational system, the highest regard for data reproducibility and its consequences for drug discovery and human health. Key to this educational effort is training scientists to recognize and communicate the limitations and provisional aspects of their own and others’ research findings.

Senior faculty may need to be ever vigilant to avoid subordinating graduate training in good research and publication practices to the myriad administrative and other commitments that inevitably accrue. An active, shared stewardship between experienced faculty and the academy as an institution to address the reproducibility problem should fall within any university’s educational mandate for establishing an environment, culture, and infrastructure that support responsibility in research conduct and communication. Such actionable elements in the education of pre- and postdoctoral researchers-in-training should help strengthen the overall academic research enterprise while contributing to a talented drug-discovery workforce well equipped to address the increasingly challenging expectations and evolving roles that therapeutic invention and allied new technologies place upon its practitioners. The ongoing SARS-CoV-2/COVID-19 pandemic not only provides a sterling example of global public-private information sharing through publications that turned basic research on the virus’s genome into vaccine products [Citation60], but may also represent a resounding call to arms for improving standards of research accountability and published-data reproducibility [Citation61].

Declaration of interest

The author has no relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties.

Reviewer Disclosures

Peer reviewers on this manuscript have no relevant financial or other relationships to disclose.

Additional information

Funding

This manuscript has not been funded.

Notes

1. The term ‘reproducibility’ is used broadly herein to denote the ability of practiced researchers, independently of the original experimenters, to verify the results of and draw the same overall conclusions from a published study by using the methodological, experimental, and data-analysis conditions specified in print.

References
