A Special Issue for Preregistration Templates

Preregistration templates as a new addition to the evidence-based toxicology toolbox

Article: 2314303 | Received 24 Jan 2024, Accepted 30 Jan 2024, Published online: 14 Feb 2024

In this editorial, we define the practice of “preregistration” of research and describe its motivations, explain why we believe preregistration templates should make preregistration more effective as an intervention for improving the quality of scientific research, and introduce Evidence-Based Toxicology’s Preregistration Templates Special Issue.

Preregistration as a desirable research practice

Publication bias from non-disclosure of planned research, and issues such as outcome-switching and non-reporting of results, are recognised as some of the most important factors that distort evaluations of the effectiveness of healthcare interventions (Evans Citation2011). The suboptimal treatment decisions that can result from such distortions have made “preregistration”, i.e., registering the existence of a study before it has been conducted, a priority strategy for improving the quality of clinical trials (Zarin et al. Citation2011).

Preregistration seems to work: studies are finding it to be associated with an increase in reporting of null results (Allen and Mehler Citation2019; Scheel, Schijen, and Lakens Citation2021), a reduction in p-hacking (Decker and Ottaviani Citation2023), and an increase in study quality (Lindsley et al. Citation2022). The effect on study quality seems especially pronounced when preregistration is paired with peer-review of preregistered study plans (Soderberg et al. Citation2021) and with structured formats that support the development of preregistrations (Bakker et al. Citation2020).

The issues that preregistration seeks to remedy are not unique to clinical trials. Self-report surveys indicate that selective non-reporting of results and undisclosed flexibility in choosing how data are analyzed may be highly prevalent among researchers (Banks et al. Citation2016). Such practices may lead to an over-prevalence of positive findings in the literature (Scheel, Schijen, and Lakens Citation2021) (or of negative findings, depending on the direction in which publication incentives push the bias) and are increasingly recognised as a threat to the credibility of whole areas of scientific research (West and Bergstrom Citation2021).

In healthcare research the preregistration of clinical trials is relatively well-established and, in some countries, legally mandated (Zarin et al. Citation2019). Outside of healthcare, uptake of preregistration varies considerably by area of research and design of study. For example, systematic reviews are commonly preregistered in a process supported by platforms such as PROSPERO (Booth et al. Citation2012; Centre for Reviews and Dissemination and University of York Citation2011) and, increasingly, by scientific journals peer-reviewing and publishing systematic review protocols. In development economics, it is common to preregister randomized controlled trials (Swanson et al. Citation2020). Preregistration is also catching on in the social sciences, most notably in psychological research (Spitzer and Mueller Citation2023). Efforts to create common standards for the preregistration of animal studies (Heinl et al. Citation2022) and the recent publication of articles arguing for the preregistration of epidemiological research (Mathur and Fox Citation2023) suggest growing awareness of the potential value of the practice for the fields of toxicology and environmental health.

Besides offering a means to tackle publication bias and selective reporting, preregistration has a related, less well-recognised, but potentially equally important role in making clear the distinction between planned and unplanned research (a difference sometimes referred to as confirmatory vs. exploratory modes of research, or by Popperians as hypothesis-testing vs. hypothesis-generating).

In general, planned research is important for testing precise a priori hypotheses and thereby confirming well-articulated theories, whereas unplanned research is important for making discoveries and generating new hypotheses (Ledgerwood Citation2018; Nosek et al. Citation2018a, Citation2018b). While it is arguable whether the distinction between exploratory and confirmatory research is a true dichotomy (Jacobucci Citation2022), what matters for both modes is that one is not conflated with the other. If exploratory-mode research is misunderstood as confirmatory, accidental features of a dataset risk being inappropriately interpreted as supporting a hypothesis rather than as coincidental. This is the problem of confirmation bias (Braithwaite et al. Citation2021), which can easily slide into “hypothesizing after the results are known”, or HARKing (Kerr Citation1998). On the other hand, using confirmatory methods for exploratory research can be very time-inefficient (anyone who has attempted, for example, a systematic review of an open-ended question will likely agree).

Unfortunately, the wide use of null hypothesis significance tests, and the ability to generate spurious significance through cherry-picked or post-hoc analyses, make it difficult to distinguish exploratory from confirmatory research. The lack of a record of the original hypothesis and method might be appropriate if a study is exploratory, but the general absence of such records makes it too easy for researchers to present unplanned research as if it had been planned, particularly when they are incentivised to do so by a publishing system biased toward positive results. This is bad for science. The solution is for researchers intending to conduct a confirmatory study to state their plan up front, by preregistering their methods and analysis plan before they collect their data.

Making preregistration more common

Because of its actual and potential benefits, the Center for Open Science (COS) advocates for the adoption of preregistration whenever possible. Evidence-Based Toxicology (EBT), as a journal committed to demonstrating the value of open science practices to the environmental health community, seeks to support and encourage authors who want to preregister their studies. Of course, “preregistration whenever possible” is a sizable aspiration. While of growing interest, preregistration is still a minority enterprise, particularly in toxicology and environmental health. When preregistration is practiced, it is rarely implemented or reported perfectly, even where the rules are well-articulated and enforced, as in clinical research (Goldacre Citation2016), let alone where enforcement is lacking (Booth et al. Citation2020). So there is a good deal to do, both to encourage preregistration and to help ensure it is done well.

To support the adoption of preregistration and other open science practices, COS follows the blueprint for adopting new tools and methods that is outlined in Diffusion of Innovations (Center for Open Science Citation2024; Rogers Citation2003). This blueprint maps out a route for making new practices such as preregistration first possible, then easy, then normal, rewarded, and (eventually) required. The blueprint needs to be followed in a roughly linear fashion. For example, if a mandate to practice preregistration is imposed before it is seen as normal to do so, there will be resistance to adopting a seemingly bureaucratic and unnecessary step, and the mandate will fail.

Instead of making mandates, COS therefore supports the creation and maintenance of open source platforms that enable desirable research practices, such as the Open Science Framework (OSF - https://osf.io), while devoting time and resources to making the practices easy to follow. COS also encourages recognition and rewards for these practices, and develops policies to support their implementation such as the Transparency and Openness Promotion (TOP) Guidelines (Karlan et al. Citation2014). EBT, for its part, provides detailed guidance and encouragement for authors in using open science practices, offers a wide range of submission types that includes study protocols, has implemented COS’ open science badges, and is aiming for a TOP score that would put EBT in the top 1% of journals worldwide. EBT was created as a mission-aligned publication venue for the Evidence-Based Toxicology Collaboration (EBTC), an international cross-sector collaboration promoting the development and use of evidence-based research and decision frameworks in toxicology and environmental health (www.ebtox.org).

Two key parts of a strategy for “whenever possible” adoption of preregistration are to make it easy for researchers to do and then normal for it to be done. One method for at least making preregistration easier is to create templates and guidelines for preregistering studies of a wide variety of designs. Therefore, through this Special Issue, COS and EBT are soliciting the submission of new preregistration templates that provide reusable specifications of planned methods for toxicological and environmental health studies, of any type.

The preregistration templates special issue

A preregistration template is a form designed to help articulate a fully pre-specified research plan. Similar to reporting guidelines such as the CONSORT checklist (Schulz, Altman, and Moher Citation2010) or ARRIVE 2.0 Guidelines (Percie Du Sert et al. Citation2020), the template’s purpose is to help an author comprehensively document a piece of research - but with two major differences. The first difference is that a preregistration template is intended to be completed before rather than after a study has been conducted. The second difference is that a preregistration template is designed to capture important nuances of specific study designs rather than the more general set of factors that are provided by the broader checklists.

Any researcher with a method that more than one person might use when planning a future study should be able to create a preregistration template (Figure 1). The specific goals of a template can vary, but they typically relate to one or more of: increasing transparency about the research process; constraining “researcher degrees of freedom” (Wicherts et al. Citation2016); specifying a priori hypotheses; and informing the community about the existence of a study. A researcher creates a template by converting their method into a set of questions about study design, data collection procedures, data analysis plans, and the criteria by which any inferences would be made at the conclusion of the work. For a rigorously designed template, this conversion process may be supported by cross-walking the template with existing study design standards and checklists (for an example, see Whaley et al. (Citation2023)), by community engagement methods that help ensure the template reflects current understanding of best practices, and by user testing.

Figure 1. How preregistration templates work. Researcher 1 develops a method for their study. Recognising that other researchers may benefit from a detailed operationalisation of their approach, they convert their method to a preregistration template. They have their template peer-reviewed and published, for example by a journal such as EBT. Researcher 2 discovers the template and uses it to preregister their own similar study, thus building on Researcher 1’s experience and gaining the benefits of preplanning their own work.

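The conversion of a method into a set of questions can be sketched in code. The following is a minimal, hypothetical illustration (the field names and structure are our own for exposition, not an official OSF or EBT format): a template modeled as a list of required questions, with a completeness check that flags unanswered fields before a plan is registered.

```python
# Hypothetical sketch of a preregistration template as structured questions.
# Field names ("hypotheses", "sample", etc.) are illustrative assumptions.

TEMPLATE = [
    {"id": "hypotheses", "prompt": "State each a priori hypothesis.", "required": True},
    {"id": "sample", "prompt": "Describe the sample and stopping rule.", "required": True},
    {"id": "analysis_plan", "prompt": "Specify statistical tests and inference criteria.", "required": True},
    {"id": "exploratory_notes", "prompt": "Planned exploratory analyses, if any.", "required": False},
]

def missing_fields(answers: dict) -> list:
    """Return ids of required template questions left unanswered."""
    return [q["id"] for q in TEMPLATE
            if q["required"] and not answers.get(q["id"], "").strip()]

draft = {"hypotheses": "Exposure X increases biomarker Y.", "sample": ""}
print(missing_fields(draft))  # ['sample', 'analysis_plan']
```

A structured check of this kind is one way a registry or journal could enforce that a preregistration is fully specified before acceptance, which is part of what distinguishes a template from a free-text protocol.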

The OSF has about a dozen examples of preregistration templates covering general-purpose preregistration, psychology research, research using existing datasets, and qualitative research (https://osf.io/zab38/). Many of these templates date back to 2016, and none are specific to toxicology or environmental health. This special issue is an opportunity for researchers in the toxicology and environmental health communities to create a new and wide-ranging set of preregistration standards and norms that are relevant to their work and better reflect contemporary standards in study design.

Any toxicologist or environmental health researcher with a method that another researcher might use should consider submitting their method as a template to the Special Issue. The options for templates are almost limitless but for inspiration might include: minimum information for registering the existence of a planned bioassay; planning the analysis of data generated in new alternative method (NAM) studies for assessing the toxicity of chemical exposures; or a comprehensive study methods and data analysis plan for epidemiological studies that seek to comply with Good Epidemiology Practice. Detailed instructions for Special Issue submissions can be found on the EBT website (link). Questions about the submission process and ideas for templates are welcome and should be addressed to Paul Whaley, Editor-in-Chief.

Conclusion

Routine preregistration of scientific studies could help toxicology and environmental health tackle multiple research issues with which all scientific disciplines are increasingly having to contend. Publication bias and selective non-reporting of outcomes are combated by preregistering the outcomes of interest in a study. P-hacking, cherry-picking, confirmation bias, and HARKing are combated by preregistering analysis plans. Preregistering methods, especially when those methods are peer-reviewed, supports the clarity of planning that improves the general quality of research and the prospects of later replication. Ultimately, both individual researchers and the broader community stand to benefit from preregistration. At EBT we look forward to receiving preregistration templates as a submission type.

Acknowledgements

We thank Olwenn Martin (Evidence-Based Toxicology) and Sebastian Hoffmann (EBTC) for their comments on draft versions of this manuscript.

References

  • Allen, C., and D. M. A. Mehler. 2019. “Open Science Challenges, Benefits and Tips in Early Career and beyond.” PLoS Biology 17 (5):1. https://doi.org/10.1371/journal.pbio.3000246
  • Bakker, M., C. L. S. Veldkamp, M. A. L. M. van Assen, E. A. V. Crompvoets, H. H. Ong, B. A. Nosek, C. K. Soderberg, D. Mellor, and J. M. Wicherts. 2020. “Ensuring the Quality and Specificity of Preregistrations.” PLoS Biology 18 (12): e3000937. https://doi.org/10.1371/journal.pbio.3000937
  • Banks, G. C., S. G. Rogelberg, H. M. Woznyj, R. S. Landis, and D. E. Rupp. 2016. “Editorial: Evidence on Questionable Research Practices: The Good, the Bad, and the Ugly.” Journal of Business and Psychology 31 (3): 323–5. https://doi.org/10.1007/s10869-016-9456-7
  • Booth, A., A. S. Mitchell, A. Mott, S. James, S. Cockayne, S. Gascoyne, and C. McDaid. 2020. “An Assessment of the Extent to Which the Contents of PROSPERO Records Meet the Systematic Review Protocol Reporting Items in PRISMA-P.” F1000Research 9: 773. https://doi.org/10.12688/f1000research.25181.2
  • Booth, A., M. Clarke, G. Dooley, D. Ghersi, D. Moher, M. Petticrew, and L. Stewart. 2012. “The Nuts and Bolts of PROSPERO: An International Prospective Register of Systematic Reviews.” Systematic Reviews 1 (1): 2. https://doi.org/10.1186/2046-4053-1-2
  • Braithwaite, R. S., K. F. Ban, E. R. Stevens, and E. C. Caniglia. 2021. “Rounding up the Usual Suspects: Confirmation Bias in Epidemiological Research.” International Journal of Epidemiology 50 (4): 1053–1057. https://doi.org/10.1093/ije/dyab091
  • Center for Open Science. 2024. About COS: Our mission is to increase openness, integrity, and reproducibility of research. https://www.cos.io/about.
  • Centre for Reviews and Dissemination, University of York. 2011. PROSPERO International prospective register of systematic reviews. https://www.crd.york.ac.uk/prospero.
  • Decker, C., and M. Ottaviani. 2023. Preregistration and Credibility of Clinical Trials. Centre for Economic Policy Research. https://play.google.com/store/books/details?id=xs0G0AEACAAJ.
  • Evans, I. 2011. Testing treatments: better research for better healthcare. Pinter & Martin.
  • Goldacre, B. 2016. “Make Journals Report Clinical Trials Properly.” Nature 530 (7588): 7. https://doi.org/10.1038/530007a
  • Heinl, C., A. M. D. Scholman-Végh, D. Mellor, G. Schönfelder, D. Strech, S. Chamuleau, and B. Bert. 2022. “Declaration of Common Standards for the Preregistration of Animal Research-Speeding up the Scientific Progress.” PNAS Nexus 1 (1): pgac016. https://doi.org/10.1093/pnasnexus/pgac016
  • Jacobucci, R. 2022. “A Critique of Using the Labels Confirmatory and Exploratory in Modern Psychological Research.” Frontiers in Psychology 13: 1020770. https://doi.org/10.3389/fpsyg.2022.1020770
  • Karlan, D., C. K. Soderberg, G. Alter, S. Bowman, R. K. Wilson, E. L. Paluck, B. A. Nosek, et al. 2014. Transparency and Openness Promotion (TOP) Guidelines [dataset]. Center For Open Science. https://osf.io/9f6gx/.
  • Kerr, N. L. 1998. “HARKing: Hypothesizing after the Results Are Known.” Personality and Social Psychology Review 2 (3): 196–217. https://doi.org/10.1207/s15327957pspr0203_4
  • Ledgerwood, A. 2018. “The Preregistration Revolution Needs to Distinguish between Predictions and Analyses.” Proceedings of the National Academy of Sciences of the United States of America 115 (45): E10516–E10517. https://doi.org/10.1073/pnas.1812592115
  • Lindsley, K., N. Fusco, T. Li, R. Scholten, and L. Hooft. 2022. “Clinical Trial Registration Was Associated with Lower Risk of Bias Compared with Non-Registered Trials among Trials Included in Systematic Reviews.” Journal of Clinical Epidemiology 145: 164–173. https://doi.org/10.1016/j.jclinepi.2022.01.012
  • Mathur, M. B., and M. P. Fox. 2023. “Toward Open and Reproducible Epidemiology.” American Journal of Epidemiology 192 (4): 658–664. https://doi.org/10.1093/aje/kwad007
  • Nosek, B. A., C. R. Ebersole, A. C. DeHaven, and D. T. Mellor. 2018a. “Reply to Ledgerwood: Predictions without Analysis Plans Are Inert.” Proceedings of the National Academy of Sciences of the United States of America 115 (45): E10518. https://doi.org/10.1073/pnas.1816418115
  • Nosek, B. A., C. R. Ebersole, A. C. DeHaven, and D. T. Mellor. 2018b. “The Preregistration Revolution.” Proceedings of the National Academy of Sciences of the United States of America 115 (11): 2600–2606. https://doi.org/10.1073/pnas.1708274114
  • Percie Du Sert, N., A. Ahluwalia, S. Alam, M. T. Avey, M. Baker, W. J. Browne, A. Clark, et al. 2020. “Reporting Animal Research: Explanation and Elaboration for the ARRIVE Guidelines 2.0.” PLoS Biology 18 (7): e3000411. https://doi.org/10.1371/journal.pbio.3000411
  • Rogers, E. M. 2003. Diffusion of Innovations. 5th edition. Simon and Schuster. https://play.google.com/store/books/details?id=9U1K5LjUOwEC.
  • Scheel, A. M., M. R. M. J. Schijen, and D. Lakens. 2021. “An Excess of Positive Results: Comparing the Standard Psychology Literature with Registered Reports.” Advances in Methods and Practices in Psychological Science 4 (2): 251524592110074. https://doi.org/10.1177/25152459211007467
  • Schulz, K. F., D. G. Altman, and D. Moher. 2010. “CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials.” BMC Medicine 8 (1): 18. https://doi.org/10.1186/1741-7015-8-18
  • Soderberg, C. K., T. M. Errington, S. R. Schiavone, J. Bottesini, F. S. Thorn, S. Vazire, K. M. Esterling, and B. A. Nosek. 2021. “Initial Evidence of Research Quality of Registered Reports Compared with the Standard Publishing Model.” Nature Human Behaviour 5 (8): 990–997. https://doi.org/10.1038/s41562-021-01142-4
  • Spitzer, L., and S. Mueller. 2023. “Registered Report: Survey on Attitudes and Experiences regarding Preregistration in Psychological Research.” PloS One 18 (3): e0281086. https://doi.org/10.1371/journal.pone.0281086
  • Swanson, N., G. Christensen, R. Littman, D. Birke, E. Miguel, E. L. Paluck, and Z. Wang. 2020. “Research Transparency is on the Rise in Economics.” AEA Papers and Proceedings 110: 61–65. https://doi.org/10.1257/pandp.20201077
  • West, J. D., and C. T. Bergstrom. 2021. “Misinformation in and about Science.” Proceedings of the National Academy of Sciences of the United States of America 118 (15). https://doi.org/10.1073/pnas.1912444117
  • Whaley, P., S. Wattam, A. M. Scott, and J. Vidler. 2023. “Using Bench Protocol Platforms to Improve Compliance with Systematic Review Guidance Documents and Reporting Checklists: Proof of Concept.” Evidence-Based Toxicology 1 (1): 2259938. https://doi.org/10.1080/2833373X.2023.2259938
  • Wicherts, J. M., C. L. S. Veldkamp, H. E. M. Augusteijn, M. Bakker, R. C. M. van Aert, and M. A. L. M. van Assen. 2016. “Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking.” Frontiers in Psychology 7: 1832. https://doi.org/10.3389/fpsyg.2016.01832
  • Zarin, D. A., K. M. Fain, H. D. Dobbins, T. Tse, and R. J. Williams. 2019. “10-Year Update on Study Results Submitted to ClinicalTrials.gov.” The New England Journal of Medicine 381 (20): 1966–1974. https://doi.org/10.1056/NEJMsr1907644
  • Zarin, D. A., T. Tse, R. J. Williams, R. M. Califf, and N. C. Ide. 2011. “The ClinicalTrials.gov Results Database–Update and Key Issues.” The New England Journal of Medicine 364 (9): 852–860. https://doi.org/10.1056/NEJMsa1012065
