Research Article

How well are open science practices implemented in industrial and organizational psychology and management?

Pages 461-475 | Received 15 Nov 2022, Accepted 20 Apr 2023, Published online: 07 May 2023

