
Replicability Crisis and Scientific Reforms: Overlooked Issues and Unmet Challenges

ABSTRACT

Nowadays, almost everyone seems to agree that science is facing an epistemological crisis – namely the replicability crisis – and that we need to take action. But as to precisely what to do or how to do it, there are no firm answers. Some scholars argue that the current statistical inferential framework is inadequate and that we should therefore focus on improving statistical methods. Others claim instead that the only way to fix science is to change the scientific reward system, promoting quality rather than quantity of scientific publications. However, every positive proposal, whether methodological or social, has a valid counterargument. Here I want to make explicit some reasons for the persistence of disagreement about solutions to the crisis. Focusing on issues that have been overlooked in the debate might help us better evaluate scientific reforms. In this regard, philosophical knowledge can be mobilised to take action in response to the replicability crisis.

Notes

1 In the literature there is terminological confusion and no standardized usage of ‘reproducibility’ and ‘replicability’. Some distinguish between them, often referring to replicability as the ability to get the same result from a new sample, whereas reproducibility is the ability to get the same result from the same data. Here I am using the term replicability in an intuitive rather than technical sense. For a catalog of uses of the recurrent terms ‘reproduce’ and ‘replicate’ see (Barba Citation2018).

2 The claim was based on a survey showing that scientists at Amgen succeeded in replicating only 13 out of 67 papers (most of them from the field of oncology).

3 Philosophers of science, from Hempel and Popper to Kuhn and Kitcher, have long understood that the scientific method is not a smooth and steady process, but rather a meandering path between confirmation and falsification. It should therefore come as no surprise that published research sometimes cannot be reproduced. Error is an integral part of science. However, the replicability crisis is ‘particularly significant’ in this sense because it goes beyond the usual (and reasonable) error rate in scientific experimenting. We can discuss the acceptable percentage of non-reproducible studies, but the large numbers shown by meta-research make clear that something is wrong.

4 See, e.g. Meskus, Marelli, and D’Agostino (Citation2017); Abbott (Citation2016); Enserink (Citation2012); Kurzrock, Kantarjian, and Stewart (Citation2014).

5 See also Nature's editorial ‘Rewarding Negative Results Keeps Science on Track.’ 2017. Nature 551 (7681): 414–414. https://doi.org/10.1038/d41586-017-07325-2.

6 Although the two parties have never made this basic disagreement explicit, we can clearly notice it at the rhetorical level where the debate has borne the hallmarks of a political (rather than scientific) confrontation. In non-academic/informal discussions, defenders of statistical reforms have been ‘playfully’ called ‘methodological terrorists’ (Fiske Citation2016) or ‘p-value police’ (Mayo Citation2018) because of their insistence on statistical methodology, while advocates of social reforms can be labeled ‘research integrity czars’ (Oransky and Marcus Citation2018) for their insistence on promoting good scientific behaviours.

7 Philip Kitcher has extensively discussed the issue of science governance (see, e.g. Kitcher Citation2011). But his research has mostly focused on setting the ‘research agenda’. More recently, the RRI (Responsible Research and Innovation) framework has been proposed in Europe to deal with ethical issues and societal needs with regard to current and future research and technologies.

8 This claim is in contrast with a large tradition in Philosophy of Science (see, e.g. Hacking Citation2015) according to which ‘elevated’ discussions in philosophy of science have not been very fruitful.

Additional information

Funding

This work was supported by the European Research Council [Starting Investigator Grant 640638].
