Are peer reviews of grant proposals reliable? An analysis of Economic and Social Research Council (ESRC) funding applications

Pages 91-109 | Received 13 Jul 2019, Accepted 24 Dec 2019, Published online: 06 Mar 2020
 

ABSTRACT

Peer review is widely used throughout academia, most notably in the publication of journal articles and the allocation of research grants. Yet peer review has been subject to much criticism, including being slow, unreliable, subjective and potentially prone to bias. This paper contributes to this literature by investigating the consistency of peer reviews and the impact they have upon a high-stakes outcome (whether a research grant is funded). Analysing data from 4,000 social science grant proposals and 15,000 reviews, this paper illustrates how the peer review scores assigned by different reviewers have only low levels of consistency (a correlation between reviewer scores of only 0.2). Reviews provided by “nominated reviewers” (i.e. reviewers selected by the grant applicant) appear to be overly generous and do not correlate with the evaluations provided by independent reviewers. Yet a positive review from a nominated reviewer is strongly linked to whether a grant is awarded. Finally, a single negative peer review is shown to reduce the chances of a proposal being funded from around 55% to around 25% (even when it has otherwise been rated highly).

Highlights

  • Peer review scores assigned by different reviewers have only low levels of consistency (a correlation between reviewer scores of only 0.2).

  • Reviews provided by “nominated reviewers” (i.e. reviewers selected by the grant applicant) appear to be overly generous and do not correlate with the evaluations provided by independent reviewers. Yet a positive review from a nominated reviewer is strongly linked to whether a grant is awarded.

  • A single negative peer review is shown to reduce the chances of a proposal being funded from around 55% to around 25% (even when it has otherwise been rated highly).

Acknowledgements

I am grateful to the ESRC, one of the seven Research Councils of UKRI, for sharing their data. This project would not have been possible without their support.

Notes

1 This statement describes how applicants will “act to enable the research to connect with others and make a difference conceptually and instrumentally.” https://esrc.ukri.org/research/impact-toolkit/developing-pathways-to-impact/?_ga=2.152184825.1305920688.1553508319-271472340.1553508319.

2 See nominated reviewer section of https://je-s.rcuk.ac.uk/handbook/index.htm.

3 The database used in this paper suggests that receiving fewer than three reviews is rare.

7 The author initially requested the data under Freedom of Information legislation. Although this request was rejected, it started a conversation with the ESRC. It was agreed that a limited amount of data could be provided for the purposes of writing this academic paper, on the condition that it be kept on a secure server at UCL and not shared further.

8 Any grant application that involved the author (either as an applicant or as a reviewer) was also excluded from the database that the ESRC provided. Information for the 2018/19 financial year was partial as the data was received part way through this period.

9 Note that polychoric (rather than Pearson) correlation is used to account for the categorical nature of ESRC peer review scores. This is a technique for estimating the correlation between two latent variables that are assumed to be continuous and normally distributed, based upon observed ordinal data.
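The two-step polychoric estimator described in this note can be sketched as follows: thresholds for each latent variable are first recovered from the marginal cumulative proportions of the observed ordinal scores, and the correlation is then chosen to maximise the multinomial likelihood of the observed cross-tabulation under a bivariate normal model. This is a minimal illustrative sketch using scipy (the function names and the simulated data are the author of this edit's assumptions, not the paper's code or the ESRC data):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import minimize_scalar


def bvn_cdf(a, b, rho):
    """P(X <= a, Y <= b) for a standard bivariate normal with correlation rho."""
    if a == -np.inf or b == -np.inf:
        return 0.0
    if a == np.inf and b == np.inf:
        return 1.0
    if a == np.inf:
        return norm.cdf(b)
    if b == np.inf:
        return norm.cdf(a)
    return multivariate_normal.cdf([a, b], mean=[0.0, 0.0],
                                   cov=[[1.0, rho], [rho, 1.0]])


def polychoric(x, y, n_cats):
    """Two-step polychoric correlation for ordinal codes 0..n_cats-1."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)

    # Step 1: thresholds from the marginal cumulative proportions.
    def thresholds(v):
        cum = np.cumsum(np.bincount(v, minlength=n_cats)) / n
        return np.concatenate(([-np.inf], norm.ppf(cum[:-1]), [np.inf]))

    tx, ty = thresholds(x), thresholds(y)

    # Observed cell counts of the cross-tabulation.
    counts = np.zeros((n_cats, n_cats))
    for i, j in zip(x, y):
        counts[i, j] += 1

    # Step 2: choose rho to maximise the multinomial log-likelihood,
    # where each cell probability is a bivariate-normal rectangle probability.
    def neg_loglik(rho):
        ll = 0.0
        for i in range(n_cats):
            for j in range(n_cats):
                if counts[i, j] == 0:
                    continue
                p = (bvn_cdf(tx[i + 1], ty[j + 1], rho)
                     - bvn_cdf(tx[i], ty[j + 1], rho)
                     - bvn_cdf(tx[i + 1], ty[j], rho)
                     + bvn_cdf(tx[i], ty[j], rho))
                ll += counts[i, j] * np.log(max(p, 1e-12))
        return -ll

    return minimize_scalar(neg_loglik, bounds=(-0.99, 0.99),
                           method="bounded").x


# Demonstration on simulated data: latent correlation 0.6, coarsened
# into four ordered categories, should be approximately recovered.
rng = np.random.default_rng(0)
latent = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
cuts = norm.ppf([0.25, 0.5, 0.75])
x_obs = np.digitize(latent[:, 0], cuts)
y_obs = np.digitize(latent[:, 1], cuts)
rho_hat = polychoric(x_obs, y_obs, 4)
```

A Pearson correlation computed directly on the coarsened scores would be attenuated relative to the latent correlation, which is why the polychoric approach is preferred for ordinal review scores.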

10 Weighted Kappa statistics give more weight to larger disagreements between reviewers (i.e. cells further from the leading diagonal of the cross-tabulation). Hence two reviewers who score a proposal 5 and 2 are treated as agreeing less than two reviewers who score it 4 and 3. (Unweighted Kappa would treat these two situations equally.)
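The weighting scheme described in this note can be illustrated with scikit-learn's `cohen_kappa_score`, which implements linearly weighted Kappa. The toy score vectors below are illustrative only, not ESRC data; both pairs of reviewers agree on five proposals and disagree on one, but the 5-vs-2 disagreement is penalised more heavily than the 4-vs-3 disagreement:

```python
from sklearn.metrics import cohen_kappa_score

# Two reviewers disagreeing by three points on one proposal (5 vs 2).
k_big_gap = cohen_kappa_score([1, 2, 3, 4, 5, 5],
                              [1, 2, 3, 4, 5, 2], weights="linear")

# Two reviewers disagreeing by one point on one proposal (4 vs 3).
k_small_gap = cohen_kappa_score([1, 2, 3, 4, 5, 4],
                                [1, 2, 3, 4, 5, 3], weights="linear")

print(k_big_gap)    # 0.70 — larger disagreement, lower agreement
print(k_small_gap)  # ~0.89 — smaller disagreement, higher agreement
```

With `weights=None` (unweighted Kappa), both cases would be treated simply as "one disagreement out of six", regardless of its size.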

11 We conducted robustness tests in which we (a) analysed only proposals with three reviews and (b) analysed proposals with between three and six reviews. These did not substantively alter our findings.

12 These figures increase marginally to 0.48 (four reviewers) and 0.53 (five reviewers) when nominated reviewers are excluded.

13 As noted in the methodology section, these and subsequent figures in this section are computed only for proposals with three or four reviewers.

