Abstract
This paper reviews changes in New Zealand universities since the introduction of the Performance Based Research Fund (PBRF) in 2003, and evaluates those changes in relation to the scheme's stated objectives. This stocktake of research findings is in part a response to the official report of the Review Panel, which made no attempt to review evidence of performance. A key objective was to achieve an improvement in research quality. It is suggested that improvements have been closely related to the incentives created by the scheme, and achieved through considerable staff turnover. The present stocktake of the changed nature of universities and the details of the evaluation process suggests that substantial simplifications could usefully be made while maintaining the incentives that are at the heart of any PBRF.
Acknowledgements
We are grateful to the New Zealand Tertiary Education Commission (TEC) for providing the anonymised longitudinal data used here. The data are not publicly available and were provided by the TEC following a confidentiality agreement. We should like to thank Norman Gemmell and two referees for comments on an earlier draft.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Notes
1 In international contexts, see Whitley (Citation2008) and OECD (Citation2010).
2 This does not of course imply support of the metrics, and it is perfectly reasonable to argue that ‘success’ of the PBRF in its own terms has resulted in ‘failure’ regarding a different set of metrics.
3 Research funding was included in the bulk funding of tertiary education institutions on the basis of EFTS, adjusted by weighting for different course costs. This bulk funding was intended to cover capital and operating costs, as well as tuition and research. In allocating funding, there was little attention to accountability, capacity building, and governance.
4 Recommendations not subsequently adopted include the quality rating of academics using a mix of performance indicators (such as bibliometric measures) and peer review, with those ratings determined by institutional self-assessment, subject to five-yearly external audits (of a random sample of about 10% of staff) conducted by independent, multi-disciplinary assessment panels.
5 A suggestion of a two-step process to evaluate the quality of researchers was not adopted.
6 Data were obtained from: https://www.tec.govt.nz/funding/funding-and-performance/performance/financial/.
7 Furthermore, Gibson, Anderson, and Tressler (Citation2017) show that journal citations and academic salaries in the US are not closely correlated. See also Gibson, Anderson, and Tressler (Citation2014).
8 In 2012, the weight for C(NE) was the same as for C. However, in 2018 the weight for C(NE) was increased to 4. The weight for R(NE) remained zero over both periods.
9 The peculiarities of the system are explored further in Buckle and Creedy (Citation2019b), who also examine the iterations taken by the peer review panels, involving large adjustments to initial scores, in arriving at the final QC categories for individuals.
10 This improvement in research performance after 2003 is consistent with work evaluating changes in the productivity of NZ universities, by Smart (Citation2009) and Gemmell, Nolan, and Scobie (Citation2017).
11 For details, see Buckle and Creedy (Citation2019b, Citation2020). The early effect on publishing in economics of the PBRF was examined by Anderson and Tressler (Citation2014).
12 Buckle and Creedy (Citation2019a) list a set of hypotheses regarding the changes within universities that would be expected to arise from the incentives created by the PBRF. Statistical analyses show strong support for those hypotheses.
13 This is demonstrated, using a decomposition of quality changes, by Buckle, Creedy, and Ball (Citation2021).
14 For universities and discipline groups respectively, see Buckle and Creedy (Citation2019a, Citation2020).
15 This is demonstrated by Buckle and Creedy (Citation2019a) and Buckle, Creedy, and Ball (Citation2021), who examine equilibrium distributions arising from unchanged transitions.
16 The composition of each discipline group is listed in Buckle and Creedy (Citation2020).
17 For details, see Buckle and Creedy (Citation2022, pp. 21–22).
18 Hazledine and Kurniawan report Web of Science (2004) estimates of the costs incurred by the administering agencies and the compliance costs of universities for the 2003 round. The Web of Science estimate is less than 2% of the total funds to be allocated, a proportion they report as comparable with Hong Kong and the UK.
19 Stern (Citation2016, p. 20) recommended that in the UK scheme all research-active staff should be examined.