
University rankings: do they matter in the UK?

Pages 137-161 | Received 14 Jun 2011, Accepted 23 Jul 2012, Published online: 08 Oct 2012

Abstract

This paper offers the first comprehensive analysis of the effect of changes in university rankings on applicant and institution behaviour in the UK. When their rank worsens, universities are found to experience small but statistically significant reductions in the number of applications received as well as in the average tariff score of applicants and accepted applicants. Although the effects found are stronger for certain types of students and institutions, they tend to be modest overall, and suggest that other factors play a more important role in attracting applicants to universities.

Acknowledgements

I would like to thank Arnaud Chevalier for comments on various drafts of this paper. I am also grateful to participants at the Department for Business, Innovation and Skills (BIS) Higher Education Evidence Group and at the 2011 Work, Pensions and Labour Economics Study Group (WPEG). The paper further benefited from comments and suggestions from two anonymous referees.

Notes

1. See, for instance, Provan and Abercromby (2000), Clarke (2002), Eccles (2002), Yorke and Longden (2005), Turner (2005), Dill (2006), and Birnbaum (2007).

2. This is over and above the general print circulation of these newspapers. According to recent statistics (Ponsford 2011), the Guardian has a circulation of around a quarter of a million, and that of the Times is just under half a million.

3. Parker and Summers (1993) is an earlier paper that looked mainly at the effect of changes in tuition and fees on the matriculation rate of applicants admitted to a group of selective liberal arts colleges. Although they also look at college rankings, they do not explore the effect of changes in rankings. They find that the higher an institution's ranking, the higher its matriculation rate.

4. In 1992, many polytechnics and colleges of higher education were given university status in the UK. The term ‘pre-1992’ therefore refers to institutions that already had university status prior to that date.

5. One difficulty encountered in building a time series is institutional change – mergers, in particular. The general approach taken was to treat the merging institutions as separate entities before the merger, and as a single (new) institution afterwards.

6. Note that each applicant in the UCAS system is entitled to a certain number of choices/applications – six prior to 2008/2009, and five from that year onwards. The analysis in this paper will be at the level of the choice/application, and not the applicant level.

7. This distinction is important because EU students in the UK are eligible to pay the same fees as domestic students. Other overseas students are subject to much higher fee levels.

8. For the time period studied, university fees for domestic and EU students were virtually identical across institutions. Only a handful of institutions briefly charged slightly lower fees when the fee cap was raised to £3000 in 2006/2007.

9. The data are available from the website www.publicgoods.co.uk. Where a range of fees was provided, a judgment was made between the lower and the higher figure stated, so as to keep the time series for each institution as consistent as possible.

10. A small number of institutions refuse to participate in the rankings. Also note that there is no change between 2007 and 2008 in the Times rank because no Times league table was published in 2008; the rankings for 2008 are therefore assumed to be the same as in 2007.
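
As a rough illustration, this carry-forward amounts to a within-institution forward fill. The sketch below uses pandas with hypothetical institutions, years and rank values; only the 2007-to-2008 imputation mirrors the note.

```python
import pandas as pd

# Hypothetical long-format panel: one row per institution-year, with
# the Times rank missing in 2008 (no league table was published).
ranks = pd.DataFrame({
    "institution": ["A", "A", "A", "B", "B", "B"],
    "year": [2007, 2008, 2009, 2007, 2008, 2009],
    "times_rank": [12, None, 15, 40, None, 38],
})

# Carry each institution's 2007 rank forward to fill the 2008 gap.
ranks = ranks.sort_values(["institution", "year"])
ranks["times_rank"] = ranks.groupby("institution")["times_rank"].ffill()
```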

11. Throughout this paper, the highest rank an institution can attain is 1. An institution is considered to ‘rise’ in the rankings if it moves, for example, from position 7 to 3. Conversely, a move from rank 3 to 7 will be described as a ‘drop’ in the league tables. All results presented in Section 5 are based on a one-place ‘drop’ in the rankings (i.e. an increase in the actual ranking value).
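
To make the sign convention concrete, here is a trivial worked example with made-up rank values (not taken from the paper):

```python
# Rank 1 is the best position, so a positive year-on-year change in
# the rank value corresponds to a "drop" down the league table.
rank_last_year, rank_this_year = 3, 7           # hypothetical values
rank_change = rank_this_year - rank_last_year   # +4: a four-place drop
dropped = rank_change > 0                       # True
```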

12. In the UK, the number of full-time undergraduate places available to UK-domiciled students is, to a large extent, fixed exogenously by the funding councils. International students do not, therefore, displace home students, and so the proportion of international students accepted should be independent of the proportion of UK-domiciled students accepted.

13. As it happens, this variable is rarely significant, so, for reasons of space, its coefficients are not reported in what follows. There is a small effect of the lagged Guardian score on the average tariff score of applicants (half the size of the current rank's effect, and significant only at the 10% level).

14. The average annual change in ranking is also much larger for non-Russell Group institutions than for Russell Group institutions: 5.6 (Times), 13.5 (Guardian), 22.5 (THES) and 15.4 (ARWU) for non-Russell Group institutions, compared with 2.7 (Times), 4.7 (Guardian), 13.9 (THES) and 8.7 (ARWU) for Russell Group institutions.

15. Unless, of course, prospective students are aware of and heed such methodological changes, in which case they could dismiss the information value of league tables (or, at least, sudden and large annual changes).

16. The only exception is that the inclusion of institution-specific time trends turns the effect of a drop in the ARWU rankings on international student fees positive (i.e. fees increase when an institution falls down the ARWU rankings), which appears counterintuitive and is not corroborated by any of the other results. Results are available upon request.

17. Percentage changes here and elsewhere in the paper are calculated on the basis of the mean value for the institutions with non-missing rankings information.
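
For concreteness, the arithmetic behind such a figure might look like the following (purely illustrative numbers, not taken from the paper):

```python
# A hypothetical coefficient of -50 applications per one-place drop,
# relative to a mean of 5,000 applications among institutions with
# non-missing rankings, would be reported as a 1% fall.
coefficient = -50.0
mean_applications = 5000.0  # mean over ranked institutions only
pct_change = 100 * coefficient / mean_applications  # -1.0 (%)
```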

18. These models assume that league tables are mutually exclusive and that students use only one of the rankings in deciding which universities to apply to. In practice, however, it is possible (and likely) that candidates consult a range of rankings and make a decision based on some weighted average of universities' positions across the various league tables. To test for this, a new variable was created which measures an institution's average ranking across the Times and Guardian guides, and the main analysis was re-run. The results are available upon request, and are very similar to those obtained here. I also ran a regression in which both the Guardian and Times rankings were entered simultaneously. Again, the results are similar to the main results, and are available upon request.
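
A minimal sketch of the averaged-rank variable, with hypothetical data and column names:

```python
import pandas as pd

# Each institution's position is averaged across the two league tables
# to form the combined regressor described in the note.
df = pd.DataFrame({
    "institution": ["A", "B", "C"],
    "times_rank": [10, 25, 60],
    "guardian_rank": [14, 20, 72],
})
df["avg_rank"] = df[["times_rank", "guardian_rank"]].mean(axis=1)
```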

19. These are the universities of: Birmingham, Bristol, Cambridge, Cardiff, Edinburgh, Glasgow, Leeds, Liverpool, Manchester, Newcastle, Nottingham, Oxford, Sheffield, Southampton, as well as Imperial College London, King's College London, the London School of Economics and Political Science, Queen's University Belfast and University College London. For more information, see: http://www.russellgroup.ac.uk/

20. These results are confirmed by regressions re-run on the broader group of institutions that were consistently ranked in the top tercile of the Times and Guardian rankings: a 10-place drop in the rankings of these institutions leads to a fall of between 1.4% (the Guardian) and 4.7% (the Times) in the number of applications received. Terciles were derived by standardising and averaging the annual Times and Guardian rankings for each institution over the period 2002–2009, and splitting the sample into three based on this average rank.
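
The tercile construction could be sketched as follows. The data and column names are hypothetical, and within-year standardisation is an assumption for illustration (the note does not specify how the rankings were standardised):

```python
import pandas as pd

# Hypothetical institution-year panel of Times and Guardian ranks.
panel = pd.DataFrame({
    "institution": ["A", "A", "B", "B", "C", "C"],
    "year": [2002, 2003, 2002, 2003, 2002, 2003],
    "times_rank": [5, 6, 40, 38, 90, 95],
    "guardian_rank": [8, 7, 35, 41, 85, 99],
})

# Standardise each annual ranking (z-score within year).
for col in ["times_rank", "guardian_rank"]:
    panel[col + "_z"] = panel.groupby("year")[col].transform(
        lambda s: (s - s.mean()) / s.std()
    )

# Average per institution over years and across the two tables,
# then split the sample into three equal groups.
avg_rank = (
    panel.groupby("institution")[["times_rank_z", "guardian_rank_z"]]
    .mean()          # mean z-score per institution and table
    .mean(axis=1)    # then across the two tables
)
terciles = pd.qcut(avg_rank, 3, labels=["top", "middle", "bottom"])
```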

21. See note 20 for an explanation of how these terciles were derived.

22. In the case of Black and mature applications, the simple models used in this paper yield a within-institution R² of around 0.40. While this is relatively large given the parsimonious models used, it also suggests that a large proportion of the annual variation in applications to institutions is driven by factors other than rankings.

23. Although the fee cap was also raised in 2006/2007, this was part of a comprehensive package of reforms to student finance (including the introduction of student loans to cover the cost of fees and the deferral of repayments until after graduation), which meant that the net upfront cost of going to university was actually reduced (Dearden, Fitzsimons and Wyness 2011). It is therefore unsurprising that no clear effect of raising the fee cap on the relationship between university rankings and applications could be detected (results are available upon request). If anything, the data suggest that rankings may have become less important – seemingly counter-intuitive, but consistent with the reduction in net upfront costs.
