Editorial

World university subject rankings: a threat for the sociology of sport?

University rankings have finally reached sport and movement studies. In September 2017, ShanghaiRanking published a Global Ranking of Sport Science Schools and Departments (http://www.shanghairanking.com/Special-Focus-Institution-Ranking/Sport-Science-Schools-and-Departments-2017.html). In February 2018, QS (Quacquarelli Symonds) followed and published the QS World University Rankings by Subject 2018: Sports-Related Subjects (https://www.topuniversities.com/university-rankings/university-subject-rankings/2018/sports-related-subjects).

With regard to the acknowledgement of sport as an academic subject, this could be considered a success – at least for those universities ranked at the top of the list. For these ‘top universities’, ranking positions obviously play a relevant role in their PR strategies, as can be seen on their homepages. The statements imply that ‘the higher the ranking, the better the university’.

It is difficult to argue against this assumption – at least as long as these rankings are based on a fair and valid comparison of similar institutions. But is this the case in the rankings of sports-related subjects? In order to answer this, one has to critically examine the criteria which are used to quantify quality differences within sport-related research at universities.

QS World University Rankings by Sports-Related Subjects 2018 apply four criteria: two based on expert ratings and two on a citation analysis of data sourced from Scopus. These four criteria are (https://www.topuniversities.com/subject-rankings/methodology):

  1. Academic reputation: Respondents had to list up to 10 domestic and 30 international institutions which they consider to be excellent for research in the given area.

  2. Employer reputation: Employers were asked to identify up to 10 domestic and 30 international institutions in their respective field they consider excellent for the recruitment of graduates.

  3. Citations per paper: Citations data spanning a five-year period were sourced from Scopus.

  4. h-index: Data were sourced from Scopus.

ShanghaiRanking’s Global Ranking of Sport Science Schools and Departments is based on three main criteria, which are further differentiated into five weighted sub-criteria; a sketch of how these weights might combine into a composite score follows the list (http://www.shanghairanking.com/Special-Focus-Institution-Ranking/Methodology-for-Sport-Science-Schools-and-Departments-2017.html):

  1. Research output, which is operationalized as the number of papers indexed in Web of Science (20%).

  2. Research quality, which is measured by the number of citations of papers published by an institution (20%), the number of citations per paper (25%), and the number of papers published in top 25% journals (25%).

  3. International collaboration, measured by the percentage of an institution’s publications with international co-authorship (10%).
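
The methodology page cited above specifies the weights but not how the underlying indicators are normalized before being combined. Purely for illustration, the following minimal Python sketch combines hypothetical indicator values using the stated weights, scaling each indicator against a (hypothetical) field-best value; the indicator names, the numbers, and the max-scaling step are assumptions, not ShanghaiRanking’s actual procedure.

    # A minimal, purely illustrative sketch of how the five weighted sub-criteria listed
    # above could be combined into one composite score. The weights come from the
    # methodology quoted in the text; the indicator values and the normalization
    # (scaling each indicator against the best-performing institution) are assumptions.

    WEIGHTS = {
        'papers_wos': 0.20,             # research output: papers indexed in Web of Science
        'citations_total': 0.20,        # research quality: citations of an institution's papers
        'citations_per_paper': 0.25,    # research quality: citations per paper
        'papers_top25_journals': 0.25,  # research quality: papers in top 25% journals
        'intl_collaboration': 0.10,     # share of papers with international co-authorship
    }

    def composite_score(institution, best_values):
        """Weighted sum of indicators, each scaled against the best observed value."""
        score = 0.0
        for indicator, weight in WEIGHTS.items():
            best = best_values[indicator]
            scaled = institution[indicator] / best if best else 0.0
            score += weight * scaled
        return 100 * score

    # Hypothetical department compared against hypothetical field-best values.
    dept = {'papers_wos': 420, 'citations_total': 9000, 'citations_per_paper': 21.4,
            'papers_top25_journals': 260, 'intl_collaboration': 0.38}
    best = {'papers_wos': 800, 'citations_total': 30000, 'citations_per_paper': 35.0,
            'papers_top25_journals': 600, 'intl_collaboration': 0.55}
    print(round(composite_score(dept, best), 1))

Whether ShanghaiRanking scales, ranks, or standardizes the raw counts in some other way is not stated on the cited page; the point of the sketch is only that the final score is a weighted aggregate of field-dependent indicators.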

While ShanghaiRanking’s Global Ranking of Sport Science Schools and Departments appears to be based on objective data only, QS World University Rankings by Sports-Related Subjects 2018 also use subjective data collected through an expert survey to rank the analyzed institutions. Expert ratings of academic and employer reputation are not unusual in research rankings. However, their validity depends on the size and representativeness of the collected data: how many people were asked, from which subjects, and with what academic background. The problem in this regard is that institutions that conduct sport-related research and teach sport-related subjects differ substantially from each other in their internal structure. While some institutions consist only of research groups in biomechanics, human movement studies, physiology, and sports medicine, others also comprise research groups in physical education, sport sociology, and sometimes even sport philosophy. Ranking these different institutions using the same measurements is highly problematic, even if the respondents form a representative sample of the academics employed by sport-related institutions worldwide. In the institutions ranked at the top of QS World University Rankings by Sports-Related Subjects 2018, for example, the average proportion of sport sociologists is significantly lower than the average proportion of academics in exercise physiology. Even if the percentages of excellent sport sociologists and excellent exercise physiologists are equal, the probability that an institution with an excellent sport-sociological research group is identified by this methodology is therefore lower than the probability that an institution with an excellent exercise physiology group is identified. In fact, given that more exercise physiologists than sport sociologists participate in the survey, whether a sport sociology research group gets credit for being part of an ‘excellent’ institution depends less on the quality of its own research than on the quality of the exercise physiology research at the same institution.
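
This size effect can be made concrete with a small, purely hypothetical simulation. It assumes (a) that respondents to the reputation survey are drawn in proportion to the sizes of the sub-disciplines and (b) that respondents mainly recognize and nominate excellence within their own field; the respondent shares and nomination probabilities below are invented for illustration and are not taken from the QS methodology.

    # A purely hypothetical simulation of the argument above: two institutions with
    # equally excellent research groups, one in exercise physiology and one in sport
    # sociology, receive very different numbers of survey nominations simply because
    # exercise physiologists outnumber sport sociologists among the respondents.
    import random

    random.seed(1)

    # Assumed composition of the respondent pool (invented numbers):
    N_RESPONDENTS = 1000
    SHARE_PHYSIOLOGY = 0.8   # exercise physiologists; the rest are sport sociologists

    def nominations(excellent_in):
        """Count respondents who nominate an institution excellent in one sub-discipline."""
        count = 0
        for _ in range(N_RESPONDENTS):
            field = 'physiology' if random.random() < SHARE_PHYSIOLOGY else 'sociology'
            # Respondents are assumed to recognize excellence mostly within their own field.
            p_nominate = 0.6 if field == excellent_in else 0.05
            if random.random() < p_nominate:
                count += 1
        return count

    print('Nominations, excellence in exercise physiology:', nominations('physiology'))
    print('Nominations, excellence in sport sociology:    ', nominations('sociology'))

With these invented parameters, the institution whose excellence lies in exercise physiology receives roughly three times as many nominations as the one whose excellence lies in sport sociology, even though the underlying research quality is defined to be identical.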

Not only the subjective but also the apparently objective categories are problematic with regard to creating a fair and valid ranking of sport-related research institutions. The number of citations per paper, for example, does not depend primarily on the quality of the research but rather on the size of the academic community and its culture of citing peers. Comparisons should therefore be restricted to comparable sub-disciplines rather than lumping the entire, heterogeneous bundle of sport-related subjects into one ranking. A similar criticism applies to the h-index, which is the largest number h such that h of a researcher’s publications have each been cited at least h times within a specific time frame. Even though some sport sociologists have relatively high h-indices, h-index values in sport sociology and other sport-related social scientific sub-disciplines are usually significantly lower than in sport-related natural scientific sub-disciplines or sports medicine, not least because the number of peers is smaller. Furthermore, for researchers in sport sociology it makes a huge difference on which basis the h-index is calculated. Calculations based on Google Scholar, for example, usually yield significantly higher h-indices for sport sociologists than calculations based on Scopus or the Web of Science. The explanation is simple: many journals that are important to the sport sociological community, including our journal, the European Journal for Sport and Society, were indexed by Google Scholar at the time the data for the sport-related world rankings were collected, but not in Web of Science, and some not even in Scopus. The h-index of a sport sociologist can therefore differ significantly depending on the database used (for some sport sociologists, the difference between an h-index calculated on the basis of Google Scholar and one based on Scopus or the Web of Science amounts to more than 20 points).
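
To make the database effect concrete: every paper that a database does not index is simply missing from the h-index calculation. The following minimal sketch uses an invented publication record; the citation counts and the indexing flags are assumptions for illustration, not data from any real researcher or ranking.

    # Minimal sketch: how database coverage changes the h-index.
    # Citation counts and 'indexed' flags below are invented for illustration.

    def h_index(citation_counts):
        """Largest h such that h papers have at least h citations each."""
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, citations in enumerate(counts, start=1):
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical record of a sport sociologist:
    # (citations according to Google Scholar, indexed in Scopus/Web of Science?)
    papers = [(45, True), (38, False), (30, False), (22, True), (19, False),
              (15, False), (12, True), (10, False), (9, True), (4, False)]

    print('h-index over all papers (Google Scholar): ', h_index([c for c, _ in papers]))           # 9
    print('h-index over indexed papers only (Scopus):', h_index([c for c, idx in papers if idx]))  # 4

The gap here is only illustrative; as noted above, for some sport sociologists the difference between a Google Scholar-based and a Scopus- or Web of Science-based h-index exceeds 20 points.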

Hence, neither h-indices nor citations per paper can be regarded as valid indicators of quality differences between institutions that differ in their departmental structure. What is more, the average h-index per researcher in a sport-related institution depends to a relevant extent on the proportion of researchers from natural scientific sport-related sub-disciplines or sports medicine among its staff, not least because citation behaviour is affected by highly field-dependent factors such as the number of researchers, publication strategies, and citation culture.

Against this background, it makes a huge difference for the ranking of a faculty, school, or institute that does sport-related research which sport-related sub-disciplines are incorporated at that institution.

Putting it a little more provocatively: the world rankings of schools and departments in the field of sport-related subjects compare apples with oranges. How absurd it is to compare the research quality of structurally heterogeneous faculties, institutes, schools, and departments in a diverse research field like sport science using the same methodology becomes particularly obvious in the criterion ‘number of papers published in top 25% journals’. According to the methodology of ShanghaiRanking’s Global Ranking of Sport Science Schools and Departments, ‘the top 25% journals are those with an impact factor in the top 25% according to Journal Citation Report, 2015’ (http://www.shanghairanking.com/Special-Focus-Institution-Ranking/Methodology-for-Sport-Science-Schools-and-Departments-2017.html#2). That counting the number of articles disregards the substantial differences in size between sport science schools and departments is only one shortcoming of this criterion. Another is that journal impact factors differ substantially between the sub-disciplines of sport science. Sport science schools and departments that mainly consist of social scientific sub-disciplines therefore have a clear disadvantage compared to institutions with large sports medicine research groups.

For sport sociology specifically, one of the main problems of the world rankings of schools and departments lies in the fact that the social sciences have a completely different system of publishing research findings than the natural sciences or medicine. They not only have smaller scientific communities but also different publication strategies. For instance, in the sport-related world subject rankings, publications in edited volumes or monographs are not included in the evaluation of research quality. However, many sport sociologists still publish relevant papers in edited volumes or write books to present their research findings, because books give them a more suitable format for discussing their results in detail. Rankings that are based only on journal-related citation analyses therefore have the effect that a relevant share of sport sociologists’ publications is not taken into account when research quality is assessed.

The rankings have further shortcomings. For example, the primary focus is on English-language publications, because only a few journals that publish articles in languages other than English are included in the Web of Science and Scopus. In European countries, however, the official journals of the national societies for the sociology of sport often publish in the language of the respective country. One could argue that English is the world language of research and that the choice of a journal’s language is the responsibility of the national associations. This argument is certainly convincing. Nevertheless, it has to be pointed out that the world rankings capture only part of the relevant data and therefore cannot accurately assess the real amount and quality of research.

This leads us to the question: what are these rankings good for? From the perspective of sport sociology, I am tempted to say: for nothing. To discuss the question more constructively: world rankings could initiate a self-reflective discussion within the sport sociological community about whether it is reasonable to follow their logic or to ignore them. In any case, world rankings of sport-related subjects pose significant risks for the future development of sport sociology. If these rankings are taken as a basic orientation for the distribution of financial and personnel resources within universities, the most logical strategy would be to focus primarily on sub-disciplines that promise publications in top 25% journals, high h-indices, and a high number of citations. In this sense, world rankings of sport-related subjects may create ‘realities’ in which sport sociology does not have the same value as other sub-disciplines, independent of its relevance and practical impact. Such a complete abstraction from content and practical needs would contradict the Humboldtian principles of academic freedom and autonomy for educational institutions, and the idea of pursuing research for the benefit of free and open societies.

Disclosure statement

No potential conflict of interest was reported by the author.

Additional information

Notes on contributors

Ansgar Thiel

Ansgar Thiel is Director of the Institute of Sports Science at the Eberhard Karls University Tübingen and Professor of Sport Sociology. In his current research, he focuses on activity- and health-related biographies, social determinants and biopsychosocial effects of physical activity, body-related stigmatization and health, social aspects of health in elite sports, and modernization processes in the context of sport.
