International Review of Sociology
Revue Internationale de Sociologie
Volume 26, 2016 - Issue 3
Themed Section/Section Thématique: Politics of Numbers: Sociological Perspectives on Official Statistics

Measuring university-based research in Europe: competing indicators and underlying assumptions

Dora Gambardella & Rosaria Lumino
Pages 424-439 | Received 30 May 2016, Accepted 28 Sep 2016, Published online: 24 Nov 2016
 

ABSTRACT

Modern societies have a growing need for information and numbers to govern social life. Numbers can represent a complex reality in a simplified, linear and easily communicated form. Far from being the product of a merely technical process, numbers result from a process that ‘is fundamentally social – an artifact of human action, imagination, ambition, accomplishment’ (Espeland, W.N., and Stevens, M.L., 2008. A sociology of quantification. European journal of sociology, 49 (3), 401–436, p. 431). In the modern policy-making climate, numbers become key mechanisms for simplifying, classifying, comparing and evaluating. Along with this, the fields of visibility of evaluative objects, meanings and understandings (Dean, M., 2010. Governmentality: power and rule in modern societies. 2nd ed. London: Sage) are re-framed consistently with what Clarke, J. (2004. Changing welfare, changing states: new directions in social policy. London: Sage) terms a ‘performance/evaluation nexus’ that links effort, values, purposes and self-understanding to measures and comparisons of outputs (Ball, S.J., 2012. Performativity, commodification and commitment: an I-spy guide to the neoliberal university. British journal of educational studies, 60 (1), 17–28). In this paper, we focus on the field of higher education (HE), where numbers, in the form of performance indicators, benchmarks and headline targets, are frequently used to orient the sector strategically towards the objectives and goals of the Bologna Process and of the overall Europe 2020 agenda (Waldow, F., 2014. From Taylor to Tyler to No Child Left Behind: legitimating educational standards. Prospects, 45 (1), 49–62). We aim to offer a comparative overview of the complex spectrum of metrics provided at the supranational level within the field of higher education, focusing on the European Research Area (ERA), in order to map and analyse some of the crucial issues in play.
A second ambition of this paper is to move from a mapping and analytical perspective to a deconstruction of a specific subset of research metrics, with the aim of challenging the ‘self-evident truths’ and the dominant conventional wisdom that define current European metrics, and of questioning whether they contribute to restructuring universities’ research environments, affecting research policies and procedures. Performance indicators are posited to be ‘conceptual technologies’, encompassing theoretical and normative assumptions that shape the objects they aspire to measure.

Acknowledgements

This article is the outcome of the authors’ collaboration. However, in order to ascribe responsibility, we declare that Dora Gambardella authored Sections 3.1 (Aims and methodology) and 3.2 (Results). Rosaria Lumino wrote Section 2 (Steering the field of Higher Education by numbers: how the performance evaluation nexus matters) and Section 3 (Research indicators in Europe: what we measure?). The Introduction and the Concluding remarks are co-authored. We are grateful to the guest editor and the journal editors for their comments and suggestions on early drafts of this article.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes on contributors

Dora Gambardella is Associate Professor at the Department of Social Sciences, University of Naples, Italy. Her main research interests concern theories and methods of policy evaluation, as well as the comparative study of reforms and change in social policies. Among her latest publications is Evaluative knowledge and policy making: beyond the intellectual virtue of téchne (McGraw-Hill Education, 2015), co-authored with R. Lumino.

Rosaria Lumino is a postdoctoral research fellow at the Department of Political Science, University of Naples Federico II. Her main research interests are in the fields of evaluation studies and minimum income schemes. Among her latest publications is Evaluative knowledge and policy making: beyond the intellectual virtue of téchne (McGraw-Hill Education, 2015), co-authored with D. Gambardella.

Notes

1. On this topic, the social science literature offers many contributions. A more recent perspective can be found in Daston and Galison (2007).

2. The Autonomy Scorecard offers a typical instance of the European benchmarking strategy. Launched in 2009 with funding from the European Commission, the scorecard is designed to rate and rank national HE systems by measuring, scoring and weighting different elements of institutional autonomy (Estermann et al. 2011).

3. That is, educating part of the workforce, advancing economic development and performing beneficial research (Juhl and Christensen 2008).

4. Since 2004, ‘Classifying European Institutions of Higher Education’ has been working with an associated pilot project, the European Multidimensional University Ranking System, aimed at mapping multiple dimensions of excellence (e.g. teaching, innovation, community engagement and employability). For further details see: http://www.umultirank.org/.

5. European University Data Collection (Eumida 2010) uses only two core indicators to study the feasibility of a sustainable European system of data collection on the activities and performance of European HE institutions in the areas of education, research and innovation: research-intensive status (a dichotomous variable) and the number of doctorates awarded.

6. To date, three types of clusters have been identified: ‘ERA-compliant’, organisations implementing some or all of the ERA actions with high intensity; ‘Limited compliance to ERA’, organisations implementing some of the ERA actions with low intensity; and ‘ERA not applicable’, organisations in which research is a minor activity or in which the implementation of the ERA actions is not compatible with their mandate (ERA 2015).

7. Other ERA priorities concern optimal transnational cooperation and competition among national programmes, an open labour market for researchers, gender equality in research, and access to, circulation and transfer of scientific knowledge, plus a cross-cutting focus on international cooperation.

8. Many reports stress the need to balance peer review and indicator-based assessment, even if ‘informed peer review’ is recommended, in association with transparency of method and reviewers’ competence.

9. This structure differs from Hansen’s more articulated proposal (2010), which divides first-order indicators, aimed directly at measuring research performance, into input, process, structure and results indicators; results indicators can be further divided into output and effects indicators. We return to these classifications in the presentation of results.

10. However, collecting these data proves quite complex: owing to differences in national legislation, patents are not always clearly linked to the HE organisations that financed them.
