Editorial

Statistics and statecraft: exploring the potentials, politics and practices of international educational assessment

This is the age of ‘Governance by Indicators’ (Davis, Kingsbury, & Merry, 2012). Major intergovernmental organisations (IGOs) such as the World Bank, UNESCO and the OECD produce a vast range of indicators each year. Quantification has been ramped up to such an extent that we even have indicators for such intangible things as quality of life and happiness. Indicators developed by the OECD include global statistics on energy investment, ‘Skills for Jobs’ indicators, ‘Green Growth Indicators’, ‘Trade Facilitation Indicators’ and several other widely used education, health and economic indicators. Its annual At a Glance series includes Entrepreneurship at a Glance, Education at a Glance, Health at a Glance, Government at a Glance and Society at a Glance.

Not only do we now measure a very wide range of things, we do so on a global scale, using comparative indicators. This is made feasible by the development of global indicators underpinned by assumptions of similarity and of sufficiently universal values. Although there is critique of the use and validity of comparative measures, the import of the taken-for-grantedness of comparison as a way of knowing, and of the use of comparisons in government decision-making, is, I think, underappreciated. In a 2011 paper, I asked:

By what means does PISA gain knowledge and speak with confidence about diverse cultures and distant nations? How does it acquire a ‘voice from nowhere’ (Haraway, 1988; Suchman, 2000), and become a modern-day Oracle that countries might consult for policy advice? (Gorur, 2011, p. 76)

For me, this question is about two things: (1) how the OECD, or indeed UNESCO or the World Bank, came to be regarded as authoritative sources of information; and (2) the ‘nuts and bolts’, the ‘boring’ and often neglected aspects (Lampland & Star, 2009; Star, 1999) of global infrastructures of comparative measurement – such as the development of global indicators and classifications that underpin and make possible these comparisons. As Anand puts it,

The dullness of infrastructure has political effects. It enables their various managerial authorities – officials in public utility commissions and departments of environmental services – to remain faceless. It allows their practices to remain illegible in opaque institutions. (Anand, 2017a, p. 2)

These often invisible infrastructures (Gorur, 2013) are overlooked, and the critical gaze is more often directed at influential individuals or at phenomena such as neoliberalism and globalisation as explanations for the mobilisation of ideas and the spread of practices. Scholars in STS have, however, shown how banal and seemingly unremarkable phenomena – the standardisation of plug and socket design, the width of airline seats or the level of fluoride in water, for instance – perform particular distributions and exclusions (Star, 1999). Anand’s (2017b) work on water infrastructures demonstrates how such banal infrastructures often inflict systemic injustice on marginalised communities such as slum dwellers in Mumbai and majority-black communities in Michigan.

In education, the post-war period and, later, the 1960s saw a rise in the establishment of global institutions with a mandate to support nations devastated by war and colonisation in building their economies. Building the economy meant mapping human capital and ‘manpower’ needs – and this in turn made monitoring and improving literacy and work-related skills a major preoccupation of the post-war agenda. Intergovernmental institutions such as the International Labour Organization, the World Bank and UNESCO did much to encourage the gathering of statistics in many developing nations, to create the foundations for large-scale comparison in a range of fields, including education, and to link education statistics with labour and economic statistics (Smyth, 2005, 2008).

Along with this hard power – i.e. the transfer and convergence of measurement technologies and the incorporation of national statistics into global statistics – came the soft power of these instruments and institutions: the spread of norms and desires that complements the hard policy and infrastructural tools (Stone, 2004). It continues today through international rankings that create new role models and new aspirations. It is relayed through ‘capacity building’ that enables many more middle- and low-income nations to gather statistics compatible with global frameworks, with a view to facilitating not only analysis for national governance purposes but, expressly, feeding into international comparisons – comparisons which influence policy as well as the discursive imaginary within which policy is made. IGOs also produce explicit guidelines to help other organisations to measure, with such publications as OECD Guidelines on Measuring the Quality of the Working Environment and OECD Guidelines on Measuring Trust.

The ‘teaching and learning’ activities of IGOs do much to stabilise and make durable the effects of numbers, as Grek demonstrates in this issue. Some of the influence of International Large-Scale Assessments (ILSAs) can be attributed to the media’s focus on rankings and on the performance of nations in what has come to be seen as a global education race (Hargreaves, 2011). How ILSAs are taken up, and the extent to which rankings matter in public perception and opinion, and thus in politics, may be influenced by how the media handle this phenomenon. Hamilton (this issue) examines how different stakeholders and the media insert particular ideas into public conversations, translating ILSA data into forms that are more readily accessed by policymakers. But, she concludes, while the effects of such mediatisation are difficult to determine, ILSAs affect the public imagination in more profound ways, which cannot be attributed in a straightforward manner to how they are reported in the popular media.

Grek (this issue) also finds that the popular media, on their own, play perhaps a smaller role in producing the influence of ILSAs. While the media blitz following the release of PISA results fades soon after the results are published, the numbers persist – not because they carry a persuasive logic of their own, but because they are sustained through pedagogic interventions (more soft power). Based on empirical work in Sweden, Grek demonstrates that the OECD’s Country Reviews, which involve face-to-face meetings and country visits culminating in a report on the nation’s education system, can have a profound and durable influence on policies.

These twin technologies – a global infrastructure of indicators on the one hand, and a global epistemology on the other – are the means by which the OECD is renewing its capacity to have an impact globally, argues Addey (this issue). Examining the case of the recent expansion of the Programme for International Student Assessment (PISA) in the form of PISA for Development (PISA-D), and using Ecuador as a case study, she demonstrates how the global is being inserted into the local through a coupling of PISA-D and national assessments.

Even well-resourced efforts to influence policies and promote the harmonisation and convergence of practices are seldom fully realised, as many historical studies have shown. Lingard and Lewis (this issue) take up the question of how countries – even those with similar federal governance structures – respond to ILSAs. Comparing the responses to PISA and PISA for Schools in Australia and the US, they argue that differences in the structures of federalism in the two countries account for the differences in the ways these assessments are taken up.

Large-scale assessments and comparisons are growing exponentially. More assessments – especially regional and national assessments – are being developed. PISA, for example, has spawned PISA-D and PISA for Schools, and there are plans for PISA at different age levels – for example, at the start and end of primary school. More subjects are being examined: PISA has added financial literacy, teamwork and global competence to its range of assessments. All in all, the ILSA phenomenon has expanded to such an extent that it is difficult to take it all in at once. Critiques of these assessments, while plentiful, are partial: like the six men of Indostan, each disciplinary perspective has, for the most part, examined a different aspect of this ‘elephant’. As a result, scholars from different disciplines often disagree about the nature, scope, influence, usefulness and appropriateness of ILSAs.

Attempting to explore and bridge the divide between statistical science on the one hand and politics on the other, Guadalupe (this issue) argues that critiquing ILSAs as merely scientific artefacts would miss the point, as these are also essentially political tools. At the same time, neglecting to evaluate the validity claims and other technical aspects of ILSAs on the grounds of their political nature would be a mistake, he warns. He calls for critique that recognises the politics of knowledge production and is able to grasp the dual nature of numbers as both technical and political.

What theoretical tools might aid us in exploring both the ‘science’ side and the ‘politics’ side of ILSAs? I explore this question in Towards Productive Critique of Large-Scale Comparisons in Education (Gorur, this issue). Advocating an understanding of ILSAs as ‘sociotechnical devices’, I engage with the promise that concepts from Science and Technology Studies (STS) hold for grappling with the dual nature of ILSAs as social and political objects.

The papers in this Special Section of Critical Studies in Education arise from a sustained examination of the topic The Potentials, Politics and Practices of International Educational Assessment through six international seminars[1] conducted between 2014 and 2016 by the Laboratory of International Assessment Studies,[2] funded by an ESRC grant. The idea behind these seminars was to bring together different stakeholders concerned with the production, consumption and critique of ILSAs in an effort to collectively understand the ILSA elephant. Such forums are rare – it is much more common for the measurement community to speak among themselves, and for the sociologists and the policy scholars to hold separate conversations, even when they are all talking about ILSAs. STS scholars are relatively late entrants to this conversation, though it is now becoming more common to see names such as Hacking, Porter and Desrosières in the reference lists of papers on ILSAs.

Critique in the field of ILSAs, including in this issue, has hitherto focused heavily on PISA and the Trends in International Mathematics and Science Study (TIMSS), and far less on the many regional assessments undertaken in Africa, South America and other parts of the Global South. With the focus on assessment and monitoring in the Sustainable Development Goals, more assessments are being developed and undertaken and, at the same time, more assessments are gaining attention from researchers. The developments in the Global South are important to monitor. According to UNESCO (2015), more than half of the 250 million children who cannot read or count have been in school for four or more years. The issue of poor student outcomes (however they are measured) is extremely important to address – it is an issue of social justice. But it is also a difficult one to address, given its complexity. Are ILSAs the right instruments to support accountability and raise student outcomes? If so, what kind of ILSAs? How can the nations now being drawn into the ILSA fold avoid some of the pitfalls faced by other nations? These are some of the questions that confront afresh those of us who work in the field of education and development.

There is now a very encouraging trend: an increase in interest on the part of various actors in acknowledging some of the issues with ILSAs, and an openness to conversation, to learning, to critique – and, best of all, to collaboration. There is willingness on the part of such bodies as the OECD and the Educational Testing Service to invite sociologists and anthropologists to engage with them to explore how ILSAs can be made more useful or be better utilised. And I believe there is a move, on the part of some critics, from problematisation and deconstruction towards a closer engagement with the processes of measurement (and indeed of policy and politics) that is leading to a more nuanced – and hopefully more purposeful – critique. The papers in this issue contribute to this movement.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes

1. The six seminars were held at the University of Edinburgh; Teachers College, Columbia University; Universidad del Pacífico; the University of East Anglia; Lancaster University; and Humboldt University, each of which provided administrative support, venues and hospitality.

References

  • Anand, N. (2017a). The banality of infrastructure. Items: Insights from the social sciences. [Online Journal]. Retrieved July 29, 2017, from http://items.ssrc.org/the-banality-of-infrastructure/
  • Anand, N. (2017b). Hydraulic city: Water and the infrastructures of citizenship in Mumbai. Durham: Duke University Press.
  • Davis, K. E., Kingsbury, B., & Merry, S. E. (2012). Indicators as a technology of global governance. Law & Society Review, 46(1), 71–104. doi:10.1111/j.1540-5893.2012.00473.x
  • Gorur, R. (2011). ANT on the PISA trail: Following the statistical pursuit of certainty. Educational Philosophy and Theory, 43(S1), 76–93. doi:10.1111/j.1469-5812.2009.00612.x
  • Gorur, R. (2013). The invisible infrastructure of standards. Critical Studies in Education, 54(2), 132–142. doi:10.1080/17508487.2012.736871
  • Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575–599. doi:10.2307/3178066
  • Hargreaves, A. (2011). Foreword: Unfinished business. In P. Sahlberg, Finnish lessons: What can we learn from educational change in Finland? New York and London: Teachers College Press.
  • Lampland, M., & Star, S. L. (2009). Standards and their stories: How quantifying, classifying, and formalizing practices shape everyday life. Ithaca and London: Cornell University Press.
  • Smyth, J. A. (2005). UNESCO’s international literacy statistics 1950–2000. Background paper prepared for the Education for All Global Monitoring Report 2006 – Literacy for life. Paris: UNESCO.
  • Smyth, J. A. (2008). The origins of the international standard classification of education. Peabody Journal of Education, 83, 5–40. doi:10.1080/01619560701649125
  • Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377–391. doi:10.1177/00027649921955326
  • Stone, D. (2004). Transfer agents and global networks in the transnationalization of policy. Journal of European Public Policy, 11(3), 545–566. doi:10.1080/13501760410001694291
  • Suchman, L. (2000). Located accountabilities in technology production (Working Paper). Lancaster University. Retrieved from http://www.comp.lancs.ac.uk/sociology/soc039ls.htm
  • UNESCO. (2015). Education 2030: Incheon Declaration and Framework for Action. Paris: Author.
