Research Article

Perceptions of key education actors towards PISA: the case of Scotland

ABSTRACT

This study explores the perceptions of government officials, teachers, and parents in Scotland regarding the use of Programme for International Student Assessment (PISA) results to evaluate national education performance. International large-scale assessments (ILSAs) such as PISA increasingly influence education policymaking worldwide, but there is limited understanding of how education actors perceive these assessments. This study uses Scotland’s declining PISA performance following the implementation of the Curriculum for Excellence (CfE) reform in 2010 as a case study. Interviews with key officials involved in implementing the CfE and with parents’ and teachers’ representatives reveal doubts about the validity of PISA as a means of assessing Scottish education performance. In their view, PISA data lack comparability, and there is no accountability attached to the assessment: students’ academic records are unaffected by PISA scores and teachers are not evaluated on them. They do not consider that the PISA results reflect the effects of the CfE reform. These views contradict the Scottish government’s policy that PISA, among other assessments, should inform policymakers and be used to evaluate reform effects. Our findings have implications for countries that routinely use PISA to assess their educational performance without considering the views of education actors.

Introduction

The increasing influence of international large-scale assessments (ILSAs) run by international organisations (IOs) in shaping national education policies has attracted global attention. In particular, the role of the Organisation for Economic Co-operation and Development (OECD) Programme for International Student Assessment (PISA) has been vital in setting global standards for the learning achievement of pupils at the secondary level. Scholars argue that through such initiatives, IOs claim a subtle yet powerful authority in pushing for convergence of schooling systems across countries (Breakspear, 2012; Grek, 2012). Despite this, the reaction of social agents towards globally diffused policies, including ILSAs, is largely ignored in the literature (Marsh & Sharman, 2009). A small body of literature discussing how national performance in PISA and other ILSAs elicits controversies among different agents concentrates mainly on specific groups of actors, such as politicians and the media (Grek, 2012; Grey & Morris, 2018). Hence, we have limited insight into the perceptions of those involved in policy implementation on the ground, including government officials and teachers (e.g. Giudici, 2020), and of those on the service recipients’ side, parents and students. This paper addresses this research gap by examining the viewpoints of these actors about the adoption of ILSAs in national education policies. Here, ‘actors’ refers to collective individuals as social agents: more specifically, central and local government officials and representatives from teachers’ and parents’ associations.

We take advantage of the unique case of the Scottish education system, which underwent a radical reform of all pre-tertiary levels in 2010, known as the Curriculum for Excellence (CfE) reform. The CfE reform fundamentally changed the curriculum for 3–18-year-olds across Scottish schools. We use this major event to examine the reactions of education actors directly involved in the reform implementation process towards PISA as a means of assessing the performance of Scottish education, especially the effect of the CfE reform on pupils’ achievement. In brief, after the CfE reform was implemented, Scotland’s PISA performance, especially in mathematics and science, gradually declined (see Figure 1). This has led to debates about the CfE reform, mainly in parliament and the media (BBC, 2016b). However, little is known about whether actors directly involved in CfE implementation, such as public officials and teachers’ unions, and parents’ associations representing service recipients, are reacting in the same way to this declining PISA performance.

Figure 1. Mean scores of Scotland and other parts of the UK in mathematics, reading and science in selected PISA waves.

Notes. (a) The vertical line indicates the implementation of the CfE in 2010. (b) Mean scores in the three skill areas are calculated using the five plausible values (ten for PISA 2015 and 2018) together with student and school survey weights. (c) The OECD average scores also include the UK.
Source. Own calculations based on PISA data (OECD, n.d.).
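For readers interested in reproducing this kind of figure, note (b) above describes the standard procedure: a survey-weighted mean is computed for each plausible value, and these means are then averaged. The following minimal sketch in Python illustrates that calculation under the assumption of a dataset with columns named in the common PISA style (PV1MATH … PV5MATH for plausible values, W_FSTUWT for the final student weight); it is illustrative only and is not the calculation code used for Figure 1.

```python
import numpy as np
import pandas as pd

def weighted_mean_over_pvs(df: pd.DataFrame, pv_cols: list, weight_col: str) -> float:
    """Average the survey-weighted means computed for each plausible-value column.

    PISA reports achievement as several plausible values per student
    (five in earlier waves, ten from 2015 onwards). A mean score is obtained
    by taking the weighted mean of each plausible value and then averaging
    across those means.
    """
    pv_means = [np.average(df[pv], weights=df[weight_col]) for pv in pv_cols]
    return float(np.mean(pv_means))

# Illustrative usage with hypothetical file and column names:
# df = pd.read_csv("pisa_scotland_2012.csv")
# maths_pvs = [f"PV{i}MATH" for i in range(1, 6)]   # PV1MATH ... PV5MATH
# print(weighted_mean_over_pvs(df, maths_pvs, "W_FSTUWT"))
```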

The CfE reform makes the case of Scotland suitable for studying the perceptions of education actors as social agents about ILSAs, and specifically about PISA. There are two main reasons for this. First, PISA is used by the Scottish government to continuously evaluate the progress of pupils and assess the impact of any educational reform, including the CfE reform (The Scottish Government, 2011b). Indeed, PISA provides a major source of rich, internationally comparative data with which to measure the performance of educational systems among participating nations every three years in mathematics, reading and science literacy (Schleicher & Zoido, 2016). Second, the Scottish government occasionally invites the OECD to evaluate the education system using its technical expertise. The OECD regularly reviews the policies and practices of countries with similar socioeconomic circumstances that perform highly in PISA (Schleicher & Zoido, 2016). Scotland invited the OECD both before and after the CfE reform was implemented (e.g. OECD, 2007, 2015). Scholars consider this ‘reaching out’ a reciprocal process which, on the one hand, allows Scotland to project its self-representation as an autonomous economy on the global stage and promote its national independence project. On the other hand, it shows the OECD’s diffusion of soft power within nations (Grek, 2012; Lingard & Sellar, 2014). These points indicate that the influence of the OECD, as an intergovernmental entity, on Scottish education policy agendas is mutually agreed.

However, while PISA is recognised as an evaluation tool by policymakers, the views of other actors towards it may differ. This is because, unlike compulsory exit exams, education actors including teachers, parents and policy implementers have no legally binding accountability or obligations related to students’ performance on PISA. PISA scores do not affect students’ academic records in school, and teachers are not evaluated on them. Hence, these actors may hold different views about using PISA to measure national-level performance. This paper examines the perceptions of government officials and of representatives of parents’ and teachers’ associations regarding this. We focus on teachers’ unions and parents’ associations at the national level in Scotland, as these arguably embody their macro-level and institutional views as pressure groups. We conducted in-depth interviews with a small sample of these actors. We also situate the Scottish case in a comparative context, showing how it can have broader implications as PISA occupies an important space in education policymaking globally (see Note 1). While limited in terms of sample size, which is partly due to the focus on macro-level actors, our study brings fresh perspectives to understanding the perceptions of education actors towards ILSAs.

Theoretical framework

The influence of ILSAs and PISA on national education policies

An established body of literature has discussed how IOs influence national educational systems worldwide by diffusing new norms and policy practices through ILSAs, including PISA. This diffusion aims to close policy gaps between countries by introducing new ideas and practices (Mangez & Hilgers, 2012; Michel, 2017; Sellar & Lingard, 2014; Steiner-Khamsi, 2012). For instance, Ringarp and Rothland (2010) document that, as a result of performing well below the OECD average in PISA 2000, Germany travelled outside its borders to Sweden and Finland to look for new education models. Somewhat similarly, Sweden, which ranked below some other countries, also started investing in new solutions to improve achievement. This phenomenon is known as policy borrowing from one country to another. In this process of transfer, education is often isolated from political, cultural and economic contexts (Steiner-Khamsi, 2004; Steiner‐Khamsi, 2006; Steiner-Khamsi & Quist, 2000).

The past few decades have seen the rapid growth of a global testing culture, not only through PISA but also through the Trends in International Mathematics and Science Study (TIMSS), the Progress in International Reading Literacy Study (PIRLS), and other ILSAs, including some recent assessments specifically targeting the Global South. However, PISA has gained special strategic prominence due to the participation of an increasing number of countries (Meyer & Benavot, 2013). In many countries, PISA results are also one of the main yardsticks for evaluating the performance of the educational system against top-scoring nations (Grek, 2009).

Past research has explored the reasons for adopting PISA and other ILSAs and their increasing impact on national education policies and public debates in various ways (e.g. Gorur & Wu, 2015; Rautalin et al., 2019; Steiner-Khamsi & Waldow, 2006; Waldow, 2009). However, the diffusion of ILSAs is discussed without considering the views of different social agents or actors as we define them here. For instance, Addey et al. (2017) present seven comprehensive reasons for countries to participate in ILSAs. These include the need to obtain ‘reliable’ evidence, increase global competitiveness, build capacity, continue to receive aid, use ILSAs as a political tool, and implement reforms based on assessment results. Empirical evidence from many OECD countries reflects these points. For instance, Ertl (2006) and Waldow (2009) provide evidence from Germany that, as a result of poor performance in PISA, the country adopted several measures including the introduction of ‘national standards’. These have led to the education system becoming more outcome-based, competency-oriented and guided by external assessments. Dobbins and Martens (2012) find similar trends in France.

Research also suggests that PISA has introduced policy contradictions between countries. For instance, Bulle (2011) argues that the factors highlighted as the reasons for Finnish success in PISA did not originate from the OECD, yet they are often promoted in other countries. Further, many of the educational policies promoted by the OECD have been linked to the gradual weakening of the academic performance of students in France over the past two decades (Bulle, 2011).

Nonetheless, as argued above, despite these controversies, the existing literature treats ILSAs as diffused by ‘invisible hands’ without elaborating who these hands are or how they perceive ILSAs. In other words, we know little about the perceptions of national stakeholders, especially those involved in education policy implementation processes. Some research has pointed out how policymakers in parliament react to national performance in ILSAs (Lundahl & Serder, 2020; Rautalin et al., 2019). For instance, Grey and Morris (2018) examine the reactions of the media and politicians to the PISA 2012 results in England, in which the former criticise the government based on PISA performance while the latter defend their policy decisions. In this way, PISA has built an invisible but powerful grid of ‘accountability’ for policymakers, including the education ministry, towards the media and relevant stakeholders.

Why an actor (agent)-based approach?

We differentiate between those involved as ‘policymakers’ and as ‘implementers’; their views towards PISA or other ILSAs may differ in many regards. On the one hand, there is the strongly held view that considers the diffusion of ILSAs as part of a political commitment by countries to be globally competitive (Addey et al., 2017). Policymakers are primarily involved in setting this type of commitment. They are also the main controllers of the use of ILSA data in education reforms. In addition, policymakers are at the forefront in considering a country’s comparative position in ILSAs. We discuss this in relation to the case of Scotland in the following section. By contrast, actors who are directly involved in policy implementation on the ground, such as parents, teachers and education officials in both central and local government, frequently have little say over whether a country should participate in ILSAs. Besides, as we have argued, their opinions on this largely remain unheard. This is similar to the cross-border diffusion of other norms and practices, in which the voice of micro-social agents stays invisible (Marsh & Sharman, 2009). Our study sheds light on the views of actors on the policy implementation side. In doing so, we focus on three specific groups of actors: parents, teachers and government officials. These actors are closely connected with education service provision (government officials and teachers) and the receiving processes (parents).

Because of the positional differences between policymakers on the one hand and actors involved in policy implementation on the other, their views towards ILSAs would arguably differ. For instance, since neither parents and teachers nor government education officials are held accountable for participation or performance in ILSAs, they are unlikely to be concerned about the results of these assessments. By contrast, countries’ comparative rankings in ILSAs may stimulate debates and controversies among policymakers criticising education policy choices and reforms.

In this study, rather than focusing on individual parents and teachers, we focus on the representatives of their collective associations in order to examine their macro-level views about ILSAs as civil society and social forces. The following section discusses the relevance of Scotland in the context of this research.

OECD and PISA in Scottish education

Scotland’s long and distinct history separates the country’s education sector from the rest of the UK. This distinctiveness existed before political devolution in 1999. Hence, the relationship of Scottish education with the OECD, including during the CfE reform, remains different from that of other member countries, and even from other devolved administrations such as Catalonia in Spain (Grek, 2012). Scholars argue that Scotland’s participation in the OECD’s PISA, and the country’s occasional reaching out to this international organisation for technical expertise, demonstrate how global forces help shape the identity of a devolved ‘nation within a nation’ (Lingard & Sellar, 2014).

To elaborate, England and Scotland have participated in PISA separately, with adequate sample sizes in the test, since 2006 (Grek, 2009). In publicly available PISA data, the UK is divided into two subnational categories: (1) Great Britain (that is, England, Northern Ireland and Wales), and (2) Great Britain: Scotland. Nonetheless, the UK’s PISA results provide data on all four nations with larger sample sizes than would be required (Wheater et al., 2013). Grek (2012) argues that this provides increased recognition of a separate status for Scotland while creating ambiguity about what the UK means in terms of education policy and educational performance. Grek further finds that while Wales and Northern Ireland accept representation on the PISA Governing Board through English policy actors, Scotland wants to have a place on the Board by sending an observer and negotiating directly with the OECD.

Increasing global competitiveness among other major economies is a key driver of joining PISA (Addey et al., 2017). UK policymakers also stress that the OECD has the expertise and competence necessary to assess the educational performance of competing nations through comparative study. Given that PISA is viewed as the ‘gold standard’ among ILSAs, strong PISA performance receives considerable media attention. Scotland’s higher performance in PISA compared to the rest of the UK and the OECD average enables the country to be more visible and helps its efforts to acquire a recognisable identity separate from the UK (Grek, 2012). Lingard and Sellar (2014) suggest that, in seeking the OECD’s expertise while participating in PISA separately, Scotland expresses its desire and motivation for independence. International visibility helps Scotland to be perceived as a ‘Scottish nation’. Indeed, the Scottish National Party (SNP) disseminated a video in 2012 emphasising the importance of PISA in representing Scotland on a global stage (Lingard & Sellar, 2014).

Despite the perceived importance of PISA to policymakers as a tracking mechanism based on the systematic gathering of data, the participation of nation-states, including Scotland, in technical meetings of the OECD is largely political. Grek’s (2012) study suggests that while OECD proceedings are quite technical in nature, there is little or no scope for debate, as decisions are highly controlled by the OECD. This does not appear to be an issue for Scotland, as its main aim is to participate in PISA and perform well in the results. While ritualistic and symbolic, PISA participation is important for small and peripheral nations such as Scotland in helping them escape the shadow of others. There is a perception among Scottish policymakers that participating in PISA, and reaching out to the OECD as a competent authority, may help to build Scotland’s reputation and provide greater visibility among other industrialised nations through strong performance (Grek, 2012).

This discussion shows that policymakers in Scotland view the OECD’s PISA as a tool to inform policymaking about the state of educational performance while also mobilising the ethos of nationalism through high visibility. However, do officials involved in policy implementation and parents’ and teachers’ representatives think in the same way? This question is addressed in this paper. The following section demonstrates how the OECD is involved in the CfE reform and how the decline in PISA performance in Scotland has led to controversies.

OECD in CfE reform: debate around the decline in PISA performance

Similar to other nations, the Scottish government occasionally invites the OECD to provide a ‘diagnosis’ for its education sector using the organisation's technical expertise. It may be said that this shows the influence of the organisation in the country’s education policymaking processes as a separate ‘nation within a nation (UK)’. Prior to implementing the Curriculum for Excellence, the Scottish Executive invited the OECD in 2006 to ‘examine the performance of its school system’ (OECD, 2007, p. 1), and to provide advice on the adequacy of recent reforms and how to make Scottish education globally competitive (OECD, 2007). This request was repeated in the post-implementation era after 2010, when the OECD was again invited to review the Scottish education system. The OECD has acclaimed the CfE reform as a ‘watershed moment’, stating that it can make the education system globally competitive (OECD, 2015).

The OECD also influences education policymaking through PISA, since PISA is one of the government-approved learning assessment datasets used to evaluate the performance of Scottish education. Thus, a government report states that ‘[t]here are also existing monitoring commitments, such as PISA and the data gathered on achievement of Curriculum for Excellence levels, which provide an insight into policy impact’ (The Scottish Government, 2017, p. 10). The government commits to continue participating in PISA to ‘inform Scottish Government about standards and trends in achievement’ (The Scottish Government, 2011a, p. 47).

Nevertheless, Scotland’s PISA performance declined gradually after the reform was implemented (see Figure 1), which led to an outcry within the education community criticising the CfE reform. Many have linked the decline in achievement scores directly to the reform. They argue that the CfE reform could be detrimental to learning because of its atheoretical and ahistorical nature, its focus on broad and general education rather than subject-specific learning, and its excessive reliance on teachers without equipping them with appropriate resources (e.g. Baumfield et al., 2010; Brown, 2014; Priestley & Humes, 2010).

Critics, including opposition politicians, blamed the incumbent party (the SNP) and the CfE reform for Scotland’s declining performance in PISA 2015. Nevertheless, the first minister defended the reform, saying that it was the right way forward for Scottish schools while also taking responsibility for the decline in achievement (Davidson, 2016; Warrell, 2016). An opposition lawmaker associated the decline with the incompetence of the government (BBC, 2016a, 2016b). Opposition politicians continued blaming the SNP for Scotland’s declining performance in PISA 2018, especially in mathematics and science. The government’s response was defensive, claiming that PISA does not cover everything needed to evaluate the whole educational system (BBC, 2019).

These controversies illustrate the influence of PISA in Scottish education and how it is used by the opposition to contest the policy choices of the incumbent regime, as Addey and associates (Addey et al., 2017) also suggest. This paper examines the views of three different groups of education actors, in a collective sense, about PISA. We enquire: what are the perspectives of some key education stakeholders in Scotland about using PISA to evaluate the country’s education system?

Methodology

Data

In order to address this question, we conducted interviews with senior education officials in central and local government and with key representatives from parents’ and teachers’ unions, using purposive and snowball sampling techniques. To interview parents’ and teachers’ representatives, we applied purposive sampling; this was judged suitable in the study context since there is only a small number of national-level unions for both groups. We focused on key representatives because our aim was to examine the response towards PISA from national or macro-level education stakeholders, who we judged would be better informed about CfE reform decisions. We contacted three key representatives from two different parents’ unions by email, of whom two agreed to be interviewed for the study. We also emailed two representatives from two different teachers’ unions, of whom one agreed to be interviewed (see Table 1). We use pseudonyms for respondents and anonymise their organisations’ names to protect their identity.

Table 1. Description of interview participants.

We used snowball sampling to select senior education officials from an education department and from local government. Directly contacting senior officials was challenging as their contact information is not published on websites. The snowball sampling strategy helped us to obtain their contact information from some of the interviewees. We contacted by email six central-level education officials who were directly involved in the implementation of the CfE, of whom four agreed to be interviewed. However, among the four, three officials were unwilling to meet in person but gave us their responses by email to an open-ended questionnaire, which was also used for the in-person interviews (see Note 2). Although we had emailed them individually, the three officials coordinated among themselves and provided one combined response (see Note 3). We interviewed one central-level senior official in person. The central-level officials included two curriculum experts, one statistician, and one evaluation officer engaged in Scottish participation in international assessments including PISA and also in running national assessments. Additionally, we emailed three education officials from local government in Edinburgh and Glasgow. Only one official, from Glasgow, agreed to be interviewed. In total, there were five in-person and three email interviews. In-person interviews were carried out at the respondents’ workplace or their preferred location using semi-structured questionnaires, with a separate version designed for each of the three types of respondents.

As with other respondents, we use pseudonyms for all officials and anonymise their organisations in the paper (see Table 1). Conducting in-depth interviews with the officials who were directly involved in carrying out the CfE reform provides meaningful viewpoints, from people with technical expertise, about the use of PISA results in evaluating educational performance.

The University of Glasgow’s School of Social and Political Sciences Ethics Forum reviewed and approved the data collection process. The objectives of the study were clearly explained to the participants. They were also assured about the confidentiality and anonymity of their personal and interview data. However, it was made clear that they could still be identified by others working in the same field. The participants were told that they could opt out at any point in the study without giving any reason. In this article, we use neutral pronouns they/them to refer to individual participants.

We coded the interview texts according to the most frequently mentioned concepts and phrases. We then divided the texts based on the main themes or concepts pertinent to answering the main research question. Subsequently, we compared the perspectives of all respondents on the respective themes.
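As a purely illustrative aside, and not the authors’ procedure (which was manual and interpretive), the kind of frequency-based first pass described above could be sketched in Python as follows; the theme labels, keyword stems and file names are hypothetical.

```python
import re
from collections import Counter

# Hypothetical themes and the keyword stems coded under each of them.
THEMES = {
    "validity":       ["valid", "measure", "sample"],
    "accountability": ["accountab", "responsib"],
    "comparability":  ["compar", "culture", "context"],
}

def code_transcript(text: str) -> Counter:
    """Count how often each theme's keyword stems occur in one transcript."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for theme, stems in THEMES.items():
        counts[theme] = sum(token.startswith(tuple(stems)) for token in tokens)
    return counts

# Illustrative usage:
# transcripts = {"central_official_1": open("interview1.txt").read()}
# for respondent, text in transcripts.items():
#     print(respondent, dict(code_transcript(text)))
```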

Findings

CfE’s initial aims

We first discussed with interviewees the overall objectives of the CfE reform to understand how far it has focused on learning outcomes to make the educational system internationally competitive. This then led us to explore whether the selected education stakeholders find PISA an appropriate tool to assess educational performance if learning outcomes and competitiveness are central to the reform.

To our questions about the main objectives of the CfE reform, all participants pointed out that it has mainly emphasised improving learning outcomes. For instance, the central education official (1) stated that:

Previously we had said, right, you need so many periods of English, so many periods of maths. So curriculum was all defined in terms of inputs and this was a curriculum being defined in terms of outcomes we want for young people. [Central education official (1)]

The official from the local city council said that the previous curriculum ‘was too basic’ as it had not taken into account issues such as technology and problem-solving skills. Thus, it needed to be refreshed so that young people can ‘be versatile and problem-solving’ by acquiring ‘deeper learning’, ‘not just that kind of surface level’.

However, while agreeing with other participants on the CfE’s main objective, the teachers’ representative was critical of the increased assessments, blaming politicians for focusing more on data gathering. They argued that ‘for a period of time, schools and local authorities worked to reduce the assessment. Mmm… but politicians have a difficulty understanding anything other than figures’. Thus, the drive for evidence of outcomes, especially in literacy and numeracy, has increased the amount of assessment, the respondent added. Similarly, the parents’ representative (1) contended that parents were optimistic that the CfE would reduce assessments and promote spending more time on learning, but that ‘[a]ctually, it hasn’t. And in fact, I suspect it increased’.

Taken together, the chief objective of the CfE reform was initially to improve learning outcomes through a revised curriculum. Meanwhile, more assessments were introduced, emphasising an evidence-based approach to evaluating students’ performance. The firm adoption of PISA as an assessment tool, as mandated by the Scottish government (The Scottish Government, 2011a), may also be a part of this approach.

Stakeholders’ perceptions about PISA to assess the impact of CfE

Although all interview participants agreed with the main objectives of the CfE reform, a number of them, drawn from all three groups, raised questions about the way learning outcomes are measured using data sources such as PISA. They were doubtful about the validity of these data for measuring the effects of the CfE. The reasons for such doubt include the limited accountability of teachers and students, and the fact that assessment questions draw on content outside the curriculum.

To elaborate on opposing views about PISA, the participants suggested that the standardised tests are not administered in alignment with the CfE framework and attributed the declining results to the ‘limitation’ of the testing system. The explicit position of four interviewees, excluding the three central officials in the group and the parents’ representative (1), was that the downward trend cannot be linked to the CfE reform.

There should rather be ‘valid data’ to measure CfE effects, the central education official (1) argued. They questioned the measurement validity of PISA stating that:

I don’t think it’s [PISA] a valid measure. And I would say the same of the Scottish Survey of Literacy and Numeracy. It’s exactly the same thing because of its no accountability at local level. People didn’t really understand what it was. […] They [teachers] are not accountable for it. They [students] won’t be told whether they passed or failed, the teachers won’t be told whether they passed or failed and then this data came out and you know press are telling us our education system’s rubbish. And it’s not. You know if you look at our national qualifications they are going up. [Central education official (1)]

The central education official (1) further remarked that there is no preparation for PISA in schools and students just ‘don’t know what it’s for, they don’t value’. They are also not accountable for the result and will not be told if they pass or fail. The official continued by saying:

Speaking to my colleague who is a maths specialist, for instance, and he was saying some of the things appear in the PISA assessment are not taught in Scottish maths programme until children are older than 15. So, you know it’s unrealistic to expect them to be able to get a high grade in those maths qualifications… Now I don’t know the detail of that. That was a maths specialist saying that. [Central education official (1)]

We asked: what could be a valid measure to evaluate the impact of CfE then? Responding to this question, the central education official (1) emphasised that national assessment data such as results from national qualifications at different grade levels could be a better alternative. This, they think, would be more valid than using PISA or other ILSAs.

The local government official also pointed to the same issues, saying that there is no accountability at the local level for PISA as participation is voluntary. Besides, in the official’s view, average Scottish people are not concerned about PISA and would assume that ‘PISA is about the leaning tower; it’s not about an exam system’. The official also emphasised that there is no accountability for teachers and students towards this ‘European assessment’, stating that ‘you have got to ask yourself how is that assessment administered in our schools’. The official shared a personal account of how PISA is administered in schools, based on their experience of working as a headteacher. On the day of the test, staff check the names of all the participating students, gather them in a room and ask them to fill in their names. The school then reassures the students, saying:

[T]his is an assessment being done as part of a big European study. Don’t worry about it; just do your best and it will be fine. [Local government official]

The local government official further contended that, for these reasons, students would not take PISA as seriously as their secondary school qualifications. Also interesting here is the ‘misconception’ in schools, as reported by the local government official, that PISA is a European study; PISA is in fact a global study involving both OECD and non-OECD members from different continents. The more important aspect of the interview, however, was the comparison with other countries. The official suggested that Scottish schooling culture is different from that of other countries and that comparing it with the PISA performance of other countries would therefore not be appropriate. The official continued:

So how is it [PISA] treated in Singapore or China? It is culturally so different. Listen to it. So, in Singapore in China, it’s like you know your country depends on you [your PISA performance]. And we don’t have that culture. I mean we just don’t.

So culturally you know the only people were using PISA at that kind of level are politicians. […] And nobody, nobody actually implementing it really cares. And that’s a problem in Scotland. It’s a big problem … international rating. I think our education system is really good. It gives a breadth of knowledge for young people. [Local government official]

As we can see, while questioning the comparability of PISA, the local government official suggested that it is more of a concern for policymakers than other education actors such as parents, teachers and also education officials themselves. This corresponds to our initial expectation that, although PISA results have led to controversies in parliament, other actors may have remained less concerned.

In a somewhat similar tone, the second representative (2) from a parents’ organisation argued that PISA data consist of a small sample of students. The data are also cross-sectional, meaning that different students are assessed in every wave. Thus, in the participant’s opinion, it cannot be said on the basis of those data that Scottish education is not doing well. We note that, while cross-sectional at the student level, the PISA assessment instruments remain consistent across waves, which allows existing research to draw comparisons between and within countries over time (e.g. Chmielewski, 2019).

This parents’ representative (2) also remarked that it matters ‘what PISA measures… and what you value’. Scottish education values more than what is measured in PISA, such as being more compassionate, having better communication skills, and continuing to learn regularly, rather than turning pupils into ‘robots’ by making them do what needs to be done. PISA-related learning, on the other hand, could be more appropriate in some other high-performing countries, as they prepare students for it. For these reasons, the parents’ representative (2) contended that ‘[p]arents are not bothered by all these data’, further arguing that ‘[a]s a parent, I am not concerned about the data’. The parents’ representative (2) also asserted that ‘I am sure it [CfE] has flaws, but it has, in my eyes, fewer flaws than anything we have done before’.

Again, we see a convergence of views between a parents’ representative and the local government official regarding the comparability of PISA.

Likewise, the teachers’ union representative maintained that ‘these results tell you only a small part of it’ and can be explained in a variety of ways. The respondent argued that PISA results are inadequate to assess Scottish education in the following way:

PISA is a particular test that doesn’t attempt to test that we train our young people in Scotland to engage with. So, what does it tell us? It tells us that compared to other countries that train their young people to do those tests, we’ve done less well. Do we want our young people to pass these tests, or do we want them to engage with a wider, more socially just curriculum? […] If we want to prepare our kids to prepare for PISA and prepare for those kinds of things, we can prepare them for that. But it’s not … it’s not … it’s not what Scottish education chose to do. We chose to develop a curriculum based on four capacities; we chose to develop a curriculum that was about something more than passing tests. [Teachers’ representative]

As we can see here, the teachers’ representative also raised concerns about the comparability of PISA results because of contextual differences. Additionally, the interviewee opined that the OECD does not take everything into consideration when it measures achievement, such as teachers’ working hours, which are very long, or Scottish qualifications such as Highers [secondary school qualifications] and apprenticeships, which have been improving. The respondent stated that:

In terms of PISA, in terms of OECD, there are factors in that. So, Scottish schools have some of the highest class sizes across the world. […] So, if teachers in Scotland had 20 in their classes as other countries do, maybe the results would be different. Teachers in Scotland have some of the longest working hours, pupil contact hours. […] So it’s easy to take single points in time out but they don’t measure everything. The main things for me and for parents in Scotland are the Highers. […] So the parents know that young people are gonna be well equipped for the world of work or for further education, higher education. [Teachers’ representative]

The four other respondents remained somewhat neutral about the reliance on PISA data to assess educational performance. The group of central education officials interviewed by email suggested that PISA is used to assess the educational performance of Scotland at the secondary level and also to measure the impact of reforms, including the CfE. They did not point out any specific weaknesses or strengths of PISA, but rather directed us to sources stating the government’s policy of continuing to rely on PISA for evaluation purposes, as mentioned earlier. For instance, one of the government documents they provided to us states that PISA and other achievement data offer insight into the policy impact of the CfE (The Scottish Government, 2017). Similarly, they provided us with another document that also supports PISA as a tool for measuring the policy impact of the CfE and other reforms. That document also asserts that Scotland will continue participating in PISA to inform the Scottish government about trends in achievement (The Scottish Government, 2011a).

Both of these documents suggest that Scotland recognises PISA as a valid measure for evaluating the country’s educational system. This conflicts with the views of the selected policy implementers and the representatives from parents’ and teachers’ unions concerning PISA. We recognise the limitations of the email interviews, which did not allow the back-and-forth communication possible with the other interviewees. Besides, since the participants were communicating with us using institutional emails, privacy concerns may have prevented them from giving critical views.

Additionally, the parents’ association’s representative (1) did not specifically question the validity of PISA but opined that the decline in results is actually related to a broader changing social phenomenon. The achievement results could be related to the socioeconomic circumstances of families: for instance, many parents cannot invest time in their families or spend more time with their children because they spend more time in jobs, sometimes even two jobs, to keep the family going. Thus, the decline in results could be related to that, although there is no clear evidence on this issue. However, the interviewee continued by saying that there could be many factors involved and that it is really complicated. ‘We tend to take a very kind of binary approach to these things. We like to kind of attribute x to y. And of course, it’s not that simple’, the parents’ association’s representative (1) added.

Indeed, the respondent’s argument about socioeconomic status (SES) being associated with declining achievement points towards the global widening of gaps between lower- and higher-SES pupils, as recent findings using PISA data suggest (Chmielewski, 2019). However, Chmielewski also finds that achievement gaps are not increasing in the UK, in either England or Scotland.

To summarise the findings from the interviews, some of the key education stakeholders, including both the architects of the CfE implementation (government officials) and delegates representing parents and teachers, do not think that PISA is a valid measure of the performance of Scottish students, let alone of the impact of the CfE reform. While some respondents did not necessarily point out any problems with PISA, their arguments did not support this assessment tool either (which we treat as neutral views).

Discussion and implications for other contexts

PISA has gained growing prominence in the global education policy landscape. Yet our study presents some sceptical views of education stakeholders in Scotland concerning the appropriateness of PISA for tracking the educational performance of the country. Specifically, as we illustrated in the previous section, half of the interview respondents did not consider PISA suitable to assess the CfE reform and the performance of Scottish education. They questioned the comparability of PISA across countries, as it does not consider diverse local contexts or what different educational systems value beyond literacy and numeracy skills. Indeed, existing research also criticises PISA for lacking comparability, as it implicitly treats the curriculum across countries as somewhat similar while overlooking cultural contexts (Gorur, 2016; Hopfenbeck et al., 2018).

Other alleged limitations mentioned by respondents include the limited accountability of teachers and students for PISA scores. The local government official said in the interview that ‘nobody actually implementing it really cares’. Some other respondents also stated that they are not ‘concerned’ about PISA data.

The purpose of PISA is not necessarily to influence specific education reforms such as the CfE. PISA was established by governments ‘to compare the quality, equity, and efficiency of their school systems on a regular basis, in terms of the learning outcomes’ (Schleicher & Zoido, 2016, p. 456) and is used by participating nations to measure the comparative performance of their schooling systems every three years, as measured by the performance of 15-year-olds in reading, mathematical and scientific literacy. Such measures are taken as indicative of the global standing of a schooling system. Thus, the Scottish government, in line with this global trend, uses PISA to evaluate the country’s education system. Hence, the rejection of PISA as a performance yardstick by some key stakeholders in Scotland goes against the government’s mandate, as discussed earlier, that information from international surveys such as PISA is to be used to ‘inform Scottish Government about standards and trends in achievement’ (The Scottish Government, 2011a, p. 47). This, nevertheless, does not mean that the respondents who were critical of PISA did not see Scottish education as internationally competitive. They rather pointed out that a more suitable way of comparing Scottish education would be to use its secondary school qualifications and subsequent achievement in higher education and careers.

PISA data are taken seriously in parliamentary debate between political parties, where the first minister took responsibility for the decline in achievement in PISA 2015 (BBC, 2016b). This aligns with Addey et al.’s (2017) reasoning about why PISA is used as a blaming instrument by political parties and opposition forces, depending on a country’s performance.

Our study shows that there is a mismatch between how policymakers envisage the application of PISA data in assessing educational performance and how implementers and other actors dismiss it. Because of this discrepancy, our findings may have implications for other countries that use PISA to inform policymaking. There is limited evidence about the views of teachers and, more importantly, policy implementers in countries that embrace PISA to evaluate the education system. We do not know whether they perceive PISA in the same way as policymakers or politicians do. This gap may exist across countries, which suggests an absence of accountability at two levels. First, the utilisation of PISA data in education policymaking does not reflect the views of education providers, especially those involved in implementing policies, or of beneficiaries, including parents and students. Second, prior to the actual PISA tests, a pilot test is administered in participating nations to check the validity and suitability of test items. However, Grek (2012) suggests that, due to the highly technical nature of PISA instruments, member countries have limited scope to participate and detect potential weaknesses in the data. This one-sided approach may give incumbent policymakers the opportunity to attribute poor performance to the survey instruments, while the opposition may blame allegedly poor policy choices. This blaming process may be further reinforced in contexts where policy implementers and parents’ and teachers’ representatives show little enthusiasm for PISA, owing to a lack of bottom-up accountability.

It would, therefore, be paramount to involve relevant policy implementers and stakeholders in deciding how much weight PISA results should carry in evaluating policy reforms. In the absence of a systematic accountability framework, the use of ‘unfiltered’ PISA results in education policymaking may have serious consequences for future decision-making.

Limitations

While we focus on some key education stakeholders, our sample is limited in size and in the diversity of its selection. Even though we selected officials who were involved in implementing the CfE and who led departments in Scottish education, a bigger sample could have brought more diverse perspectives. Similarly, parents and teachers from non-leadership positions could also diversify opinions about PISA. Future research may address these issues. Despite this, we think our study makes a unique contribution by presenting what some other major stakeholders, rather than only politicians, think about PISA.

Furthermore, our study did not include first-hand views from policymakers, although we cite various news articles demonstrating politicians’ perspectives on PISA. Finally, our study lacks students’ viewpoints about PISA. Future research could involve students when studying this area.

Ethics approval

This study was approved by the University of Glasgow’s Ethics Committee for Non-Clinical Research Involving Human Subjects (PGT/SPS/2017/063/URBAN).

Acknowledgements

I gratefully acknowledge the funding received for this study from the UK Economic and Social Research Council (ESRC) and the Scottish Graduate School of Social Science (SGSSS). I thank Keith Kintrea, Anja Giudici, Priyadarshani Joshi, and Jiaqi Liu for their invaluable feedback on earlier versions of this manuscript. I am also thankful to Ingrid Lunt, the editor of the Oxford Review of Education, and two anonymous reviewers for their helpful comments and suggestions.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Additional information

Notes on contributors

Mobarak Hossain

Mobarak Hossain is a Postdoctoral Prize Research Fellow in Sociology at Nuffield College, University of Oxford. His research areas include social stratification and mobility, sociology of education, and the political economy of educational reforms. His work has appeared in journals such as Sociology of Education, Research in Social Stratification and Mobility, International Journal of Educational Development, and Social Science and Medicine - Mental Health.

Notes

1. Since the PISA assessment includes only 15-year-olds, our analysis of the perceptions of education actors is limited to this age group. More than 85% of pupils in the PISA sample for Scotland are from Grade 11, with the remainder from Grades 10 and 12. These grade levels have been affected by the CfE reform.

2. The language in the questionnaire was slightly changed for the email interviews, as there was no back-and-forth communication, unlike the in-person interviews conducted with semi-structured questionnaires.

3. We did not notify officials that their colleagues were also contacted for interviews.

References

  • Addey, C., Sellar, S., Steiner-Khamsi, G., Lingard, B., & Verger, A. (2017). The rise of international large-scale assessments and rationales for participation. Compare: A Journal of Comparative and International Education, 47(3), 434–452. https://doi.org/10.1080/03057925.2017.1301399
  • Baumfield, V., Hulme, M., Livingston, K., & Menter, I. (2010). Consultation and engagement? The reshaping of teacher professionalism through curriculum reform in 21st century Scotland. Scottish Educational Review, 42(2), 57–73. https://doi.org/10.1163/27730840-04202005
  • BBC. (2016a). Scottish schools drop in world rankings. BBC. https://www.bbc.co.uk/news/uk-scotland-scotland-politics-38207729.
  • BBC. (2016b). Sturgeon: ‘No excuses’ for Scots school performance. BBC. https://www.bbc.co.uk/news/uk-scotland-scotland-politics-38250199.
  • BBC. (2019). Pisa: Mixed report for Scottish education in world rankings. BBC. https://www.bbc.co.uk/news/uk-scotland-50642855#:~:text=Scotland’s%20scores%20in%20the%202018,data%20to%20504%20this%20time
  • Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative effects of international benchmarking in school system performance. OECD Education Working Papers.
  • Brown, S. (2014). The “curriculum for excellence”: A major change for Scottish science education. The School Science Review, 95(352), 30–36.
  • Bulle, N. (2011). Comparing OECD educational models through the prism of PISA. Comparative Education, 47(4), 503–521. https://doi.org/10.1080/03050068.2011.555117
  • Chmielewski, A. K. (2019). The global increase in the socioeconomic achievement gap, 1964 to 2015. American Sociological Review, 84(3), 517–544. https://doi.org/10.1177/0003122419847165
  • Davidson, G. (2016). Pisa: Performance of Scottish pupils in maths and science at record low. The Scotsman. https://www.scotsman.com/news/politics/pisa-performance-scottish-pupils-maths-and-science-record-low-1400806
  • Dobbins, M., & Martens, K. (2012). Towards an education approach à la finlandaise? French education policy after PISA. Journal of Education Policy, 27(1), 23–43. https://doi.org/10.1080/02680939.2011.622413
  • Ertl, H. (2006). Educational standards and the changing discourse on education: The reception and consequences of the PISA study in Germany. Oxford Review of Education, 32(5), 619–634. https://doi.org/10.1080/03054980600976320
  • Giudici, A. (2020). Teacher politics bottom-up: Theorising the impact of micro-politics on policy generation. Journal of Education Policy, 36(6), 1–21. https://doi.org/10.1080/02680939.2020.1730976
  • Gorur, R. (2016). Seeing like PISA: A cautionary tale about the performativity of international assessments. European Educational Research Journal, 15(5), 598–616. https://doi.org/10.1177/1474904116658299
  • Gorur, R., & Wu, M. (2015). Leaning too far? PISA, policy and Australia’s ‘top five’ ambitions. Discourse: Studies in the Cultural Politics of Education, 36(5), 647–664. https://doi.org/10.1080/01596306.2014.930020
  • Grek, S. (2009). Governing by numbers: The PISA ‘effect’ in Europe. Journal of Education Policy, 24(1), 23–37. https://doi.org/10.1080/02680930802412669
  • Grek, S. (2012). What PISA knows and can do: Studying the role of national actors in the making of PISA. European Educational Research Journal, 11(2), 243–254. https://doi.org/10.2304/eerj.2012.11.2.243
  • Grey, S., & Morris, P. (2018). PISA: Multiple ‘truths’ and mediatised global governance. Comparative Education, 54(2), 109–131. https://doi.org/10.1080/03050068.2018.1425243
  • Hopfenbeck, T. N., Lenkeit, J., El Masri, Y., Cantrell, K., Ryan, J., & Baird, J.A. (2018). Lessons learned from PISA: A systematic review of peer-reviewed articles on the Programme for International Student Assessment. Scandinavian Journal of Educational Research, 62(3), 333–353. https://doi.org/10.1080/00313831.2016.1258726
  • Lingard, B., & Sellar, S. (2014). Representing your country: Scotland, PISA and new spatialities of educational governance. Scottish Educational Review, 46(1), 5–11. https://doi.org/10.1163/27730840-04601002
  • Lundahl, C., & Serder, M. (2020). Is PISA more important to school reforms than educational research? The selective use of authoritative references in media and in parliamentary debates. Nordic Journal of Studies in Educational Policy, 6(3), 193–206. https://doi.org/10.1080/20020317.2020.1831306
  • Mangez, É., & Hilgers, M. (2012). The field of knowledge and the policy field in education: PISA and the production of knowledge for policy. European Educational Research Journal, 11(2), 189–205. https://doi.org/10.2304/eerj.2012.11.2.189
  • Marsh, D., & Sharman, J. C. (2009). Policy diffusion and policy transfer. Policy Studies, 30(3), 269–288. https://doi.org/10.1080/01442870902863851
  • Meyer, H.D., & Benavot, A. (2013). PISA, power, and policy: The emergence of global educational governance. Symposium Books Ltd.
  • Michel, A. (2017). The contribution of PISA to the convergence of education policies in Europe. European Journal of Education, 52(2), 206–216. https://doi.org/10.1111/ejed.12218
  • OECD. (2007). OECD review of the quality and equity of education outcomes in Scotland: Diagnostic report.
  • OECD. (2015). Improving schools in Scotland: An OECD perspective.
  • Organisation for Economic Co-operation and Development. (n.d.). PISA data. Author.
  • Priestley, M., & Humes, W. (2010). The development of Scotland’s curriculum for excellence: Amnesia and déjà vu. Oxford Review of Education, 36(3), 345–361. https://doi.org/10.1080/03054980903518951
  • Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education policies: Does PISA have an effect? Journal of Education Policy, 34(4), 500–522. https://doi.org/10.1080/02680939.2018.1462890
  • Ringarp, J., & Rothland, M. (2010). Is the grass always greener? The effect of the PISA results on education debates in Sweden and Germany. European Educational Research Journal, 9(3), 422–430. https://doi.org/10.2304/eerj.2010.9.3.422
  • Schleicher, A., & Zoido, P. (2016). The policies that shaped PISA, and the policies that PISA shaped. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The handbook of global education policy (pp. 374–384). Wiley Blackwell.
  • The Scottish Government. (2011a). Building the curriculum 5 - a framework for assessment.
  • The Scottish Government. (2011b). Curriculum for excellence: Building the curriculum 5 a framework for assessment.
  • The Scottish Government. (2017). A research strategy for Scottish education.
  • Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New global modes of governance in education. British Educational Research Journal, 40(6), 917–936. https://doi.org/10.1002/berj.3120
  • Steiner‐Khamsi, G. (2006). The economics of policy borrowing and lending: A study of late adopters. Oxford Review of Education, 32(5), 665–678. https://doi.org/10.1080/03054980600976353
  • Steiner-Khamsi, G. (2004). Blazing a trail for policy theory and practice. In G. Steiner-Khamsi (Ed.), The global politics of educational borrowing and lending (pp. 201–220). Teachers College Press.
  • Steiner-Khamsi, G. (2012). Understanding policy borrowing and lending: Building comparative policy studies. In G. Steiner-Khamsi & F. Waldow (Eds.), World yearbook of education 2012: Policy borrowing and lending in education (pp. 3–17). Routledge.
  • Steiner-Khamsi, G., & Quist, H. O. (2000). The politics of educational borrowing: Reopening the case of Achimota in British Ghana. Comparative Education Review, 44(3), 272–299. https://doi.org/10.1086/447615
  • Steiner-Khamsi, G., & Waldow, F. (2018). PISA for scandalisation, PISA for projection: The use of international large-scale assessments in education policy making–an introduction. Globalisation, Societies and Education, 16(5), 557–565. https://doi.org/10.1080/14767724.2018.1531234
  • Waldow, F. (2009). What PISA did and did not do: Germany after the ‘PISA-shock’. European Educational Research Journal, 8(3), 476–483. https://doi.org/10.2304/eerj.2009.8.3.476
  • Warrell, H. (2016). Ambitious education reforms fail to lift UK Pisa school rankings. Financial Times. https://www.ft.com/content/1ea10484-bb14-11e6-8b45-b8b81dd5d080
  • Wheater, R., Ager, R., Burge, B., & Sizmur, J. (2013). Achievement of 15-year-olds in Wales: PISA 2012 national report: OECD programme for international student assessment. National Foundation for Educational Research (NFER).