
Student access to the curriculum in an age of performativity and accountability: an examination of policy enactment

Pages 228-248 | Received 05 Feb 2018, Accepted 27 Dec 2018, Published online: 30 Jan 2019

ABSTRACT

The curriculum is often the target of reform, and governments use a range of accountability measures to ensure compliance. This paper examines the decisions schools in England make regarding history provision, in a period of curriculum change, and the potential consequences of these decisions. Drawing on a large, longitudinal data set of primary and secondary material, the study examines the relationship between the number of students entered for public examination in history in England and a range of situated and material factors. The data suggest that particular measures of accountability are effective in shaping school decision-making, but the type of school, the socio-economic nature of the school intake, and students’ prior attainment are also important factors in understanding the decisions made. The result is inequitable access to history education: this inequity exists between different types of schools and socio-economic areas, and is also evident within schools, where students with low prior attainment are less likely to be allowed to study history.

Understanding how schools respond to policy in the prevailing accountability and performativity climate is the focus of this paper. In particular, it examines how ‘situated’ (e.g. nature of the intake, local area) and ‘material’ (e.g. staffing levels, financial resource) factors (Braun, Maguire, and Ball Citation2010) intersect with policy to determine which types of students get access to specific areas of the curriculum. While the data for this paper draw on one specific area of the curriculum, namely history, to look at the decisions made by schools, its findings have implications for other parts of the curriculum.

The policy ‘climate’

The publication of international comparison data, such as the ‘Trends in International Mathematics and Science Study’ (TIMSS) and the ‘Programme for International Student Assessment’ (PISA), has been seen as one driving force behind the recent ‘policy epidemic’ in education (Ball Citation2003), observable in many countries (Connell Citation2013). These policy changes are usually well-intentioned, aiming to raise educational standards and improve the economic fortunes of states and their populations, but have met with varying degrees of success (Goodson Citation2010).

The focus on curriculum

Among these policy changes the curriculum has been a particular focus of attention, for, as Young (Citation2014, p. 8) argues, the curriculum is ‘the pre-eminent issue for all of us in education’. For many countries, the curriculum is seen as the means by which to raise educational standards to gain the advantage in an increasingly globalised economic climate and to meet the perceived needs of a knowledge economy (Ball Citation2017). Debates about the curriculum are wide-ranging, focusing on the range of subjects that should be taught and the particular knowledge to be imparted within each subject, as well as how those subjects should be related to one another and how the knowledge within them should be structured and sequenced (Winter Citation2012). For example, should the curriculum develop students’ ‘core knowledge’ and cultural literacy (Hirsch Citation1993) or their disciplinary understanding (Young and Muller Citation2010)? Ultimately such debates raise fundamental questions about the purpose(s) of education.

From curriculum policy espousal to enactment within a performative culture

Although a government may espouse a particular policy, any reform needs to be enacted at a local level. It is therefore important to understand both the decisions that are made by schools and the factors that influence them. This is particularly true if we accept that the existing performativity climate presents schools with difficult choices.

The influence of neo-liberal thinking in the process of policy implementation appears to give decision-making powers and resource to those at the local level (Adams Citation2016; Olssen and Peters Citation2005). However, the use of ‘policy technologies’ (Ball Citation2003), such as arms-length governance, is designed to ensure that reforms are implemented and their impact monitored; targets are set and performance is measured, and the results are then used to count, classify and measure outcomes, and ultimately to dictate further policy decisions (Ball Citation2015). According to Ball (Citation2003) the process of de-regulation is effectively one of ‘re-regulation’, whereby the ‘evaluative state’ (Adams Citation2016) attempts to exercise control over local decision-making.

This presents a challenge for schools. As Davies and Hughes (Citation2009) demonstrate, there are a series of potential policy ‘fractures’ that can occur within and between different stages of policy implementation. One issue is the potential for ‘ideological’ and ‘agency’ fractures, as competing and contradictory policies are developed and pushed into the public domain by different organisations, both inside and outside government. These unresolved contradictions are replicated further down the system, and ultimately individual schools and teachers have to interpret the demands made of them and make choices between difficult and sometimes irreconcilable options. Among the challenges that many schools and teachers face is the tension between meeting accountability measures, upon which schools are judged, and pursuing desirable educational goals (Braun et al. Citation2011; Solomon and Lewin Citation2016).

The outcomes of this process of implementing change are varied. Braun, Maguire, and Ball (Citation2010) show how schools respond differently to the same policies in a study that used semi-structured interviews to gather data from key decision makers in particular case-study schools. School responses were shaped by a complex interaction between four factors: ‘situated’ (to do with the nature of the locality and intake), ‘material’ (to do with the deployment of resources), ‘professional’ (such as the values and beliefs of those involved) and ‘external’ (levels of support from external agencies, reputation) (Braun, Maguire, and Ball Citation2010). The four schools in their study were all state-maintained, co-educational high schools, with examination results around the national average, but they were located in different socio-economic areas and had different reputations within their localities. The two schools with poorer reputations tended to be more susceptible to policy changes, feeling that they had to adapt and adopt policies to enhance their standing, whereas the schools that were considered ‘successful’ were more self-confident in what they were doing, and therefore tended to assimilate changes into existing practice or simply to ignore policies.

Yet some policies seem to have greater significance because of their associated performativity measures and can fundamentally reorient what happens within schools. This is evident in Solomon and Lewin’s (Citation2016) study of a single school in a deprived area which suffered from high levels of unemployment. Although the school had higher than average examination results for students in the senior school (pupils aged 14 to 16), a regulatory inspection had criticised the lower school curriculum (for pupils aged 11 to 14). To address this concern, the school wanted to introduce a radical change in how its curriculum was constructed and how learning spaces were used, but interview data from senior staff showed that they felt constrained by the need to maximise performance in examination results. Thus, tracking of student progress and target-setting tended to dominate the actions and thoughts of teachers. As a consequence, the decision to experiment with a more personalised learning experience, with an emphasis on promoting more self-directed learning, was abandoned.

The perceived significance of some performance measures means that particular policy initiatives can be given greater prominence by schools. Maguire, Braun, and Ball (Citation2015) found, for example, that in England the 2005 requirement to include English and maths results in the public reporting of examination results gave these subjects much higher status within the curriculum, attracting more favourable staffing, resourcing and timetabling, and therefore provided greater access to these subjects for students. This had a detrimental impact on how other subjects were perceived and supported within the school.

Curriculum reform in England

In recent years there have been significant shifts in curriculum policy in England. At the heart of these changes has been the debate about what type of knowledge should be promoted in the curriculum. Under the New Labour government at the start of the millennium, there was a drive towards greater genericism in the curriculum, which looked to develop transferable skills, such as information handling and critical thinking. Schools were encouraged to experiment with curricula where the value of subjects per se was seen as less important. This trend was reversed under the Conservative-Liberal coalition government of 2010 and subsequent Conservative governments (2015 onwards), and there has been a more ‘traditional’ emphasis on a subject-based curriculum. This has led to the government pushing schools to teach a range of more ‘academic’ subjects, especially in the examination years (for students aged 13/14–16, who typically study for their General Certificate of Secondary Education, or GCSE, qualifications). These subjects specifically include English, maths, sciences, a modern language and a humanities subject (defined narrowly as either history or geography). These have become known as the English Baccalaureate (or EBacc) subjects and are seen as academically challenging, and therefore as a way of raising standards across the educational landscape. They are also seen as facilitating subjects for access to higher education (Russell Group Citation2013/14), thereby promoting social mobility, and, in the case of history, as a source of social cohesion (Harris Citation2013). To encourage schools to embrace this policy the government now publishes schools’ success in the range of EBacc subjects as one of its key performance indicators in the annual publication of school league tables. The EBacc was first introduced in 2010, and in 2017 the government announced that by 2025 it wanted 90% of students to be studying a combination of EBacc subjects.

Within this context history provides an interesting insight into the decisions schools make regarding the curriculum. History has never been a compulsory subject at examination level, and has a reputation for being a demanding subject, partly due to its abstract and conceptual nature (e.g. Wineburg Citation2007), as well as the literary demands it imposes. Given this context, what choices do school leaders make? Do they look to optimise their school’s performance in relation to the designated measures, perhaps by manipulating students’ access to certain subjects? Or do they pursue more general educational goals such as access to a broad and balanced curriculum and allow students genuine freedom of choice over their examination subjects?

Bergh (Citation2015) argues that there is a need to explore the relationship between performance measures and the nature of schools’ responses more fully. Previous studies have indicated that there are unintended consequences to these curriculum reforms, e.g. there are discrepancies in who gets access to particular areas of the curriculum. Tinsley and Board (Citation2017) show that disadvantaged students are less likely to have an opportunity to study a foreign language, and an earlier study by Harris, Downey, and Burn (Citation2012) similarly found some evidence that students from poorer socio-economic backgrounds had more restricted access to history within the school curriculum. This study also noted an association between the time that schools allocated to the subject within Key Stage 3 (KS3, which covers the first two or three years of secondary school for students aged 11–13/14 in England) and students’ subsequent decisions to continue studying history for GCSE. Given the government’s ambition for the vast majority of students to study subjects like history, it is important to understand how schools react to curriculum reform.

Research in this area tends to take the form of case studies (e.g. Solomon and Lewin Citation2016), which have provided in-depth insight into some of the issues, but have necessarily been focused on a few instances of school decision-making. By contrast, the present paper draws upon a large dataset to examine what schools actually do as they deal with curriculum policy initiatives, and explores the relationship between curriculum reform and the school ‘situated’ factors identified by Braun, Maguire, and Ball (Citation2010) along with some of the ‘material’ factors, showing how this relationship impacts on students’ access to the curriculum.

Methodology

Research questions

The data reported in the current study are taken from an annual national online survey conducted on behalf of the Historical Association (HA). The surveys reported here span the period 2010–2014. The survey was originally launched to address a lack of empirical data about the state of history in secondary schools, such as the time allocated to the subject within the curriculum, the proportion of students studying the subject, and teachers’ reactions to reforms. The lack of official data collected by the Department for Education, combined with teachers’ anecdotal concerns reported to the HA, highlighted a need to monitor the specific decisions that were being taken across a wide range of schools. The current article addresses the following research questions, looking at situated factors (school type, nature of the intake) and material factors (time allocation) in relation to access to history in the curriculum:

  • 1: What is the relationship between the number of students taking GCSE history in Year 10 (see Note 1) and:

    1. School type

    2. Socio-economic status (of the area in which the school is situated)

    3. Academic outcomes for schools (achievement and progress measures)?

  • 2: What is the relationship between the number of students taking GCSE history in Year 10 and the KS3 time allocation for history in schools?

  • 3: Is there a relationship between the attainment level of students permitted to study history at GCSE and:

    1. School type

    2. Socio-economic status (of the area in which the school is situated)

    3. Academic outcomes for schools?

Data collection

The survey data combine the HA annual survey and Department for Education (DfE) performance table data taken from the years 2010–2014. The HA survey data focus on specific issues relating to developments within the history curriculum, while the DfE performance data provide additional contextual information about the performance of those schools which responded. The objective was to create a detailed, longitudinal data set, to identify key trends and determine the impact of policy and curriculum developments on history education in England. Each spring term, invitations to complete the online questionnaire were sent to all secondary schools in England. Over the five years of the survey, responses were received from 2156 secondary schools in total, varying between 8% and 12% of all schools in England from year to year (see Table 1 for response rates by year). The survey data were examined to remove within-year duplicates (multiple responses from the same school within the same year); where there were duplicates, responses were kept from the most senior teacher within the history department or, if seniority could not be discerned, the most complete set of responses was kept. However, duplication across years was permitted and was taken into account when conducting cross-year analysis (see results section for more details). Fifty-eight per cent of the total responses were from comprehensive schools, 25% were from academies (pre- and post-2010), 10% from independent schools and 8% from state-run grammar schools. Schools that categorised themselves as ‘sixth-form colleges’ (for students aged 16–18/19) or ‘other’ constituted less than 3% of total responses.
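As a minimal sketch of this within-year deduplication step, assuming hypothetical column names ('school_id', 'year', a 'seniority_rank' where lower values indicate a more senior respondent, and an 'n_complete' count of answered items), the logic could be expressed in Python as follows:

```python
import pandas as pd

# Hypothetical survey frame: one row per questionnaire response.
survey = pd.DataFrame({
    "school_id": ["A", "A", "B", "C", "C"],
    "year": [2010, 2010, 2010, 2011, 2011],
    "seniority_rank": [2, 1, 1, 3, 3],   # assumed: 1 = head of department
    "n_complete": [40, 35, 47, 30, 44],  # assumed: number of items answered
})

# Keep one response per school per year: the most senior respondent,
# breaking ties by the most complete set of answers.
deduped = (
    survey
    .sort_values(["seniority_rank", "n_complete"], ascending=[True, False])
    .drop_duplicates(subset=["school_id", "year"], keep="first")
)
print(deduped)
```

Note that duplicates across years are deliberately retained here, matching the cross-year analysis described above.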

Table 1. Survey responses by year.

The English education system is diverse in the types of schools that exist. Most schools are state-funded and include comprehensive schools, academy schools (which differ from comprehensives in that they are independent of local educational authority control and are funded directly from central government) and grammar schools. Grammar schools exist in only a few educational regions and select students by examination at the age of 11, which means that comprehensives and academies tend to be the most common types of school and cater for students from across the ability range. Academies created pre-2010 were situated in areas of high deprivation, where the existing schools were perceived as having little impact on young people’s academic outcomes; the intention was to give schools more freedom to meet the needs of their students. This freedom was extended in 2010 so that any school could gain academy status. Consequently, pre-2010 academies reflect a very specific context, whereas this is not the case for those adopting the status after 2010. Academy schools are therefore categorised in the data as pre- or post-2010 to reflect this difference. Most schools take students aged 11–18 years, but in some regions students move at the age of 16 to specialist sixth-form colleges which take students aged 16–18 years.

The combined data consist of 47 items in total. Not all respondents completed all questionnaire items and therefore missing data were coded accordingly to allow for accurate statistical analysis of responses. Seven items relate to school characteristics such as type of school (e.g. comprehensive, grammar, independent), age range of school (e.g. 11–16 years), postcode and number of students on roll and provide detail about the situated nature of the schools (see Appendix A for example questionnaire items).

The schools’ postcodes from the HA surveys were used to cross-reference the survey data with information about individual school performance taken from the Department for Education performance tables (https://www.compare-school-performance.service.gov.uk/). This permitted the analysis of history provision in relation to a range of school contextual and external factors. These included data related, for example, to the percentage of students obtaining five A*–C grades at GCSE (including English and mathematics) and to the percentage of students identified with special educational needs (SEN). A measure of socio-economic status (SES) was obtained for each school based upon an Income Deprivation Affecting Children Index (IDACI) which provides a score and rank for social deprivation for each postcode in England (see http://imd-by-postcode.opendatacommunities.org/). The remaining 40 survey items relate to the organisation and the delivery of the history curriculum (e.g. length of KS3, number of students taking history as a GCSE subject in Year 10, etc.).
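To illustrate this linkage step, a hedged sketch in pandas follows; the file and column names are assumptions for demonstration, not the authors’ actual extracts:

```python
import pandas as pd

# Assumed inputs: the cleaned HA survey plus two postcode-keyed lookups --
# DfE performance-table data and IDACI deprivation scores.
survey = pd.read_csv("ha_survey.csv")         # includes a 'postcode' column
dfe = pd.read_csv("dfe_performance.csv")      # e.g. 'pct_5ac_em', 'pct_sen'
idaci = pd.read_csv("idaci_by_postcode.csv")  # 'postcode', 'idaci_score'

# Left joins keep every survey response, even where no match is found;
# unmatched schools simply carry missing values into the analysis.
merged = (
    survey
    .merge(dfe, on="postcode", how="left")
    .merge(idaci, on="postcode", how="left")
)
print(merged.shape)
```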

Data analysis procedures

A large number of items in the survey are discrete point responses that were then coded numerically when entered into SPSS® for analysis. As a result, the dataset includes a range of categorical variables (e.g. type of school), count variables (e.g. time allocation Year 7) and continuous variables (e.g. IDACI score). Appendix B contains a detailed description of the variables included in the analysis. Following Velleman and Wilkinson (Citation1993), the main dependent variable ‘% history uptake in Year 10’ is treated as a count or continuous variable in this study because, even though the variable comprises discrete values (1–5), the values are positive integers and the distance between these values is consistent. The same applies to the ‘time allocation for history in Year 7’ variable, which was also treated as a continuous variable for the analysis. Alpha levels for significance testing were set to 0.05 for all statistical tests. The distributions of the three primary variables (‘% history uptake in Year 10’, ‘IDACI score’ and ‘% EBacc achievement’) were checked and found to be close enough to normal for parametric tests to be appropriate (see Note 2). The statistical analyses were conducted with SPSS and STATA, and further details of the statistical tests used are presented with the results.
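As an illustration of this kind of screening, a minimal Python sketch, assuming a CSV export of the combined dataset and hypothetical column names:

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("combined_dataset.csv")  # assumed export of the merged data

# Code a categorical variable numerically, as when entered into SPSS.
df["school_type_code"] = df["school_type"].astype("category").cat.codes

# Normality screen mirroring Note 2: Shapiro-Wilk plus skewness/kurtosis,
# with -2 to +2 as the rule-of-thumb acceptability range.
for col in ["uptake_band", "idaci_score", "pct_ebacc"]:
    x = df[col].dropna()
    w, p = stats.shapiro(x)
    print(f"{col}: W={w:.2f}, p={p:.4f}, "
          f"skew={stats.skew(x):.3f}, kurt={stats.kurtosis(x):.3f}")
```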

Results

Research question 1: What is the relationship between the number of students taking GCSE history in Year 10 and:

  1. School type

  2. Socioeconomic status (of the area in which the school is situated)

  3. Academic outcomes for schools (EBacc achievement)?

Descriptive statistics were first calculated for the period 2010–14 to show the percentage of students who take GCSE history in Year 10, arranged by school type (Table 2). The data show that just over a third of pre-2010 academies reported having only 0–30% of learners taking history at GCSE in Year 10, which is comparable with state comprehensives. This is in contrast to the relatively low proportion of post-2010 academies, state grammar schools and independent schools that reported such low levels of history uptake. Just over half of the pre-2010 academies that responded reported a history uptake of 31–60%, slightly lower than for state comprehensives. Around 70% of post-2010 academies reported a history uptake of 31–60%, which is a much larger proportion of respondents than for the grammar schools and independent schools. Conversely, the proportion of respondents reporting a GCSE history uptake in Year 10 of 61–100% was much higher among those from the grammar and independent schools than among those working in academies. In summary, the descriptive data shown in Table 2 demonstrate the differing history uptake profiles across the different school types. Both independent and grammar schools were most likely to report history uptake of between 61% and 100%, with state comprehensive schools the least likely to report a comparable level of history uptake (only 11%). An examination of the data on a year-by-year basis showed that the percentages reported in Table 2 remained relatively stable for all school types across the lifetime of the survey.

Table 2. Number of students (in percentage bands) studying history GCSE in Year 10, by school type.

A one-way ANOVA was conducted with the full dataset to evaluate whether there were significant differences in history uptake in Year 10 for different school types. The results show that there was a significant difference in history uptake between different types of schools, with a medium effect size (F = 28.485, p < .001, ηp² = 0.09). Post hoc tests indicate that state comprehensive schools had a significantly lower history uptake than all other school types, except pre-2010 academies. On the other hand, independent schools had a significantly higher history uptake than all other school types, except state grammar schools, for which uptake levels were broadly similar.
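A hedged sketch of this comparison in Python (the original analysis used SPSS/STATA; column names are assumed, and Tukey's HSD stands in because the post hoc procedure for this particular test is not named in the paper):

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = (pd.read_csv("combined_dataset.csv")
        .dropna(subset=["uptake_band", "school_type"]))

# One-way ANOVA: does mean uptake band differ across school types?
groups = [g["uptake_band"].to_numpy() for _, g in df.groupby("school_type")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_val:.4f}")

# Pairwise post hoc comparisons between school types.
print(pairwise_tukeyhsd(df["uptake_band"], df["school_type"], alpha=0.05))
```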

The association between the socioeconomic status of the area in which the school is situated and history uptake was explored by conducting a one-way ANOVA with the full dataset to investigate whether IDACI mean scores differed across schools in the different history uptake bands (see Table 3). Overall, a significant difference was found in mean IDACI scores between schools in different history uptake bands, with a medium effect size (F = 22.34, p < .001, ηp² = 0.06). Post-hoc Bonferroni tests indicate that the IDACI mean scores for schools with a history uptake of 0–15% and 16–30% were significantly higher than those for schools in the other three uptake bands (higher IDACI scores signify higher levels of deprivation).

Table 3. Mean and standard deviation of IDACI scores for schools according to the percentage band of pupils studying history GCSE in Year 10.

For a year-by-year analysis, an ordinary least squares (OLS) regression was conducted with history uptake as the dependent variable. The results indicate that IDACI score had a significant relationship with history GCSE uptake in all years except 2012 (see Table 4). However, the data did not fully satisfy all the assumptions for the use of OLS regression. Therefore, in the interests of rigour, a negative binomial regression, used for modelling count variables, was also conducted (see Appendix C for results table). In both regression models the parameters are negative (but only significant in the OLS model), which means that the proportion of students studying history at GCSE tended to fall as the level of deprivation in the school’s area increased.
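A minimal sketch of both models using statsmodels, under the same assumed column names as above (the original analysis was run in SPSS and STATA):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("combined_dataset.csv")

for year, d in df.groupby("year"):
    d = d.dropna(subset=["uptake_band", "idaci_score"])
    # OLS with the uptake band as the dependent variable.
    ols = smf.ols("uptake_band ~ idaci_score", data=d).fit()
    # Negative binomial GLM as the robustness check for count-like data.
    nb = smf.glm("uptake_band ~ idaci_score", data=d,
                 family=sm.families.NegativeBinomial()).fit()
    print(year,
          f"OLS b={ols.params['idaci_score']:.3f} (p={ols.pvalues['idaci_score']:.3f})",
          f"NB b={nb.params['idaci_score']:.3f} (p={nb.pvalues['idaci_score']:.3f})")
```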

Table 4. Regression table for history GCSE uptake as dependent variable and school IDACI score as independent variable.

Table 4 shows, for example, that in 2010, if a school was in a lower income area the percentage of students studying history would be 2.260 points lower (on the uptake band scale) than in a school in a higher income area, and that this difference is significant. The limitations of the data measurement mean that the R-squared value is relatively low for the OLS model. However, if the uptake of history GCSE data consisted of the actual percentage uptake rather than uptake bands (count data), it is likely that the R-squared values would be much higher.

To investigate how school academic outcomes relate to history GCSE uptake, the percentage of pupils achieving the EBacc was used as the measure of attainment for each school from 2011 onwards (the year the measure was first reported). Data for academic outcomes for individual subjects by individual school are not publicly available. However, data for the number of students who achieve the EBacc in each school are published annually. For students to be counted within the EBacc measure they have to take either history or geography as one of their GCSE courses. Therefore, in lieu of history-specific attainment data, the EBacc results can be used as a proxy measure to judge the relationship between schools with high history GCSE uptake and high academic outcomes in this subject area. The data displayed in Table 5 show that for all schools from 2011–2014, those with the highest percentage of history GCSE uptake were also those that reported the highest percentage of students achieving the EBacc, and vice versa. The results of a series of one-way ANOVAs show that the difference in the percentage of EBacc achievement between history uptake bands was significant for all years except 2012, with medium to very large effect sizes (2011: F = 19.46, p < .001, ηp² = 0.20; 2012: F = 1.30, p = .271, ηp² = 0.02; 2013: F = 11.794, p < .001, ηp² = 0.10; 2014: F = 3.728, p = .006, ηp² = 0.08).

Table 5. Descriptive statistics of the percentage of students studying GCSE history and school EBacc outcomes.

Furthermore, the results of a series of Pearson correlations (see Table 6) to measure the relationship between history GCSE uptake in Year 10 and school attainment (EBacc measure) show that there was a significant if modest correlation between history GCSE uptake and attainment for all years combined, as well as for each year individually, except 2012. The lack of significant correlations in 2012 is likely to be accounted for by the lower response rate to the survey in that year and a more even spread of EBacc results across the GCSE history uptake bands, which is due to a small number of schools reporting relatively high EBacc achievement despite a reported low history GCSE uptake. This analysis implies that schools with high levels of GCSE history uptake have high levels of achievement in the EBacc subjects, but such schools are typically in the independent and grammar school sector, or, where schools are state-maintained, tend to be in areas of higher socio-economic status.
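A short sketch of these correlations, overall and per survey year (again with assumed column names):

```python
import pandas as pd
from scipy import stats

df = (pd.read_csv("combined_dataset.csv")
        .dropna(subset=["uptake_band", "pct_ebacc"]))

# Pearson correlation for all years combined, then per year.
r, p = stats.pearsonr(df["uptake_band"], df["pct_ebacc"])
print(f"All years: r={r:.3f}, p={p:.4f}")
for year, d in df.groupby("year"):
    r, p = stats.pearsonr(d["uptake_band"], d["pct_ebacc"])
    print(f"{year}: r={r:.3f}, p={p:.4f}")
```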

Research question 2: What is the relationship between the number of students taking GCSE history in Year 10 and the KS3 time allocation for history in schools?

Table 6. Results of Pearson correlations between percentage GCSE history uptake and percentage of students achieving the EBacc.

Table 7 displays the number of schools in each history average time allocation band, by school type (years 2010–2014). Overall, only a small proportion of schools reported offering less than 45 minutes of history per week in Year 7. Over a third of comprehensives, state grammar schools and post-2010 academies reported offering more than 90 minutes of history per week. In contrast, the proportion of pre-2010 academies offering more than 90 minutes was only 22%. Furthermore, pre-2010 academies reported offering the least amount of time for history in Year 7, with over a third reporting a time allocation of 46–60 minutes.

Table 7. Number of schools in each history average time allocation band, by school type (years 2010–2014).

To investigate the relationship between time allocation in Year 7 and history GCSE uptake in Year 10, a series of Spearman correlations was conducted, and the results show that there was a significant correlation between the time allocation for history in Year 7 and Year 10 GCSE history uptake across all years (see Table 8). While the correlations were significant, they were arguably rather weak. This is due to the nature of the data measurement; if the actual time allocation (in minutes) and the actual percentage uptake were recorded, we would expect to see stronger correlations between these two variables.
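Spearman's rank correlation is the natural choice here because both measures are ordinal bands; a minimal sketch, with the same assumed columns:

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("combined_dataset.csv")

# Rank-based correlation between the Year 7 time-allocation band
# and the Year 10 GCSE uptake band.
rho, p = stats.spearmanr(df["y7_time_band"], df["uptake_band"],
                         nan_policy="omit")
print(f"rho={rho:.3f}, p={p:.4f}")
```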

Table 8. Correlations between Year 7 history time allocation and Year 10 GCSE uptake.

The vast majority of schools offered a three-year KS3, and there was no statistically significant correlation between the length of KS3 and Year 7 time allocation for history. As such, the analysis indicates that the length of the Key Stage did not appear to influence GCSE history uptake in Year 10.

To evaluate the interaction of IDACI scores, Year 7 time allocation and history GCSE uptake, an ordinary least squares regression analysis was undertaken (Table 9), with the percentage of students studying history as the dependent variable and IDACI score and Year 7 time allocation as the independent variables. The results show that time allocation in Year 7 had a positive and significant relationship with Year 10 history uptake for all years except 2012, even when taking into account IDACI scores. These results suggest that if schools within a lower income area increased the time allocation for history in Year 7, then GCSE history uptake might increase. While these findings are significant, the R-squared values once again remain low due to the way the data were collected. Again, if the variable ‘average time allocation in Year 7’ recorded the actual number of minutes devoted to history teaching, it is likely that the R-squared values would be much higher and we would have a more accurate indication of how the number of minutes spent on history teaching in Year 7 influences the number of students who go on to study history in Year 10. This is an important point to consider for future research.
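A hedged sketch of this two-predictor model, run year by year as in Table 9 (column names assumed as before):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("combined_dataset.csv")

# Year-by-year OLS: does Year 7 time allocation predict Year 10 uptake
# once the IDACI deprivation score is controlled for?
for year, d in df.groupby("year"):
    d = d.dropna(subset=["uptake_band", "idaci_score", "y7_time_band"])
    fit = smf.ols("uptake_band ~ idaci_score + y7_time_band", data=d).fit()
    print(year, fit.params.round(3).to_dict(), f"R2={fit.rsquared:.3f}")
```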

Research question 3: Is there a relationship between the attainment level of students permitted to study history at GCSE and:

  1. School type

  2. Socioeconomic status (of the area in which the school is situated)

  3. Academic outcomes for schools?

Table 9. Regression table for history GCSE uptake as dependent variable and school IDACI score and Year 7 time allocation as independent variables.

Table 10 displays data regarding the nature of the choice offered to students in relation to history GCSE and the type of school (2012–2014). The data indicate that a majority of schools allowed pupils to choose history at GCSE or to choose between history and geography. Moreover, only a very small number of comprehensives and post-2010 academies claimed to restrict access to history for students identified as lower attaining (and sometimes described as ‘lower ability’).

Table 10. History GCSE options by type of school (2012–2014).

Nevertheless, even though a large proportion of schools claimed to have open access to history GCSE, the data in Table 11 show that around a third of comprehensives and post-2010 academies that responded to the question, and around 20% of pre-2010 academies and independent schools, actively discouraged lower attaining students from selecting history as a GCSE subject.

Table 11. Students steered towards history by school type (2012–2014).

The results of a one-way ANOVA show that there was a significant difference in IDACI scores between schools in terms of the way in which they steered students towards history GCSE (F = 5.523, p < .001, ηp² = 0.05). A Bonferroni post-hoc analysis shows that schools that steered students towards history GCSE if they had a predicted score of C or above had a significantly higher IDACI score than schools in the other three groups. However, while schools in more deprived areas may have actively encouraged students with a higher chance of achieving C or above at GCSE in history, the data do not support the assertion that schools in deprived areas were more likely than other schools to actively discourage lower attaining learners from taking history at GCSE, as the data indicate that this, in fact, happened across a range of school types. Finally, the survey data were analysed to ascertain how prior attainment may have influenced entry to the EBacc (and therefore EBacc subjects including history), across school type. For convenience, data are reported for all years 2011–2014 in total rather than for each year individually. The results displayed in Table 12 show the percentage of students entered for the EBacc by prior attainment.

Table 12. Percentage of students entered for the EBacc by prior attainment.

Even though there were a small number of responses from grammar and independent schools, the data show that there was a clear trend across all school types for the majority of students entered for the EBacc to have higher prior academic attainment. The concern here is for those students with low prior attainment and whether they are being denied access to an area of the curriculum that may potentially be beneficial to them.

Discussion

The findings from the examination of the datasets give some clear indications about how schools respond to accountability measures. The introduction of the EBacc has led to a rise in the number of students entered for subjects like history, and the indications are that increasing numbers of students are being entered for this range of subjects (JCQ Citation2016). However, the data presented here show that this rise is unevenly spread across schools, and highlight the fact that a complex range of situated, and to a lesser extent material, factors affects how schools respond to such initiatives in how they organise their curriculum and access to aspects of that curriculum.

At one level, schools are responding in ways that the government wants, i.e. more students are being entered for subjects like history, promoted through the EBacc (JCQ Citation2016). This shows that such forms of arms-length governance can influence school decision-making. It can be argued that the introduction of the EBacc has been a positive move on the grounds that subjects like history should be studied by all students to the age of 16 (as Cannadine, Keating, and Sheldon Citation2011 have argued in the case of history), and it thus helps to ensure that all students have a broad and balanced curriculum. However, schools also appear to be in a ‘catch-22’ situation regarding entries to the EBacc – the government has indicated that it wants at least 90% of students to take the EBacc subjects by 2025, while reforms to GCSE have allegedly made the examinations more challenging. So schools are being pressured into entering more students for specific subjects and simultaneously being judged by how many students do well in these areas. Since the nature of the EBacc measure means that schools are being judged in terms of their students’ raw attainment, rather than their progress over time, they have responded by entering more middle and high-attaining students. Overall, the introduction of this performance measure appears to have created an inequitable situation regarding access to subjects like history. To address this, the government has attempted to encourage schools to enter lower attaining students for EBacc subjects by introducing another performance measure, known as Progress 8. This was introduced in 2016 and measures a student’s progress (rather than raw attainment) at the age of 16, over eight subject areas taken at examination level (which have to include the EBacc subjects plus any three other subjects studied). Early indications of the impact of this are mixed; Burn and Harris (Citation2017) found that although fewer schools reported deliberately steering students away from studying history, many teachers felt that the increased challenge of the GCSE exam would mean lower attaining students were more likely to struggle. Hence, many schools still continue to restrict access to the history curriculum. This appears to be an example of a policy fracture (Davies and Hughes Citation2009), where government actions to increase uptake of subjects like history are countered by other government reforms that increase the degree of challenge in examinations.

Schools are then left to decide how to navigate this policy landscape, and the result can be inequitable access to particular areas of the curriculum. This inequity can be seen in the situated factors, explored in the data, that influence a young person’s access to history. One is school type. Independent and grammar schools, which have a reputation for being more ‘academic’, do have a significantly higher proportion of students who study history at examination level. Given the ‘facilitating’ nature of the subject for access to higher education, this is likely to perpetuate the situation where students from such schools are more likely to gain access to prestigious universities. Although there are many state-maintained comprehensive and academy schools that also have large numbers of students opting to study history, these schools are overwhelmingly in areas of higher socio-economic status. Although a link between socio-economic factors and school decision-making clearly exists, the data presented here suggest that the nature of this relationship is uncertain. The findings do, however, raise concerns about the type of curriculum available to students from poorer economic backgrounds. Students from such backgrounds are less likely to opt, or perhaps even to be given the chance to opt, for ‘academic’ subjects like history. This seems to be an issue affecting other areas of the curriculum, as Tinsley and Board (Citation2017) highlight similar concerns in relation to foreign language learning. This raises serious questions about the development of a ‘two-tier’ curriculum, where lower attaining students are denied access to subjects available to their higher attaining peers (Harris and Burn Citation2011). Although there may be perfectly legitimate reasons for this, lack of access to ‘facilitating’ subjects is a possible factor in these students being less likely to apply for high-status universities (Doward Citation2017), and thus may contribute to social immobility.

Another factor that appears to be a strong influence over a young person’s access to subjects like history is prior attainment. Students with high- or medium-level prior attainment on entry to secondary school are increasingly likely to be entered for EBacc subjects like history. However, those with low prior attainment are significantly less likely to be entered. By definition such students will not be found in grammar schools and, because of the persisting correlation between socio-economic advantage and academic outcomes, are rarely to be found in independent schools either. They are far more likely to be found in areas of low socio-economic status. While there may be legitimate reasons for offering students opportunities to undertake more practical subjects, at present it appears that those attending schools in areas of low socio-economic status have restricted access to important areas of the curriculum and to ‘powerful knowledge’ (Young and Muller Citation2010), and social immobility is likely to be reinforced. The potential exists that such young people will be condemned to a self-perpetuating cycle of low attainment, which will also affect the outcomes of their families in future generations (e.g. Griggs and Walker Citation2008).

While some of the factors that we have shown to be important – type and locality of school – are obviously fixed, individual schools have the power to make their own curricular decisions; for example, about how much time is allocated to different subjects. There is some correlation between how much time is given to history in the lower part of the secondary school and how many students choose to study it as an optional public examination; the reason for this association is not entirely clear, but the amount of time allocated to the subject may send out messages to students about its perceived value, or it may be that departments considered to be already effective are given more curriculum space by senior school leaders. Schools also have the power to decide who can and cannot opt for a subject like history. Although most schools do provide students with a free choice of subjects, there is a significant minority that are making choices for students (e.g. Burn and Harris Citation2014). This is a concern as it is not clear whether such choices are being made in the interests of individual students or whether, within a culture of performativity and accountability, schools are making decisions that will cast them (rather than their students) in a ‘positive’ light, depending on the accountability measure being used to judge ‘success’.

The range of factors that appear to restrict young people’s access to particular areas of the curriculum, especially those from poorer socio-economic areas and with lower levels of attainment, is concerning. Although there are differing views about the nature of school history and what students should learn in history, there are strong arguments that history ought to be studied by all students. Advocates of ‘cultural literacy’ (Hirsch Citation1987) argue for ‘the need to create a public sphere of knowledge that enables all cultural groups to engage with common issues: that is, issues that go beyond people’s local culture’ (Lambert Citation2011, 254–255). For those who advocate a disciplinary approach to studying a subject (e.g. Cain and Chapman Citation2014), the genuine concern is that ‘students whose own experience is least likely to offer them other means of access to the powerful knowledge that derives from disciplinary thinking’ (Harris and Burn Citation2011, 259) are denied access to new ways of seeing, understanding and thinking about the world in which they live. Within the current policy context, however, it would appear that performance measures act as a powerful lever in determining who gets access to a range of subjects. This can actually undermine attempts by schools to provide students with a broad and balanced education. Given the high-stakes accountability system in which schools operate, and at a time when the government has taken steps to increase the difficulty of public examinations, it is clear that schools are responding to this measure by encouraging more students to take subjects associated with the EBacc, but at the same time it appears that only those from higher socio-economic backgrounds and/or with suitably high levels of prior attainment are able to take advantage of these developments. Such is the unintended consequence of this current policy driver. What both governments and schools need to consider is what their priorities are and how best to meet them – at present, there is a danger that some schools’ decision-making processes prioritise meeting accountability measures, rather than the needs of some students, and are thereby creating a two-tier curriculum.

Acknowledgments

We would like to thank Prof Suzanne Graham and Dr Daisy Powell for their helpful advice and comments in preparing this article.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Notes on contributors

Richard Harris

Richard Harris is an Associate Professor in history education at the University of Reading. His interests mainly relate to history education, especially issues relating to the curriculum, as well as the place of diversity within the history curriculum and the public and political discourse around history education.

Louise Courtney

Louise Courtney is a lecturer in Language and Education at the University of Reading, Institute of Education. Her research interests include the qualitative and quantitative analysis of the influence of contextual variables, such as teaching time and teacher expertise, and individual learner variables, on learner engagement, motivation and attainment in modern foreign languages and education more broadly.

Zain Ul-Abadin

Zain Ul-Abadin is a PhD economics student at the University of Reading. His research focuses on subject choices and returns to education. He previously completed an MSc in business and financial economics.

Katharine Burn

Katharine Burn is Associate Professor of Education at the University of Oxford where she acts as Director for the Oxford Education Deanery, a multi-strand partnership that supports research engagement and knowledge exchange with local schools.

Notes

1. Year 10 (Y10), students aged 14–15, is traditionally the year in which students start their GCSE examination courses.

2. Normality and homogeneity of variance assumptions for both variables were assessed by examining histograms and normality tests for the dataset as a whole and for yearly data. Normality tests (Shapiro–Wilk) indicated that the ‘% history uptake in Year 10’ scores were not normally distributed (S-W .91, df 1743, p < 0.0001, skew. −.230, kurt. −.697), as was also the case for IDACI scores (S-W .87, df 1831, p < 0.0001, skew. 1.282, kurt. 1.225) and ‘% EBacc achievement’ (S-W .90, df 1315, p < 0.0001, skew. 1.136, kurt. .936). In all cases, however, the histograms and normal Q-Q plots suggest that deviations from normality were not severe, and the values of skewness and kurtosis were in the acceptable range of −2 to +2 (George and Mallery Citation2010). Therefore, following Field (Citation2013), it was decided that parametric tests were robust enough to cope with the slight deviations from normality given the large sample size.

References

  • Adams, P. 2016. “Education Policy: Explaining, Framing and Forming.” Journal of Education Policy 31 (3): 290–307. doi:10.1080/02680939.2015.1084387.
  • Ball, S. 2003. “The Teacher’s Soul and the Terrors of Performativity.” Journal of Education Policy 18 (2): 215–228. doi:10.1080/0268093022000043065.
  • Ball, S. 2015. “Education, Governance and the Tyranny of Numbers.” Journal of Education Policy 30 (3): 299–301. doi:10.1080/02680939.2015.1013271.
  • Ball, S. 2017. The Education Debate. 3rd ed. Bristol: Policy Press.
  • Bergh, A. 2015. “Local Quality Work in an Age of Accountability – Between Autonomy and Control.” Journal of Education Policy 30 (4): 590–607. doi:10.1080/02680939.2015.1017612.
  • Braun, A., M. Maguire, and S. Ball. 2010. “Policy Enactments in the UK Secondary School: Examining Policy, Practice and School Positioning.” Journal of Education Policy 25 (4): 547–560. doi:10.1080/02680931003698544.
  • Braun, A., S. Ball, M. Maguire, and K. Hoskins. 2011. “Taking Context Seriously: Towards Explaining Policy Enactments in the Secondary School.” Discourse: Studies in the Cultural Politics of Education 32 (4): 585–596. doi:10.1080/01596306.2011.601555.
  • Burn, K., and R. Harris 2014. “Historical Association Survey of History in Schools in England 2014.” http://www.history.org.uk/news/news_2303.html
  • Burn, K., and R. Harris 2017. “Historical Association Survey of History in Schools in England 2017.” https://www.history.org.uk/ha-news/categories/455/news/3452/history-in-schools-2017
  • Cain, T., and A. Chapman. 2014. “Dysfunctional Dichotomies? Deflating Bipolar Constructions of Curriculum and Pedagogy through Case Studies from Music and History.” The Curriculum Journal 25 (1): 111–129. doi:10.1080/09585176.2013.877396.
  • Cannadine, D., J. Keating, and N. Sheldon. 2011. The Right Kind of History. Basingstoke: Palgrave MacMillan.
  • Connell, R. 2013. “The Neoliberal Cascade and Education: An Essay on the Market Agenda and Its Consequences.” Critical Studies in Education 54 (2): 99–112. doi:10.1080/17508487.2013.776990.
  • Davies, P., and J. Hughes. 2009. “The Fractured Arms of Government and the Premature End of Lifelong Learning.” Journal of Education Policy 24 (5): 595–610. doi:10.1080/02680930903125145.
  • Doward, J. 2017. “Wrong A-Level Choices Prevent Poorer Students Gaining Elite University Places.” The Guardian, August 14. https://www.theguardian.com/education/2017/aug/12/poor-students-miss-out-on-elite-universities
  • Field, A. 2013. Discovering Statistics Using IBM SPSS Statistics. London: Sage.
  • George, D., and M. Mallery. 2010. SPSS for Windows Step by Step: A Simple Guide and Reference, 17.0 Update. 10th ed. Boston: Pearson.
  • Goodson, I. 2010. “Times of Educational Change: Towards an Understanding of Patterns of Historical and Cultural Refraction.” Journal of Education Policy 25 (6): 767–775. doi:10.1080/02680939.2010.508179.
  • Griggs, J., and R. Walker. 2008. “The Costs of Child Poverty for Individuals and Society.” Joseph Rowntree Foundation. https://www.jrf.org.uk/sites/default/files/jrf/migrated/files/2301-child-poverty-costs.pdf
  • Harris, R. 2013. “The Place of Diversity within History and the Challenge of Policy and Curriculum.” Oxford Review of Education 39 (3): 400–419. doi:10.1080/03054985.2013.810551.
  • Harris, R., C. Downey, and K. Burn. 2012. “History Education in Comprehensive Schools: Using School-Level Data to Interpret National Patterns.” Oxford Review of Education 38 (4): 413–436. doi:10.1080/03054985.2012.707614.
  • Harris, R., and K. Burn. 2011. “Curriculum Theory, Curriculum Policy and the Problem of Ill-Disciplined Thinking.” Journal of Education Policy 26 (2): 245–261. doi:10.1080/02680939.2010.498902.
  • Hirsch, E. D. 1987. Cultural Literacy: What Every American Needs to Know. New York: Vintage Books.
  • Hirsch, E. D. 1993. “The Core Knowledge Curriculum – What’s behind Its Success?” Educational Leadership 50 (8): 23–30.
  • JCQ. 2016. “GCSE, Project and Entry Level Trends, 2016.” https://www.jcq.org.uk/examination-results/gcses/2016
  • Lambert, D. 2011. “Reviewing the Case for Geography, and the ‘Knowledge Turn’ in the English National Curriculum.” The Curriculum Journal 22 (2): 243–264. doi:10.1080/09585176.2011.574991.
  • Maguire, M., A. Braun, and S. Ball. 2015. “‘Where You Stand Depends on Where You Sit’: The Social Construction of Policy Enactments in the (English) Secondary School.” Discourse: Studies in the Cultural Politics of Education 36 (4): 485–499. doi:10.1080/01596306.2014.977022.
  • Olssen, M., and M. Peters. 2005. “Neoliberalism, Higher Education and the Knowledge Economy: From the Free Market to Knowledge Capitalism.” Journal of Education Policy 20 (3): 313–345. doi:10.1080/02680930500108718.
  • Russell Group. 2013/14. “Informed Choices: A Russell Group Guide to Making Decisions about Post-16 Education.” https://www.russellgroup.ac.uk/media/5272/informedchoices-print.pdf
  • Solomon, Y., and C. Lewin. 2016. “Measuring ‘Progress’: Performativity as Both Driver and Constraint in School Innovation.” Journal of Education Policy 31 (2): 226–238. doi:10.1080/02680939.2015.1062147.
  • Tinsley, T., and K. Board. 2017. “Language Trends 2016/17. Language Teaching in Primary and Secondary Schools in England: Survey Report.” British Council. https://www.britishcouncil.org/sites/default/files/language_trends_survey_2017_0.pdf
  • Velleman, P. F., and L. Wilkinson. 1993. “Nominal, Ordinal, Interval, and Ratio Typologies are Misleading.” The American Statistician 47 (1): 65–72.
  • Wineburg, S. 2007. “Unnatural and Essential: The Nature of Historical Thinking.” Teaching History 129: 6–12.
  • Winter, C. 2012. “School Curriculum, Globalisation and the Constitution of Policy Problems and Solutions.” Journal of Education Policy 27 (3): 295–314. doi:10.1080/02680939.2011.609911.
  • Young, M., and J. Muller. 2010. “Three Educational Scenarios for the Future: Lessons from the Sociology of Knowledge.” European Journal of Education 45 (1): 11–27. doi:10.1111/j.1465-3435.2009.01413.x.
  • Young, M. 2014. “What Is a Curriculum and What Can It Do?” The Curriculum Journal 25 (1): 7–13. doi:10.1080/09585176.2014.902526.

Appendix A. Example questionnaire items

Appendix B. Detailed description of variables used in the analysis

Appendix C. Results of negative binomial regression