Research Article

Careers Hubs: pilot of a place-based school improvement network in England

Christian Percy & Emily Tanner
Pages 988-1004 | Received 16 Jun 2021, Accepted 14 Feb 2022, Published online: 02 May 2022

ABSTRACT

The Careers Hubs pilot (2018–2020) tested a place-based network model designed to support English schools and colleges in delivering career guidance, measured primarily by adherence to eight benchmarks of good practice. Using a standardised measurement tool, career guidance in Hub schools was observed to improve faster than in a matched comparison group (estimated at +1.0 benchmarks for Wave 1 Hubs, effect size 0.4, n = 1,948, p < 0.001). Results are robust to a range of modelling techniques, control variables, and adjustments for possible selection effects, suggesting that the networks did support self-assessed school improvement. A qualitative evaluation identified collaboration between Careers Leaders, leadership support, employer engagement, and regional strategy alignment as key drivers of progress.

Introduction

Career guidance is one among several important elements supporting young people to navigate education, employment, and life pathways (e.g. Anders, 2017; Hodgson & Spours, 2013). High quality and impartial guidance is one of the differentiating factors set out by Hodgson and Spours (2013) to define systems that afford either high or low opportunity for progression for young people aged 14–19. Anders (2017) uses large-scale English longitudinal datasets to describe the positive potential of guidance in navigating decision points connected to widening participation in higher education. Meanwhile, Thomsen (2017) uses qualitative longitudinal research in northern England to describe the negative potential of low-quality guidance as an additional challenge faced by disadvantaged young people at risk of exiting education for unemployment post-16. This paper reports a mixed-methods evaluation of a large-scale government pilot to improve career guidance in secondary education in England via modestly-resourced, place-based networks: “Careers Hubs”.

Hughes et al. (2021) identify Careers Hubs as one of the strengths of the current policy environment in England for building better futures for young people, while emphasising a number of challenges and weaknesses across the system. Outside of England, researchers have also discussed the value of professional networks to improve career guidance provision. Lairio and Nissilä (2002) explore the potential value of professional networks to support school-based guidance activity in a Finnish context, a value that Godden (2022) also identifies in the usage of career guidance policy documents in the Canadian province of Ontario.

Because Careers Hubs are a new policy, launched in 2018, there has been little academically published evidence on their impact or the success factors behind them. This paper helps inform the discussion around Careers Hubs as an experiment in policy design for career guidance delivery, both for the ongoing roll-out in England and for the potential investigation of the model by other jurisdictions. We present a quantitative assessment of the progress made in Careers Hubs between 2018 and 2020 compared to schools outside of Careers Hubs, with interpretation and implications for policy and practice informed by a qualitative evaluation of the programme. The outcome measure in the quantitative assessment is the change over time in a self-reported measure from schools and colleges, in which users describe their careers provision via an online survey tool called Compass, leading to a score from zero to eight. The Limitations section of this paper discusses considerations and caveats concerning the use of self-report data via this particular tool.

Policy context in England

The Education Act 2011 transferred responsibility for the delivery of career guidance for secondary-age students in England from local authorities to schools and colleges (Note 1), marking a significant change in policy and delivery (Andrews, 2011). The lack of dedicated school budgets and accountability measures led to concerns over the adequacy, quality, and consistency of provision in the years following the Act (Langley et al., 2014; Moote & Archer, 2016; Watts, 2013).

The publication of the Gatsby Benchmarks of good career guidance in 2014 provided schools and colleges with a framework, based on international evidence, intended to promote positive outcomes from career guidance, including improved transitions between education and employment and better-fitting choices of education and career pathways; details on the underlying research, rationale, and examples of each benchmark can be found in Gatsby (2014). In summary, Gatsby (2014) promoted the term “career guidance” to encompass diverse activities targeted at the secondary education phase, where good provision would be characterised by eight benchmarks:

  1. A stable careers programme.

  2. Learning from career and labour market information.

  3. Addressing the needs of each pupil.

  4. Linking curriculum learning to careers.

  5. Encounters with employers and employees.

  6. Experiences of workplaces.

  7. Encounters with further and higher education.

  8. Personal guidance.

The online survey tool, Compass, is available to schools to track their achievement of these benchmarks. Quantitative and descriptive measures are used for each of these benchmarks wherever possible, such as requiring 76% or more of students to have had a meaningful experience of a workplace by the completion of lower secondary education at the age of 16, with examples of activities that might constitute a meaningful experience provided to aid completion. The primary outcome measure in this study is based on the number of these eight benchmarks that careers provision fulfils in a given school or college.

Adoption of the Gatsby Benchmarks became a requirement in England under the government’s 2017 Careers Strategy (DfE, 2017a) and over the following three years, the majority of schools and colleges started tracking their progress against this framework (The Careers & Enterprise Company, 2020b), with the extent of progress influenced by a multitude of contextual factors (Houghton et al., 2020). The benchmarks continue to be supported in the government’s latest white paper “Skills for Jobs” (DfE, 2021).

The necessity, inherent in Gatsby’s definition of career guidance, for schools to connect students with employers and providers of post-16 and higher education has long led to the usage and development of local networks. The Careers & Enterprise Company was founded by the government in 2015 as a national body providing coordination and delivery support to improve careers provision in secondary education. It initially focused on employer engagement, as required for benchmarks five and six, by matching schools and colleges with volunteers from business, called Enterprise Advisers, who use their business experience and professional networks to support career programmes and opportunities for young people (The Careers & Enterprise Company, 2018). To quicken the pace of improvement in career guidance across all eight Gatsby Benchmarks, the government introduced a place-based network model, Careers Hubs, linking schools within localities with each other as well as with employers, further and higher education providers, careers provider organisations, and Local Enterprise Partnerships (LEPs) (Note 2).

The government launched the pilot of Careers Hubs in September 2018 to test their efficacy in facilitating improvements in school and college career guidance, in parallel with the expectation that all providers have a named Careers Leader in place to lead their career programmes (DfE, 2017a; DfE, 2018). A Careers Leader is a leadership role undertaken by someone working for a school or college, who may also have additional roles or responsibilities outside of careers. The Careers Leader is responsible for designing and overseeing the organisation’s career provision, is accountable for progression towards completing all eight Gatsby Benchmarks, and is typically appointed by the head teacher or principal (The Careers & Enterprise Company and Gatsby Charitable Foundation, 2018).

Careers Hubs

In the pilot, Careers Hubs brought together groups of 20–40 schools and colleges in a local area to improve career guidance in line with the Gatsby Benchmarks (SQW, 2020). Each Hub is a network supported by a designated, salaried Hub Lead to connect education providers with the LEP and employers. There is typically no specific physical facility associated with a Hub, although Hub Leads are often based in local authority or LEP buildings and can draw on these spaces for meetings or events. Each school was matched to an Enterprise Adviser, if not already matched, and supported by the pre-existing and separately funded Enterprise Coordinator from the LEP; together they support Careers Leaders with their strategy, plan, and connections to employers, with a budget equivalent to approximately £1,000 per school or college. Careers Leaders in Hubs had access to centrally-funded training and peer support from other schools and colleges. Schools in the 12 most disadvantaged Hubs also had access to a “virtual wallet” of funding with which to access approved careers provision by a third party organisation. Businesses with experience of supporting career guidance activities were designated as Cornerstone Employers to lead strategic engagement with employers across the Careers Hub area.

During the pilot period, Careers Hubs were launched in two waves: 22 Hubs covering 704 schools and colleges in September 2018, and 20 further new or extended Hubs covering an additional 618 institutions from September 2019. Careers Hub status was awarded on the basis of an application to The Careers & Enterprise Company and an assessment of the strength of leadership and plan, area need for career guidance support, and plans for wider benefit. In some cases, Hub proposals incorporated all or almost all eligible education providers in their local area; in other cases they incorporated a minority or a mixture of local providers. We explore the potential consequences of this as a possible selection effect in a follow-up to our core analysis. In all cases, headteachers actively consented to Hub membership. The staggered roll-out enables analysis of the different rates of self-assessed progress in careers provision between education providers in Hubs and those not in Hubs.

Place-based school improvement networks

Careers Hubs are an example of place-based school improvement networks: national initiatives that are tailored locally to address student need in under-resourced communities and improve social mobility (DfE, 2017b; Greatbatch & Tate, 2019). Targeted at areas of need in terms of economic disadvantage and level of opportunity, Careers Hubs promote collaboration between schools aimed at delivering the Gatsby Benchmarks, such as co-delivering employer engagement activities, sharing employer contacts and resources, brokering new partnerships with third party providers, and peer support among Careers Leaders through a range of fora. Continuing a long tradition of place-based reform (Fullan, 2015), Hubs represent the “middle tier” of leadership between schools and national policy.

Reviews of school networks have identified positive impacts on professional development and good practice sharing (Armstrong, 2015). For example, Armstrong (2015) discusses the Teaching School Alliances, in which schools collaborated on research and development projects, arguing that they enhanced the teaching practice of participants, increasing their innovation, motivation, and openness to sharing. There is less evidence on the direct impact of these school networks on student outcomes, which may be explained by student outcomes not being the primary focus of the networks and by the timeframes over which they were evaluated.

Gatsby Benchmarks and student outcomes

Building on literature review evidence commissioned by the Education Endowment Foundation about the value of good career guidance for young people’s outcomes (Hughes et al., 2016), new findings are emerging that suggest positive impacts from career programmes measured specifically in terms of the Gatsby Benchmarks. In the North East Local Enterprise Partnership Gatsby Pilot (2015–17), stronger performance on the Gatsby Benchmarks was associated with improved career readiness scores, using a student self-completion questionnaire (Hanson & Neary, 2020), and tentatively also with higher attainment in national exams at the end of Key Stage 4, which spans ages 14–16 (Hanson et al., 2021).

Analysis of national data supports the link between school performance against the Gatsby Benchmarks and post-16 student destinations (Percy & Tanner, 2021). Using administrative data on careers provision, higher self-reported scores for Gatsby Benchmark delivery were associated with higher chances of being in education, employment, or training (EET) after leaving school. Specifically, for each additional benchmark achieved out of eight, there was an average 1.5% increase, statistically significant at the 5% level, in the odds of sustained EET destination outcomes for the 2017/18 graduating Year 11 cohort. This analysis was based on a binomial Generalised Linear Model with controls in place for area and school characteristics, including Ofsted score, Free School Meals (FSM) percentage, local unemployment rate, school-level academic attainment (Attainment 8 score), and type and size of school (n = 2,382). For example, if a school went from zero to eight benchmarks, this model would expect the proportion in sustained KS4 destinations to increase from 92.8% to 93.5%. A similarly significant, slightly larger relationship was identified among the smaller cohort of schools providing Gatsby Benchmark data for the 2016/17 cohort of Year 11 completers.
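As a back-of-the-envelope check, this worked example can be reproduced from the reported figures alone by converting the baseline proportion to odds, applying the 1.5% increase in odds per benchmark eight times, and converting back to a proportion:

\[
\frac{0.928}{1-0.928} \approx 12.9, \qquad 12.9 \times 1.015^{8} \approx 14.5, \qquad \frac{14.5}{1+14.5} \approx 0.935,
\]

which is consistent with the reported movement from 92.8% to 93.5%.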

These emerging findings point towards the potential of the Gatsby Benchmarks and Compass tool measurement as indicators not only of the quality of career guidance but also of student outcomes hoped to follow from good guidance, with respect to at least some short-term outcomes.

Aims of this research

The policy focus in England and the emerging evidence on student impact point towards the importance of better understanding what initiatives can support schools to improve their careers provision as measured by the Gatsby Benchmarks. This research therefore sets out firstly to assess statistically the extent to which the Careers Hubs pilot (2018–2020) facilitated accelerated progress in schools achieving the Gatsby Benchmarks and secondly to understand from stakeholders when and why Careers Hubs might be expected to enhance careers provision, i.e. the factors underpinning successful implementation.

Methods

The research draws on three sources of evidence: (1) data on careers provision submitted by schools using the Compass self-assessment tool, which is merged with (2) administrative data on school circumstances and Hub participation to permit a statistical impact analysis relating Hub participation to changes in careers provision; and (3) mixed methods data (i.e. stakeholder surveys and qualitative interviews) collected as part of an evaluation project, to interpret the statistical analysis and to identify success factors for Careers Hub implementation.

This methods section first sets out the statistical impact analysis, explaining the school sample selection, the variables used, the approach to analysis, and how missing data were handled. This section ends with a description of the stakeholder surveys and qualitative interviews available for understanding implementation success.

School sample selection and description

The sample is restricted to education providers with Key Stage 4 provision (students aged 14–16), reflecting the original focus of the Gatsby Benchmarks (Gatsby, 2014). A separate modelling exercise would be needed for post-16 provision, given the different nature of career guidance and progression for the age 16–18 cohort. For instance, a dedicated version of Compass for Further Education Colleges was launched in September 2018, recognising that colleges have a different institutional setting from secondary schools and that career guidance differs across types of institution (The Careers & Enterprise Company, 2020a).

In order to be included in the dataset for each Wave, schools had to have baseline and endline Compass returns completed at least 100 days apart, to allow time for meaningful progress, and had to be non-private educational establishments with at least some education provision for students aged 14–16. The comparison group of schools for each Wave met the same criteria and had to be in neither Wave 1 nor Wave 2 Hubs.

This process results in a sample comprising 550 Hub schools and 1,398 non-Hub schools for Wave 1, and 485 Hub schools and 1,449 non-Hub schools for Wave 2. This compares to a total of around 4,000 non-private schools in England that have Key Stage 4 provision. Descriptive data can be found in the Appendix. The descriptive data suggest that Hub schools tend to have slightly more financially disadvantaged intakes, to be less likely to have selective admissions policies (especially in Wave 1), and to be lower performing, both in the judgement of the education regulator (Ofsted) and on the schools’ academic progress scores.

Dependent and independent variables

Dependent variables

The outcome metric for the impact analysis is the number of benchmarks achieved out of a total possible score of eight, as derived from the Compass evaluation tool that Careers Leaders complete to track their school’s careers provision against the Gatsby Benchmarks. The statistical analysis additionally matches on the exact number of benchmarks at baseline, which produces a measure of change over time that is more robust to regression-to-the-mean bias than a direct measure of change (e.g. Barnett et al., 2005; Yu & Chen, 2015). Nonetheless, a robustness check is applied where the dependent variable is instead the direct measure of change, to ensure our results are not sensitive to this modelling choice. The endline number of benchmarks is drawn from schools’ most recent Compass return up to the end of March 2020, capturing actual and planned activity prior to the changes imposed by the COVID-19 lockdowns.
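As an illustration only, deriving the endline measure amounts to selecting each school’s latest Compass return before the cut-off date. A minimal sketch in Stata, using hypothetical variable names (school_id, return_date, benchmarks_achieved), might look as follows.

    * Illustrative sketch only; variable names are hypothetical
    * Keep each school's latest Compass return up to 31 March 2020
    keep if return_date <= td(31mar2020)
    bysort school_id (return_date): keep if _n == _N
    rename benchmarks_achieved endline_benchmarks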

Independent variables

The core set of control variables, mostly drawn from administrative data sourced via www.compare-school-performance.service.gov.uk, cover:

  1. school level of disadvantage: via local area unemployment rates and school-level disadvantage by proportion of pupils eligible for free school meals, included because, for instance, areas with greater deprivation may find it harder to engage employers in support of career provision;

  2. school structure, which may influence the common and expected pathways for students: via school type and subtype (e.g. mainstream, special schools or alternative provision), number of pupils, any selective admissions policy, having its own sixth form, and being boys-only, girls-only, or mixed;

  3. variables for variation by geography: dummy variables for the nine regions of England and a measure of rurality;

  4. measures of school performance and academic results, which may influence overall school priorities and emphasis on careers provision: via average GCSE (Note 3) exam results and progress scores from 2017/18 (Attainment 8 and Progress 8 measures, see DfE (2020) for calculation details) and the school-level grading from the education inspectorate Ofsted (entered as a categorical variable, with “Missing” retained as a category, since schools without an Ofsted score may have different characteristics);

  5. whether the school falls into a government-designated Opportunity Area, being neighbourhoods of high deprivation in receipt of additional state support (DfE, 2017c).

The nearest neighbour matching draws on all the variables above, as well as a further set of core controls related to the Compass self-evaluations. The most important of these is the baseline Compass score, against which the dependent variable is compared to measure progress up to March 2020. For Wave 1 Hubs, launched in September 2018, the baseline score is from their latest Compass return from the 2017/18 academic year up to end July 2018. For Wave 2 Hubs, launched in September 2019, the baseline score is drawn from their latest return from the 2017/18 or 2018/19 academic years up to end July 2019. The baseline score, measured from zero to eight benchmarks, is included as an exact match requirement in the nearest neighbour implementation, so a school’s performance is only ever compared against a school (or schools) with the same initial Compass score.

We note that a longer time gap between the recording of baseline and endline scores is likely to be correlated with greater progress, given the general improvement trends in England over this period (The Careers & Enterprise Company, 2020b). Since Hub schools are encouraged to complete Compass scores on a regular basis, and might have more points in time at which to report progress, a control is included for the number of days between the baseline and endline score for each school.

Approach to statistical analysis

The unit of analysis is individual schools, seeking to understand how any change in the number of benchmarks they achieve is related to whether or not they were part of a Hub. Since Hub roll-out was not randomised, it is possible that Hub schools differ in structural ways from non-Hub schools, in ways that might affect their careers provision. In order to better compare Hub school progress against non-Hub schools, we apply a nearest neighbour statistical approach to report an average treatment effect on the treated (ATET), identifying the most similar non-Hub school (or schools, if scores are identical) to each Hub school and vice versa, as measured with Mahalanobis distance across a set of control variables (implemented via teffects nnmatch with robust standard errors in Stata/IC 15.1).

Continuous variables are both included directly in the nearest neighbour algorithm, using the standard bias adjustment protocol (Abadie & Imbens, 2008, 2011; StataCorp, 2017), and transformed into three categorical dummy variables (lower quartile, upper quartile, or middle 50%) to allow for non-linearities. Robustness tests are applied for a range of modelling choices, including the approach to control variables and alternative model specifications to nearest neighbour analysis (see the Robustness checks subsection of Results for details).
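For concreteness, a minimal sketch of this headline specification in Stata is given below. Variable names are hypothetical and the control list is abbreviated; the sketch is intended only to illustrate the combination of quartile banding, exact matching on the baseline score, and bias adjustment for continuous covariates described above, not to reproduce the full model.

    * Illustrative sketch only; hypothetical, abbreviated variable names
    * Quartile banding of a continuous control (lower quartile / middle 50% / upper quartile)
    xtile fsm_band = fsm_rate, nq(4)
    gen byte fsm_low  = (fsm_band == 1)
    gen byte fsm_high = (fsm_band == 4)

    * Nearest-neighbour ATET: exact match on the baseline Compass score, Mahalanobis
    * distance (the teffects default), bias adjustment on the continuous covariates
    teffects nnmatch (endline_benchmarks fsm_rate fsm_low fsm_high unemp_rate ///
            num_pupils days_gap i.region i.school_type) (hub_wave1), ///
        atet ematch(baseline_benchmarks) ///
        biasadj(fsm_rate unemp_rate num_pupils days_gap) vce(robust)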

As this is a non-randomised design, the estimated ATET cannot be guaranteed to represent a causal relationship, although care has been taken with appropriate control variables, estimation technique, robustness checks, and assessment of possible selection bias to increase confidence that the correlation may reflect a meaningful causal component. The likelihood of some causal component to the ATET is reinforced through insights from the qualitative research identifying a plausible theory of change by which Hub participation enhances careers provision.

Missing data

Further adjustments were made for schools with missing data on control variables. A few schools (<1%) were dropped due to missing data that are hard to justify imputing statistically, such as school type. Missing data are imputed for certain continuous control variables in order to maintain the sample size, with robustness checks also completed on the pre-imputation dataset. The key control variables with missingness that are imputed are demographic variables (FSM rates and number of pupils, with <2% missingness) and progress variables (GCSE grade-derived data and post-16 sustained destination data, with 13–19% missingness). Multiple imputation follows mi impute chained (StataCorp, 2017), creating 20 imputed sets given the levels of missingness and following the advice of Graham et al. (2007).
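A minimal sketch of the imputation step in Stata, again with hypothetical variable names, is given below; the choice of 20 imputed datasets mirrors the text above, while the exact predictor set in the chained equations is illustrative rather than a record of the model actually estimated.

    * Illustrative sketch only; variable names are hypothetical
    mi set wide
    mi register imputed fsm_rate num_pupils attainment8 progress8 sustained_dest
    mi register regular hub_wave1 baseline_benchmarks days_gap
    mi impute chained (regress) fsm_rate num_pupils attainment8 progress8 sustained_dest ///
        = i.school_type i.region baseline_benchmarks hub_wave1, add(20) rseed(2020)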

An account of the likely mechanisms for missing variables and the pattern of missingness between Hub and non-Hub schools supports the assumption that missingness is random with respect to the relationship being analysed, conditional on the variables used in the imputation algorithm and subsequent analyses. The demographic variables are typically missing due to schools changing administrative codes or having incomplete data returns. The progress variables are typically missing because the relevant cohorts are too small to display aggregate data, the school has only recently opened, or the standard metrics do not apply, such as for some alternative provision and special schools. With controls in place for school type, nature of the intake and size, these features are not expected to mediate any relationship between benchmarks and post-16 destinations. Where missingness is over 5% for an individual variable, a table in the Appendix provides detail on the missingness rate for each variable, split between the Hub and non-Hub schools across each wave. This shows that missingness is fairly even by Hub status, with the difference by Hub status being less than 0.1 percentage points of the sample for Wave 1 and less than 0.8 percentage points for Wave 2.

Survey and qualitative data on implementation

The Careers & Enterprise Company and its evaluation partner, SQW, collected survey and qualitative data to investigate how Careers Hubs were implemented, to understand their impact as perceived by the key stakeholders, and to identify the change mechanisms that facilitated the outcomes. An understanding of such change mechanisms helps to identify success factors for future implementation of Careers Hubs.

These data included surveys of school Careers Leaders in Careers Hubs and the wider Enterprise Adviser Network in summer 2019 (n = 676, response rate 22%) and 2020 (n = 675, response rate 17%), surveys of Enterprise Advisers in winter 2018/19 (n = 757, response rate 28%) and 2019/20 (n = 812, response rate 22%), a survey of Hub Leads in spring 2020, and qualitative interviews with Careers Leaders (n = 36), Enterprise Advisers (n = 36), Hub Leads (n = 40) and Enterprise Coordinators (n = 30) in summer 2019 and 2020.

The fieldwork was carried out by SQW, an independent research organisation commissioned by The Careers & Enterprise Company to evaluate the Enterprise Adviser Network and Careers Hubs (SQW, 2020). The surveys were anonymous to encourage stakeholders to share views openly and were sent to the whole population. Given that the survey data were neither formally randomly sampled nor weighted to be representative, the survey estimates should be considered as indicative of the views of each stakeholder group, rather than conclusive.

The qualitative interviews were semi-structured, following topic guides, and explored the themes of the surveys in greater depth. The qualitative data were analysed by SQW, by case and theme, using the specialist software package MaxQDA. The text was systematically tagged with codes from an agreed framework that mapped onto research questions relating to change mechanisms (the range of ways in which Hubs supported progress) and the influence of local context. The coding process identified themes and patterns in the data and supported objective, comprehensive and auditable analysis. The different stakeholder perspectives gathered through the surveys and qualitative interviews were triangulated to identify explanations for the greater achievement of Gatsby Benchmarks in the Careers Hubs.

Results

Impact analysis results

The descriptive analysis demonstrates that schools in Careers Hubs made faster progress towards achieving the Gatsby Benchmarks, amounting to 2.5 benchmarks between baseline and endline in Wave 1 compared to 1.5 in comparison schools, and 1.3 benchmarks in Wave 2 compared to 0.9 in comparison schools.

With controls for other variables, the nearest neighbour analysis reports an estimated average treatment effect on the treated (ATET) of 0.96 more benchmarks achieved than their matched non-Hub schools in Wave 1 (n = 1,948, p < 0.001, S.E. 0.15) and 0.54 more benchmarks in Wave 2 (n = 1,934, p < 0.001, S.E. 0.14). With a standard deviation of benchmark progress of 2.2 in Wave 1 Hub schools and 2.0 in Wave 2 Hub schools, this translates into an effect size of 0.38 for Wave 1 and 0.25 for Wave 2. With an average elapsed time of 1.8 years from baseline to endline scores for Wave 1 Hub schools and 0.9 years for Wave 2, this suggests that progress continues into the second year of Hub implementation, but likely at a slightly slower rate than the first year.

Robustness checks

Positive, statistically significant findings were identified across a range of 11 alternative model specifications conducted as robustness checks, listed in detail in the Appendix. Collectively, these robustness checks adjust for a range of possible sources of bias and alternative specifications. For instance, the possibility that Hub schools might be completing Compass more recently, and hence show greater progress given the national trend of increased support over time, is adjusted for in check (i) by imposing a consistency requirement on Compass completion date in addition to matching on the variable for number of days between baseline and endline. An alternative way of measuring progress is adjusted for in check (iii): change in benchmarks over time, as opposed to endline score with nearest neighbour analysis requiring an exact match on the baseline score. Alternative model specifications are also tested: multivariate regression via check (vii) and propensity score matching via check (viii). Different control variable mixes and operationalisations are tested in checks (ii), (iv), (v), and (vi). Finally, consistency to choices around multiple imputation is tested via several models estimated on the pre-imputation dataset, with checks (ix), (x), and (xi).
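To indicate what these alternative specifications involve, a sketch of check (viii), propensity score matching, is shown below in the same hypothetical notation used earlier; the treatment model is a logit of Hub membership on the control variables, and the outcome remains the endline benchmark score.

    * Illustrative sketch of check (viii) only; hypothetical, abbreviated variable names
    teffects psmatch (endline_benchmarks) ///
        (hub_wave1 baseline_benchmarks fsm_rate unemp_rate num_pupils days_gap ///
            i.region i.school_type, logit), atet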

Across this set of 11 robustness checks for Wave 1 Hubs, the estimated ATET coefficients range from +0.80 to +1.1 and are all statistically significant (p < 0.001). The lowest performing, at +0.80, +0.82, +0.86 and +0.89 ATET, are (i), (vi), (iii) and (ii); all others are +0.95 or higher. For Wave 2, the estimated ATET exceeds the headline specification estimate in all but four models, reaching a high of +0.59. The lowest performing specification, both in terms of effect size and p-value, was (iii), with an ATET of +0.40 (p = 0.013). The other specifications that underperformed the headline specification, each with an ATET above +0.50, were (ii), (iv) and (x). Please see the Appendix for the individual model results.

Exploring potential selection effects

Different geographical areas adopted different approaches for inviting and engaging schools into their Hubs. Some Hubs selected all or almost all eligible schools within a certain geographical area, such that the Hub is likely to contain both highly proactive schools and neighbouring schools with less of a focus on improving their career provision – more likely to be a fair representation of the attitudes across the full range of schools. Other Hubs selected only a small subset of the eligible schools within their geographical coverage. Different motivations are likely to exist for narrow selections, ranging from identifying the most proactive schools (such that later roll-out of Hubs might see lower effect sizes than here) to identifying schools in greatest need of support where more investment is needed to drive progress (such that later roll-outs of Hubs might see higher effect sizes). Given these opposing potential implications and the importance of selection effects from a policy perspective of national roll-out, we deploy a mapping technique that represents a limit on the maximum potential selection effect in each Hub, in order to explore which implication might dominate.

Visual inspection of Hub maps was carried out in order to assign each Hub school to a negligible, moderate, or high level of potential selection effect, reflecting the level of interspersion between Hub schools and the other schools eligible at the time. For instance, if Hub schools constituted a minority of schools within the local area marked out by Hub schools, this was recorded as a high level of potential selection bias. In order to increase sample size and within-sample variation, Wave 1 and Wave 2 Hubs are combined to analyse progress within each of these three categories, using the baseline and endline data requirements of the Wave 2 Hubs. The results (see Table 1) suggest that while a selection effect may be present, the patterns are non-linear and Hubs have a positive effect even when there was very little capacity for any selection effect (+0.43 benchmarks, p-value < 0.01).

Table 1. Hub performance vs. non-Hub schools by level of potential selection effect.

In Hubs with a moderate level of potential selection effect, i.e. those where many but not most of the schools in a region were in the Hub, the ATET is around 50% higher than that of Hubs with negligible potential selection effect. However, for schools in Hubs where only a minority of local schools participated, pointing towards a high potential selection effect, the ATET is lower at +0.30 (p-value < 0.1). These differences in point estimate should be interpreted indicatively, since the 95% confidence intervals overlap across all three categories.

Insights from Careers Hub implementation

The quantitative analysis identifies an average of around 0.5 benchmarks progress per year of Hub implementation (being around +1.0 for Wave 1 Hubs and around +0.5 for Wave 2 Hubs), but does little to explain why such progress might be achieved or be expected to continue. This is important given tentative evidence of diminished progress in the second year of Hub implementation compared to the first. Given the multi-faceted nature of the Careers Hubs model, an aim of the third party evaluation was to identify the aspects of implementation that contributed most strongly to the Gatsby Benchmark progress, to enable further improvements, replication, and scale-up. The gathering of data from multiple stakeholders across the two-year pilot period using survey and qualitative methods resulted in rich insight from which four themes emerged to explain the difference made by the Careers Hubs approach (SQW, 2020). An understanding of the mechanisms that help Hubs to be successful is important in accounting for the black-box measure of improved performance seen in the quantitative analysis and in identifying themes that should be supported and prioritised as the programme expands.

Collaboration among Careers Leaders

In contrast to the often isolated work of Careers Leaders within their own institutions, Careers Hubs established a culture of collaboration between schools and colleges across the local area which was considered to be instrumental in facilitating improvements in career guidance through sharing resources, expertise, and solutions to common challenges. This is illustrated in the 2020 survey, in which 72% of Careers Leaders reported stronger networks with other education providers since joining the Careers Hub. As well as achieving efficiencies in providing career guidance opportunities to students, collaboration between Careers Leaders also served to build confidence, agency, and the profile of careers support within schools.

Collaborative activities were both formal and informal. Many Hubs instituted a structure for sharing learning by appointing “lead” schools or colleges, through online resources or in-person meetings. Lead schools and colleges were appointed in accordance with the strategic purpose and priority of the Hub and took on the role of experts, innovators, connectors, or leaders within their peer group. For example, in one Careers Hub, the lead school took on a formal mentoring role in a six-week programme to support schools that had new Careers Leaders or were struggling to meet the Gatsby Benchmarks. As a result, Careers Leaders focused on specific actions and reported improved confidence. In some Hubs, the lead school role evolved so that the leadership was passed from school to school as provision became more established.

Careers Leaders also came together in the context of formal training or continuing professional development (CPD) events and meetings that might be focused on the delivery of specific Gatsby Benchmarks, and reported a direct impact on the quality of their own school’s provision. Some Hubs matched schools to enable mentoring, formalised shadowing, peer review, or targeted support for the schools most in need. Establishing “communities of practice” was also a common feature of Hubs, bringing Careers Leaders together virtually or in person, sometimes with a specific focus on special educational needs and disabilities (SEND) or on colleges. From these formal arrangements flowed informal relationships, providing Careers Leaders with a peer group of like-minded professionals within their locality, as observed by a Hub Lead:

“Having a community of practice is the single biggest thing that has made a difference. This means that we are up-to-speed with what is going on and can share that rapidly with people in our area.” Hub Lead interview, 2020.

Prioritisation by senior leaders

There was strong consensus across stakeholders that senior leadership support was an important facilitator of improvement in careers guidance. Practically, it was associated with careers plans aligning with the mission of the school, becoming embedded across the curriculum, and Careers Leaders securing the resource and commitment needed to deliver the Gatsby Benchmarks. The majority of Careers Leaders (62% in 2020) reported increased prioritisation of career guidance among senior leaders in Careers Hub schools. This was thought to have been achieved through a combination of the increased skill and professionalism of Careers Leaders and the raised profile of the Gatsby Benchmarks through the work of Careers Hubs and the statutory framework. By the end of the pilot period, 61% of the Careers Leaders in Careers Hub areas had started or completed formal face-to-face training, coordinated by The Careers & Enterprise Company, compared to 32% uptake nationally. The training developed leadership skills as well as capacity to improve the delivery of the benchmarks and, as a result, many Careers Leaders reported greater competence and confidence to influence the senior leaders in their schools (Williams et al., 2020). It seems likely, therefore, that increased prioritisation by senior leaders was a result, at least in part, of Careers Hubs participation and a facilitating factor in the higher Compass results reported above.

Employer engagement

Employer engagement to enrich career guidance is at the heart of the Gatsby Benchmarks, based on the evidence that frequent and varied encounters with employers and experiences of workplaces help students to build knowledge, aspirations, and career plans founded on authentic and up-to-date insight (Gatsby, 2014). For all students to have opportunities that are matched to interests and not limited by family contacts, schools need to maintain a rich network of employer contacts.

The evidence suggests that the Careers Hub model was effective in facilitating employer engagement. Over half (56%) of the Enterprise Advisers surveyed in 2020 reported that they connected their matched school to additional local employers for activities such as mock interviews or career talks. This was corroborated by Careers Leaders: 79% reported engaging with additional employers since joining the Hub. Some larger employers in Hubs took on a strategic role, for example helping to shape Hub priorities, encouraging business contacts to volunteer time, or delivering training for teachers. For example, a large utility company in the East of England used its contacts across the region to source Enterprise Advisers to fill vacancies in schools, in collaboration with the Hub Lead. Stakeholders reported in qualitative interviews what was evident in the Compass data: that, as a result of the growing network of employers, students were experiencing a greater number of, and more varied, opportunities for engagement. An example described by an Enterprise Adviser involved a student who was struggling with engagement and confidence becoming an apprentice in the Enterprise Adviser’s organisation and flourishing.

Local and national leadership

The fourth theme that emerged from the survey and qualitative evidence as an explanation for the impact on Gatsby Benchmark delivery was local and national leadership. At a local level, the strategic role of the Hub Lead was perceived by different stakeholders from the start of the pilot as a critical element for success. This role was present in every Hub and linked the career guidance in schools and colleges with the strategic priorities of the LEP, thereby elevating the profile of careers provision (particularly where LEP leads took on a role of ambassador for the Hub), improving the flow of labour market information from the LEP to schools and colleges (supporting Gatsby Benchmark 2), and facilitating access to local business groups and community organisations. Hub Leads were also responsible for connecting schools, colleges, and employers within their area and establishing the collaborations described above.

Leadership at the national level was also perceived by the stakeholders participating in the evaluation research as an important facilitator of progress. The Careers & Enterprise Company established and managed contracts with the Careers Hubs, provided leadership through CPD for Hub Leads that enabled shared learning across different parts of the country, developed resources for Careers Leaders and employers, and provided support from senior staff (“Regional Leads”) who oversaw all Hubs and other Company activity in a geographic region. The Compass questionnaire, provided by The Careers & Enterprise Company, enabled schools and colleges to track and improve provision, and to use data to engage senior leaders and external stakeholders.

Discussion

By building and strengthening local networks of education providers, employers, and regional governance, Careers Hubs have facilitated the improvement of career guidance for secondary school students. Descriptive analysis shows accelerated achievement of the Gatsby Benchmarks in Careers Hub schools compared to those in the wider network and beyond, at around 0.5 additional benchmarks per year of Hub implementation, and the comparison analysis confirms the added value of being in a Careers Hub beyond other characteristics that might contribute to improved career guidance.

Concurring with wider evidence on place-based networks, Careers Hubs were successful in improving the quality of provision and the professional competency of staff (Armstrong, 2015). While student outcomes are beyond the scope of this study, recent evidence suggests that improvements in the Gatsby Benchmarks are likely to flow through to benefits for students’ skills, knowledge, and engagement post-16 (Hanson et al., 2021; Percy & Tanner, 2021).

By drawing on evaluation evidence, this paper has identified the features of Hubs associated with improvements to inform the ongoing roll-out in England and for consideration in other contexts. The characteristics of the Hub model identified as contributing to the positive outcomes – collaboration between Careers Leaders, prioritisation by senior leaders, and employer engagement – were facilitated by the coordinating role of Hub Leads. In this respect, Careers Hubs exemplify place-based and community-driven solutions to addressing inequalities through school improvement (DfE, 2017b). This “middle tier” of leadership has been found to facilitate effective collaboration between schools and equity of provision (Fullan, 2015). The role of the Hub Lead was found to be critical in establishing the communities of practice between Careers Leaders but also in making connections with other players in the system, including education providers, business, and regional economic leaders. Within the structure of Careers Hubs, schools balanced institution-level responsibility for careers programmes with the benefits of local collaboration and partnerships.

A key consideration for the roll-out of the Careers Hub model across England is the extent to which this early success will be replicated. An understanding of the underlying success factors, as discussed above, helps to identify implementation priorities to maintain impact as the model scales up. Future expectations can also be moderated by exploring the extent to which success to date might be driven by selection factors that assemble the most proactive schools into early Hubs. Point estimates suggest that the average treatment effect on the treated might be up to around a third lower in Hubs where there was almost no possible selection effect (e.g. because all schools in the local area are in the Hub), although this indication should be treated tentatively, given the wide confidence intervals around the point estimates and the reduced performance in Hubs with the highest potential selection effect. Two possibilities that might drive lower outperformance in Hubs with high potential selection effect can be explored with the present dataset: that high FSM rates drive the reduced outperformance and/or that more challenging circumstances for Hub community cohesion reduce outperformance. Other possibilities may exist, which might be explored with further qualitative work or targeted statistical data gathering, as well as the possibility that the differences are an artefact of sample-related variation alone.

The first possibility is that, whereas the majority of schools in Hubs with only moderate potential selection effect were typically chosen in response to where interest was greatest, in high potential selection effect Hubs, schools were more commonly persuaded to engage on account of higher need than surrounding schools, beyond that which is controlled through (inevitably imperfect) matching variables. This reflects the assessment criteria for Hubs, which placed an emphasis on disadvantage (SQW, 2020, p. 11), and the higher FSM rates in Hub schools shown in the descriptive data in the Appendix. In other words, the schools selected on the basis of need may only have made modest progress compared to otherwise similar schools, but would have done significantly worse without the Careers Hub support.

A second possible driver is that, being a minority in their area, such Hub schools have reduced alignment with local identities or existing local clusters of schools, contributing to reduced community momentum and reduced opportunities for peer-to-peer learning. In other words, if most schools in your area are not in your Hub, you might see fewer network benefits or feel less motivated to engage. This possibility is supported by identifying higher ATETs in Hubs with schools concentrated into tighter geographical areas (about +0.7 compared to +0.3), where shorter travel times are likely to support community development on average.

These two possibilities reinforce the importance of certain aspects of the four themes from the qualitative work unpacking the success factors behind Careers Hubs. The first emphasises the importance of local leadership, in allowing Hubs freedom to design their own priorities in response to local conditions, such as levels of economic disadvantage, and in being appreciative of such context in evaluating progress. The second emphasises the importance of a dense local network, in which collaboration is straightforward and supported, aligning as far as possible with other local networks.

Implications for policy and practice

Following the success of the pilot, the Careers Hubs are in the process of being scaled up to other parts of England, with the intention of eventual universal coverage. The analysis in this paper suggests that in order to avoid the dilution effects that sometimes characterise the scale-up stage of policy interventions, the new phase of Careers Hubs should aim to retain and develop certain key features under the guidance of the Hub Lead. Since networks require regular engagement to be maintained, we recommend that a mix of structured and organic networks between Careers Leaders is maintained within localities, supported by regular communication and interaction opportunities that enable them to share ideas, resources, and expertise. The increased use of digital technology prompted by COVID-19 offers additional opportunities for networking and careers provision unconstrained by locality.

Access to formal training for Careers Leaders is also important to develop the profession and profile of career guidance, accounting for natural turnover. Training should adapt and develop to maximise the benefits across the school of the participating Careers Leader. Ongoing development of strategies to engage employers will help ensure that students have varied and numerous opportunities to learn from authentic sources. With local labour markets rapidly changing, Local Enterprise Partnerships, businesses, and education providers should work together to ensure students can adapt plans to changing opportunities.

Finally, as the Gatsby Benchmarks become established, it will be important to continue monitoring student outcomes. This will help to ensure that Hubs continue to have a positive impact as the pilot scales out of early adopter regions and any early quick wins are exhausted. There may also be value in better tracking the competence that Careers Leaders, Hub Leads, and relevant teachers have in their role delivering careers provision, building on available training programmes where possible, as part of understanding the mechanisms by which improved career infrastructure supports outcomes. Such studies could draw on and adapt standardised self-efficacy scales as used with teachers delivering career education (see, e.g., Souvan, 2019).

Limitations and further research

Opportunities for further research can be identified in addressing constraints in this research around sample size, timing with the outbreak of COVID-19 in spring 2020, the use of self-report data, and data availability, as well as supplementing it with qualitative work, especially techniques like longitudinal fieldwork and examination of young person, teacher, careers practitioner, and parent perspectives. Future research can also usefully incorporate student-level data, such as career management skills and knowledge, and school-level outcome measures, such as student destinations.

The use of self-report data merits additional discussion, being the driver of our outcome measure in the quantitative analysis. Compass returns capture the self-assessment data submitted by schools describing their current careers provision. The primary analytical concern is whether schools and colleges in Hubs might engage with the Compass tool differently to those outside of Hubs. One possibility is that Hubs place additional emphasis on careers provision and ask schools to monitor and adjust their plans more frequently, including more regular Compass analyses – indeed, such mechanisms are part of the logic for anticipating that Hubs will motivate improvements in careers provision. The statistical methodology adjusts for this where possible, by including a control for the time between the earlier and later Compass reports and a robustness check for the later Compass report being filed between January and March 2020. Nonetheless, statistical adjustments cannot fully compensate for possible differences in Compass tool usage.

Four considerations suggest that such self-assessment data are adequate to support this analysis, while acknowledging a caveat around self-report data. First, at the institution level, Compass is used for planning and reflection, rather than as part of an external accountability framework. The Careers Strategy (DfE, 2017a) set an ambition for all schools to achieve the benchmarks by the end of 2020, but does not assign specific incentives or penalties for this target at the school level, and there are no additional penalties for Hub schools versus non-Hub schools. Secondly, the majority of questions are phrased as objective statements, reducing the flexibility for subjective judgement and personal bias. Example questions include whether the careers programme is “written down”, “published online”, with “resources/funding allocated to it”; whether 76% or more of students in each school year (asked separately) receive impartial and independent career guidance; and whether the school collects and maintains destinations data on each pupil for three years after they leave the school. Full details can be found at: compass.careersandenterprise.co.uk/info. Thirdly, while Compass scores are rarely formally validated externally, Careers Leaders typically have to explain and defend their assessments to their Enterprise Coordinators, who cover a range of local schools. In a Hub setting in particular, where schools and colleges are typically sharing progress and good practice, it would be hard to maintain a highly exaggerated score (or indeed an overly modest one). Fourthly, schools and colleges typically assign themselves low scores that increase only gradually over time, e.g. from an average of 2.13 benchmarks achieved in 2017/18 through to 3.75 in March 2020 (The Careers & Enterprise Company, 2020b). A caveat remains that some Careers Hub schools may feel under greater pressure to inflate progress than non-Hub schools, although the additional level of external scrutiny and peer visibility contains the likely scale of any such bias.

Conclusion

This paper has drawn on quantitative and qualitative methods to demonstrate the potential benefits of Careers Hubs for school-level career provision and to identify the likely success factors contributing to that benefit. Specifically, we identify a 0.4 effect size average improvement in self-assessed career provision after two years of being in a Careers Hub, robust to the inclusion of various control variables and estimation approaches. Key success factors include collaboration between Careers Leaders, support from senior leaders, employer engagement, and alignment with regional strategic priorities. As the Careers Hub policy continues to roll out in England, ongoing scrutiny is warranted, underpinned by programmatic and ad hoc data collection to support ongoing research.

Acknowledgements

The authors are grateful to colleagues for their feedback on earlier drafts and to SQW for their evaluation of Careers Hubs carried out on behalf of The Careers & Enterprise Company.

Disclosure statement

The authors are contracted by The Careers & Enterprise Company to conduct research related to the organisation’s activities, including the research in this paper. The Careers & Enterprise Company manages the Careers Hub programme for the Department for Education in England.

Data availability statement

This paper draws on data jointly held by individual education institutions and The Careers & Enterprise Company for which data policies and permissions prohibit public sharing. Other data used in the paper are publicly available from the UK Government websites. Interested parties are invited to contact the authors to discuss analyses that might be completed on the data as described in this paper.

Additional information

Notes on contributors

Christian Percy

Chris Percy is a Visiting Research Fellow at the University of Derby, UK. His research interests focus on using data and financial modelling to better understand career pathways and school-to-work transitions.

Emily Tanner

Emily Tanner is Head of Research at the Careers & Enterprise Company, UK. Prior to this, she was Head of Children, Families & Work at the National Centre for Social Research (NatCen) and held research posts at Yale University, USA, and, alongside DPhil study, at Oxford University, UK.

Notes

1 In the context of English secondary education, colleges refer to state-funded further education providers, serving young people and adults aged over 16. They are typically larger organisations, providing more diverse and/or more vocationally oriented courses than upper secondary education in schools, and have different governance and funding arrangements. See www.aoc.co.uk for details.

2 LEPs are voluntary partnerships between local authorities and businesses, set up in 2011 by the Department for Business, Innovation and Skills to help determine local economic priorities and promote economic growth and job creation within the local area (The Careers & Enterprise Company & SQW, Citation2020).

3 A General Certificate of Secondary Education (GCSE) is an academic qualification in a particular subject, widely taken in state-funded schools by students aged 16 at the end of Key Stage 4, which spans the 14–16 age range. The exams are high-stakes and were nationally administered for the years covered in this study. The vast majority of students in England take at least some GCSE exams at age 16, and many use their GCSE results in applications for continued education or employment.

References

  • Abadie, A., & Imbens, G. W. (2008). On the failure of the bootstrap for matching estimators. Econometrica, 76(6), 1537–1557. https://doi.org/10.3982/ECTA6474
  • Abadie, A., & Imbens, G. W. (2011). Bias-corrected matching estimators for average treatment effects. Journal of Business & Economic Statistics, 29(1), 1–11. https://doi.org/10.1198/jbes.2009.07333
  • Anders, J. (2017). The influence of socioeconomic status on changes in young people’s expectations of applying to university. Oxford Review of Education, 43(4), 381–401. https://doi.org/10.1080/03054985.2017.1329722
  • Andrews, D. (2011). Careers education in schools. Highflyers Publishing Ltd.
  • Armstrong, P. (2015). Effective school partnerships and collaboration for school improvement: A review of the evidence (DFE-RR466). Department for Education.
  • Barnett, A., van der Pols, J., & Dobson, A. (2005). Regression to the mean: What it is and how to deal with it. International Journal of Epidemiology, 34(1), 215–220. https://doi.org/10.1093/ije/dyh299
  • DfE. (2017a). Careers strategy: Making the most of everyone’s skills and talents (DFE-00310-2017). Department for Education.
  • DfE. (2017b). Unlocking talent, fulfilling potential: A plan for improving social mobility through education. CM 9541. Department for Education.
  • DfE. (2017c). Opportunity areas selection methodology. Department for Education.
  • DfE. (2018). Careers guidance and access for education and training providers: Statutory guidance for governing bodies, school leaders and school staff (DFE-00002-2018). Department for Education.
  • DfE. (2020). Secondary accountability measures: Guide for maintained secondary schools, academies and free schools (February 2020). Department for Education.
  • DfE. (2021). Skills for jobs: Lifelong learning for opportunity and growth (CP 338). Department for Education.
  • Fullan, M. (2015). Leadership from the middle: A system strategy. Education Canada, December 2015, 22–26.
  • Gatsby Charitable Foundation. (2014). Good career guidance.
  • Godden, L. (2022). Career guidance policy documents: Translation and usage. British Journal of Guidance & Counselling, 50(1), 157–169. https://doi.org/10.1080/03069885.2020.1784843
  • Graham, J. W., Olchowski, A. E., & Gilreath, T. D. (2007). How many imputations are really needed? Some practical clarifications of multiple imputation theory. Prevention Science, 8(3), 206–213. https://doi.org/10.1007/s11121-007-0070-9
  • Greatbatch, D., & Tate, S. (2019). What works in delivering school improvement through school-to-school support (DFE-RR892). Department for Education.
  • Hanson, J., Moore, N., Clark, L., & Neary, S. (2021). An evaluation of the North East of England pilot of the Gatsby Benchmarks of good career guidance. Gatsby Charitable Foundation.
  • Hanson, J., & Neary, S. (2020). The Gatsby benchmarks and social mobility: Impacts to date. IAEVG conference proceedings: Career guidance for inclusive society, Bratislava, Slovakia, 11–13 September 2019.
  • Hodgson, A., & Spours, K. (2013). Tackling the crisis facing young people: Building ‘high opportunity progression eco-systems’. Oxford Review of Education, 39(2), 211–228. https://doi.org/10.1080/03054985.2013.787923
  • Houghton, A.-M., Armstrong, J., & Okeke, R. I. (2020). Delivering careers guidance in English secondary schools: Policy versus practice. British Journal of Educational Studies. https://doi.org/10.1080/00071005.2020.1734533
  • Hughes, D., Mann, A., Barnes, S.-A., Baldauf, B., & McKeown, R. (2016). Careers education: International literature review. Education Endowment Foundation.
  • Hughes, H., Warhurst, C., Benger, E., & Ifans, M. (2021). Building better futures: Decent work, inclusion and careers support services in the UK. British Journal of Guidance & Counselling, 49(2), 213–227. https://doi.org/10.1080/03069885.2021.1900537
  • Lairio, M., & Nissilä, P. (2002). Towards networking in counselling: A follow-up study of Finnish school counselling. British Journal of Guidance & Counselling, 30(2), 159–172. https://doi.org/10.1080/03069880220128038
  • Langley, E., Hooley, T., & Bertuchi, D. (2014). A career postcode lottery? Local authority provision of youth and career support following the 2011 Education Act. International Centre for Guidance Studies, University of Derby.
  • Moote, J., & Archer, L. (2016). Failing to deliver? Exploring the current status of career education provision in England. Research Papers in Education, 33(2), 187–215. https://doi.org/10.1080/02671522.2016.1271005
  • Percy, C., & Tanner, E. (2021). The benefits of Gatsby Benchmark achievement for post-16 destinations. The Careers & Enterprise Company.
  • Souvan, G. (2019). Secondary school teachers’ self-efficacy for career development teaching and learning. [Doctoral thesis, University of Southern Queensland]. University of Southern Queensland. https://eprints.usq.edu.au/41809/.
  • SQW. (2020). Enterprise adviser network and careers hubs: Evaluation report. The Careers & Enterprise Company. https://www.careersandenterprise.co.uk/our-research/enterprise-adviser-network-and-careers-hubs-evaluation-report.
  • StataCorp. (2017). Stata statistical software: Release 15. StataCorp LLC.
  • The Careers & Enterprise Company. (2018). Enterprise adviser network roadmap: 3 phase plan. https://www.careersandenterprise.co.uk/sites/default/files/uploaded/cec-enterprise_adviser_network_roadmap_-_v2_digital.pdf.
  • The Careers & Enterprise Company. (2020a). Careers and enterprise provision in England’s colleges in 2019: Detailed Gatsby Benchmark results. https://www.careersandenterprise.co.uk/our-research/state-nation-2019.
  • The Careers & Enterprise Company. (2020b). Careers education in England’s schools and colleges 2020: Working together for young people’s futures. https://www.careersandenterprise.co.uk/our-research/careers-education-englands-schools-and-colleges-2020.
  • The Careers & Enterprise Company, & SQW. (2020). Evaluation of the enterprise adviser network: Enterprise adviser survey 2020. The Careers & Enterprise Company. https://www.careersandenterprise.co.uk/our-research/evaluation-enterprise-adviser-network-enterprise-adviser-survey-2020.
  • The Careers & Enterprise Company, & The Gatsby Charitable Foundation. (2018). Understanding the role of the careers leader. The Careers & Enterprise Company. https://www.careersandenterprise.co.uk/sites/default/files/uploaded/understanding-careers-leader-role-careers-enterprise.pdf.
  • Thompson, R. (2017). Opportunity structures and educational marginality: The post-16 transitions of young people outside education and employment. Oxford Review of Education, 43(6), 749–766. https://doi.org/10.1080/03054985.2017.1352502
  • Watts, A. G. (2013). False dawns, bleak sunset: The coalition government's policies on career guidance. British Journal of Guidance & Counselling, 41(4), 442–453. https://doi.org/10.1080/03069885.2012.744956
  • Williams, J., Akehurst, G., Alexander, K., Pollard, E., Williams, C., & Hooley, T. (2020). Evaluation of the careers leader training. The Careers & Enterprise Company. https://www.careersandenterprise.co.uk/our-research/evaluation-careers-leader-training.
  • Yu, R., & Chen, L. (2015). The need to control for regression to the mean in social psychology studies. Frontiers in Psychology, 5, 1574. https://doi.org/10.3389/fpsyg.2014.01574

Appendix

Table A1. Sample descriptive data [standard deviation in square brackets where applicable].

Table A2. Model results.