
How professionalisation of outreach practitioners could improve the quality of evaluation and evidence: a proposal

Naomi Clements, Sara Davies & Anna Mountford-Zimdars

ABSTRACT

Professionalising outreach and evaluation work would enhance the quality and rigour of provision, benefit widening participation students and meet regulatory requirements (Bowes et al. [2019]. The National Collaborative Outreach Programme End of Phase 1 report for the national formative and impact evaluations. Office for Students; Rainford [2020]. “Working with/in institutions: how policy enactment in widening participation is shaped through practitioners’ experience.” British Journal of Sociology of Education 42 (2): 287–303). This article presents practitioners’ experiences of how social justice aims can often feel unaligned with the technical expertise required for rigorous project design and evaluation. Professionalising outreach would both improve practice and meet practitioners’ needs for development and a united professional voice. A professional body sharing standard methods of practice and offering CPD and skills development would elevate outreach practitioners to a ‘professional’ standing (Eraut [1994]. Developing professional knowledge and competence. Falmer Press).

Introduction

Widening participation in higher education has been a policy priority in England for over 20 years (Greenbank 2006; McCaig and Bowers-Brown 2008; Cunninghame 2017). Although substantial progress has been made – including a significant increase in the number of students, and of widening participation students, in higher education overall – some groups remain under-represented (Office for Students n.d.a; Tight 2012; Harrison and Waller 2017; Rainford 2017). Recent studies highlight persistent differences between the most and least advantaged students across the entire student lifecycle and in relation to graduate outcomes (Britton, Dearden, and Waltman 2021). Eliminating these gaps, and ensuring that all students, whatever their background, enjoy access to, and a fulfilling experience of, higher education and beyond, are key objectives of the Office for Students (OfS), the regulator for higher education in England, established in April 2018.

Background to Uni Connect

The Uni Connect programme was originally launched as the National Collaborative Outreach Programme (NCOP) in January 2017 by the Higher Education Funding Council for England (HEFCE), with the aim of increasing the progression of under-represented groups to higher education. Specifically, it aimed at ‘doubling the proportion of disadvantaged students entering higher education by 2020 compared to 2009, and increasing the number of BME students by 20% by 2020’ (Department for Business Innovation & Skills 2016). NCOP has since been renamed Uni Connect, and HEFCE has been replaced by a regulatory body, the Office for Students (OfS, est. 2018). The regulatory aim is now to eliminate achievement and progression gaps across the student lifecycle and to achieve equality of opportunity in higher education within 20 years (Office for Students 2020b).

Uni Connect builds on previous national programmes, such as the Aimhigher initiative (Moore and Dunworth 2011), and complements institutional outreach provision funded through the mandatory Access and Participation Plans (APPs).

The original 29 Uni Connect partnerships connected Higher Education Providers (HEPs), Further Education Colleges (FECs), schools and other organisations, and aimed to deliver a sustained, progressive and intensive programme of support to students in Years 9–13 (aged 13–19) in specific geographic areas, or ‘target wards’. In doing so, Uni Connect partnerships aim to reduce the gap in participation between the most and least represented groups; to support young people to make well-informed decisions about their future education; to support effective and impactful local collaboration between higher education providers, schools, colleges, employers and other partners; and, most pertinently for this paper, to contribute to a stronger evidence base around ‘what works’ in higher education outreach and to strengthen evaluation practice in the sector (Office for Students n.d.b).

Evidence base

The OfS expects outreach practice to be informed by evidence of what works. However, the current literature and evidence on the impact of access and participation interventions and programmes are limited (Gorard 2013; Moore, Sanders, and Higham 2013; Harrison and Waller 2016; Younger 2019; D. Robinson and Salvestrini 2020). Much of the most robust evidence has been produced in North America (Younger 2019). While potentially insightful, the learning from these studies is not always transferable because of fundamental differences between the education systems, which can themselves influence outcomes.

In 2019, Transforming Access and Student Outcomes (TASO) was established by the OfS as a What Works centre for higher education: ‘committed to the generation, synthesis and dissemination of high-quality evidence about effective practice in widening participation and student outcomes’ (TASO 2019). Although it is still too early to evaluate TASO’s influence and impact on sector evaluation methods and findings, the investment reinforces the regulator’s interest in outcomes-focused policy and practice.

Evaluation practitioner understanding of the regulator’s requirements

Uni Connect programmes are currently reviewed and recommissioned, subject to satisfactory progress, every two years. Evaluation points were in 2019 and 2021, and a consultation about the future of Uni Connect is currently under way (Office for Students 2020a).

In 2019, when Phase 1 of the programme came to an end, the OfS commissioned CFE Research to review partnerships’ evaluation approaches in order to gather evidence of the quality of evaluation practice taking place (Bowes et al. 2019). However, before the findings of this review were published, partnerships were instructed to create local evaluation plans for Phase 2, which began in August 2019. This resulted in a variety of approaches, methods and resources being assigned to evaluation.

Throughout Phase 1 and Phase 2, the regulator encouraged partnerships to use their financial resources to conduct ‘flagship RCTs’ and to draw on external knowledge from the national evaluators, CFE Research and the Behavioural Insights Team. This positioned quantitatively driven evaluation as the regulator’s preferred method of understanding the impact of Uni Connect programmes (Crockford 2020; Younger 2019).

In general, the overarching outcomes intended for the Uni Connect programme were measurable and target-driven: doubling the proportion of disadvantaged learners in higher education by 2020 and increasing ethnic minority representation by 20% (Office for Students 2018a). Whether these outcomes have been achieved will only become apparent after funding for the first stages of the programme has finished. In the shorter term, therefore, alternative ways of measuring impact were required to provide proxy measures of success, and Uni Connect partnerships were tasked with developing short-term measures and indicators for evaluation purposes.

While the use of experimental methods, particularly Randomised Controlled Trials (RCTs), was strongly encouraged, operational circumstances often prohibit the production of a robust control or comparator group (Styles and Torgerson 2018). Where randomised trials can be undertaken in outreach activity – for example, texting students to remind them to study for exams, or not doing so – their content and scope are manageable, and findings can be attributed to the action being undertaken. However, evidence from randomised controlled trials that evaluate multifaceted interventions, such as summer schools, is currently limited (TASO n.d.), indicating that further investment is needed, along with a specific skill set to interpret the complexity of these types of outreach interventions (Thomas 2000; Gorard 2013; Younger 2019).

Decisions about future choices, such as whether or not to go to higher education, are complex, messy and not necessarily linear (Dawson, Yeomans, and Brown 2018; Younger 2019; Lortie-Forgues and Inglis 2019). So, while evaluators’ wish to produce evidence of causal impact is laudable, in practice it can feel like a Sisyphean undertaking to practitioners.

Practitioner experience and reflection

The year one assessment of Uni Connect evaluation noted a greater need for measurability within the local evaluation plans:

Specifying and quantifying objectives, targets and detailing success indicators would further improve some consortia evaluation plans … Evaluation plans and activities would be strengthened if plans could break down overarching outcomes into more discreet, measurable, shorter-term outcomes. (Tazzyman 2018)

From this, the OfS placed evaluation practice indicators within Phase 2, including an expectation that resource would be assigned to evaluation practice, and set detailed parameters for evaluation plans. Under the guidance, partnerships were expected to provide a detailed evaluation plan and to complete a self-assessment of current evaluative practice. These documents were assessed by a newly commissioned capability-building arm of the OfS national evaluation team (Office for Students 2019a). Partnerships’ evaluation plans were scored from 1 (‘required practice is either absent or not enough detail has been provided for assessment to take place’) to 5 (‘Excellent sector-leading and / or innovative practice that should be shared as an example of best practice’).

If partnerships received a low score on any section of their evaluation plan, they were asked to address the highlighted issues and resubmit their plans before these were authorised by the OfS. This scoring system presented a distinctive benchmarking approach to evaluation during Phase 2 and placed emphasis on the role of the evaluator within the programme.

By creating a set of expected practices within programme guidance, the regulator reinforced the preference for ‘gold standard’ evaluation methods such as RCTs. The requirement is for partnerships to ‘generate robust Type 2 or Type 3 evaluation evidence (qualitative or quantitative) for the majority of your [Uni Connect] NCOP funded outreach activity’ (Office for Students 2019b).

The OfS’s understanding of ‘robust’ is set out in its Access and Participation Plan supporting documentation, which outlines three types of outreach evaluation evidence (see Table 1).

Table 1. Standards of evidence for widening participation (Centre for Social Mobility 2019).

Whilst Phase 2 guidance set clear expectations for Uni Connect evaluation, it also introduced risk into an outcome-driven programme. In the context of Uni Connect, as a strategically funded programme, continued funding and support rely on the specific evaluation methods outlined by the funder being used in everyday professional practice. In pursuing the ‘standards’ desired by the regulator, there is a risk of alienating other perspectives on ‘success’ aligned with widening participation work.

As Jones and Thomas (2005) argue, outcomes-based approaches to determining ‘success’ limit practice to academic and utilitarian approaches, rather than supporting transformative experiences. If our evaluation practice across a strategically funded programme is narrowed to methods which support a deficit-based utilitarian approach, do we risk narrowing the types of interventions employed to only those whose outcomes are measurable (Burnett and Coldwell 2020)?

Professional judgement and tacit knowledge of alternative and innovative methods of practice and evaluation could be lost. Within a process-based system, subtle nuances of learners’ development and belief in their future possible selves (Markus and Nurius 1986; Harrison and Waller 2018) may be missed in aggregated and big data, and enveloped within discourses of performativity (Ball 2012; Burke 2018).

An alternative vision

The potential risks and limitations of using a ‘measurable impact’ approach to evaluating outreach interventions are already recognised (as discussed above; see also Deaton and Cartwright 2016). As practitioners, each with unique skills and varying levels of evaluation knowledge, we suggest two cornerstones in which future policy and practice should be grounded: a praxis-based approach to regulatory guidance and policy on evaluation, and a professional body representing outreach evaluators and practitioners.

A praxis-based approach to regulatory guidance and policy on evaluation

To transform equity of participation in higher education, practitioners, evaluators and providers must work collaboratively with, and for, the communities and individuals who are currently under-represented within higher education. To do this we must be reflective practitioners, actively questioning our roles and the purpose of the activities we provide. Critical reflection and questioning are embedded within definitions of praxis. Indeed, the celebrated philosopher and educator Paulo Freire envisaged praxis as ‘a singularly human endeavour involving cycles of critical reflection and critical action directed at the structures to be transformed’ (Lumb and Roberts 2017).

Praxis within widening participation has been argued for before, most notably by Burke, who states:

Equity work is deeply relational; we cannot create educational transformation without working together across our different contexts, disciplines, experiences, knowledge and expertise. (Burke 2018, 14)

To ensure evaluation is fit for purpose, we must engage in critical reflection and dialogue that may elicit difficult conversations between practice, policy and purpose. Kinsella, discussing Schön’s concept of reflective practice, comments on his observation that ‘the contexts of practice are messy, complex and laden with value conflicts’ (Kinsella 2010, 566). Accepting and building on this will open up discussions between evaluators, practitioners and the regulator about how we can contribute towards answering the questions we wish to solve. A praxis-based approach, in which tacit and empirical knowledge are both valued, would ensure the sector is able to directly address limitations in current measures of success.

The OfS has a legal obligation to ‘promote equality of opportunity in connection with access to and participation in higher education provided by English higher education providers’ and to ensure its activities are ‘transparent, accountable, proportionate and consistent’ (Legislation.gov.uk 2017). Within the regulator’s guidance document outlining the boundaries of its relationships with providers, it is stated that the OfS

will not provide advice to providers about how they should run their organisation. Providers should look to other sources, for example to sector bodies, for such advice and support. (Office for Students 2018b)

By building upon a praxis-based framework, outreach evaluators could gather critical skills, knowledge and experience from a range of sector stakeholders. Burke’s praxis-based framework ‘brings all participants together in dialogue across research and practice and opens up pedagogical spaces to deepen levels of understanding from multiple and contested perspectives and dimensions’ (Burke 2018, 17). This could provide a solution to frictions between evaluations being ‘done to’ outreach participants and evaluation methods being ‘created with’ outreach participants and, in part, lead to the transformative outreach Jones and Thomas (2005) argue for.

A professional body representing outreach evaluators and practitioners

Given the regulatory importance of outreach within the HE sector, there is a clear need to create a space where outreach practitioners and evaluation professionals can work towards shared professional standards. The creation of a professional body would strengthen practitioners’ voice within a praxis-based framework and provide a community of practice from which best practice, professional development and opportunities for collaborative evaluative practice could emerge.

With increasingly complex roles (Rainford 2017; Gazeley et al. 2018), outreach work now shares key features with existing professions in other sectors such as youth work, careers advice and teaching (Banks and Imam 2000). Professions have a commitment to social values (Roth and Briar-Lawson 2011) – in this case, widening access – a body of knowledge of best practice and, to varying degrees, an agreed entry route setting out the educational requirements and work experience that make a rounded widening participation professional. Creating a professional body would also be an opportunity to agree codes of conduct, committing members to ethical and robust practice and requiring engagement with current literature (Fawkes 2014). Professional bodies increase professional legitimacy and enhance the adoption of shared norms and values (Sonneveld et al. 2020; Browell 2000; Mulvey 2013).

The development of a professional body for outreach practitioners would produce common guidelines for evaluative practice in this field and ensure that these are integral to the role of the outreach practitioner. This could assist in the development of effective and wider-ranging evaluation skills. Alternative approaches to evaluating outreach have already been mooted (Harrison and Waller 2017; Thomas 2000; Clements and Short 2020; Formby et al. 2020). A professional body could ensure that practitioners have the knowledge and understanding to collaboratively produce and, more importantly, to effectively use these alternative approaches as the backbone of both delivery and evaluation. Emslie notes that in professionalising youth work ‘the process would involve exploring the distinct contribution of youth work and how this ought to be achieved’ (Emslie 2013, 126). The same process for outreach practitioners could have a similar impact.

A professional body would also provide providers with evidence that they are engaging in robust evaluation of their publicly funded access and participation activities, without the need to rely on ‘gold-standard’ yet often inappropriate methods. In turn, this would give the public more confidence that the OfS is fulfilling its remit as the regulator for students by challenging practice that could have a detrimental effect on widening participation.

Conclusion

To create a truly transformative and inclusive higher education sector, outreach practitioners and evaluators must work in collaboration with the regulator to create a dialogue between policy and practice. The creation of a professional body would provide a channel through which the regulator could effectively create that dialogue with practitioners. By doing this, the regulator would gain the assurance it seeks and, critically, better understand what needs to change within higher education to genuinely widen participation. In a sector where evidencing causal change is messy, non-linear and complex, practitioners and evaluators must be confident that the methods used to determine success are fit for purpose.

Formalising practice into a profession creates a space for innovation and for the exploration of alternative methods of measuring the success (or otherwise) of access and outreach programmes such as Uni Connect.

The collaborative nature of outreach work, and specifically of the Uni Connect programme, provides an organic context in which to transform performative evaluation into a values-based approach in which inclusion and equity provide the cornerstone of future policy and practice within the higher education sector.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Office for Students under Tender ‘Supporting NCOP evaluation practice’.

Notes on contributors

Naomi Clements

Naomi Clements leads on evaluation for the Southern Universities Network, a partnership of six universities across the south coast of England. She has experience in a variety of widening participation, access and student recruitment roles. Naomi is currently completing her EdD at the University of Portsmouth, with her research focusing on frictions between evaluation guidance and evaluation practice in a strategically funded outreach programme.

Sara Davies

Sara Davies is a Senior Research Fellow at the Personal Finance Research Centre, University of Bristol. Sara works broadly on financial behaviours, as well as the impact of bursaries and outreach on the student experience. She has been evaluating the Future Quest programme since 2017. https://www.bristol.ac.uk/people/person/Sara-Davies-00f2c28b-6c41-49d6-8dc7-1d8c69288611#publications

Anna Mountford-Zimdars

Anna Mountford-Zimdars is a professor in the Graduate School of Education at the University of Exeter and a Principal Fellow of the HEA. She is the founding and academic director of the Centre for Social Mobility at Exeter. Anna works broadly on student access to, progression within, and success beyond higher education. Her website is available at: http://socialsciences.exeter.ac.uk/education/staff/profile/index.php?web_id=Anna_Mountford-Zimdars

References