ABSTRACT
Youth work’s informal and youth-centred nature raises challenges for evaluation, challenges that are intensified by the growing dominance of measurement, market values and surveillance in the context of the neoliberal restructuring of youth services. This article builds on Griffiths’ (2012, Why Joy in Education Is an Issue for Socially Just Policies. Journal of Education Policy 27 (5): 655–670) philosophical argument for valuing the intrinsic contribution of education, thus conceptualising evaluation as encompassing more than measuring outcomes. It reports the findings of a three-year qualitative study in eight open youth work settings in England that investigated the perspectives of 143 young people, youth workers and policy makers on evaluation in youth work. While young people and youth workers had often participated in evaluations they found meaningful, some approaches to impact measurement were experienced as too formal, intrusive, insensitive and burdensome. The article argues that evaluation and accountability processes must be practice-informed, youth-centred, and anti-oppressive. It recommends the participatory and collaborative development of diverse methods and approaches to evaluation that ‘capture the magic’ of youth work while enabling further reflection and development of practice.
Introduction
How should youth work be evaluated? This has been a controversial question in youth work research and practice internationally over the last decade (Duffy 2017; Lovell et al. 2016; Ord 2014; Taylor and Taylor 2013). Aiming to contribute to this debate by providing evidence of the perspectives and experiences of young people and youth workers, this article draws on a three-year qualitative study investigating impact measurement and evaluation in youth work in England (2018–2021).
Our discussion is rooted in an international policy context in which requirements to ‘prove’ impact can constitute a barrier to equitable and community-centred practices (Baldridge 2019; Coultas 2020). Dominant mechanisms of accountability in public services are based on a neoliberal logic of predefined, standardised and measurable outcomes that encourage competition and comparison between services, and the conversion of data into claims of ‘value for money’. This provides a simplistic and individualised view of how diverse experiences and relationships contribute to young people’s lives in a wider political context of social inequalities (de St Croix, McGimpsey, and Owens 2020). Within this international context, England provides an interesting focus due to its long history of youth work, set against severe cuts in recent years (YMCA 2020), alongside the introduction of a new ‘youth impact agenda’ that was, at least initially, based predominantly on performance targets, predefined outcomes, and impact measurement (de St Croix 2018).
These challenges are particularly acute in open youth work, a practice of informal education encompassing youth clubs, detached or street-based youth work, and online group work. Open youth work is open-ended in terms of who participates, how, why, when, and for how long; a crucial aspect is that young people attend by choice (Davies 2021). Its guiding logic conflicts with traditional impact measurement methods based on predefined individual outcomes, measured through standardised questions asked of young people before and after an intervention. Open youth work is not an intervention, but rather a practice of indeterminate length; young people engage flexibly on their own terms and outcomes emerge naturally through this process of engagement. While funders and impact measurement advocates are increasingly aware of these tensions (Hill, Scanlon, and Anderton 2019) and there is a diverse range of practice in the evaluation field, this article is concerned with how the growing impetus towards the measurement of outcomes and impact is experienced on the ground.
In the next section we draw on literature to elaborate further on the disjuncture between youth work practice and dominant approaches to evaluation, first discussing the policy context, and then conceptualisations of evaluation beyond outcomes. We then introduce our study and its methodology. In the findings section, we discuss what evaluation looks like in youth work, the challenges experienced by young people and youth workers, and how they navigate these challenges. We finish by arguing for a practice-informed, youth-centred, anti-oppressive approach to evaluation in policy and practice, both within and beyond youth work.
Background
We situate this article in conversation with scholarship and practice that engages critically and politically with evaluation, impact measurement, and accountability, within and beyond youth work, in England and beyond. We start from a concern that is widely discussed in youth work practice, research and activism: a disjuncture between the informal and process-oriented nature of youth work, and requirements for prescriptive monitoring and evaluation (e.g. Duffy 2017; Fusco et al. 2013; Gee 2020; Ord 2014; Taylor and Taylor 2013). The literature highlights key tensions: (1) predefined outcomes and ‘pre and post’ measurement are incompatible with the open timescale and purpose of youth work; (2) anti-oppressive and youth-centred principles are undermined by labelling, surveillance, and externally defined outcomes; and (3) bureaucratic systems of measurement clash with informality. These tensions cannot simply be resolved by selecting ‘better’ evaluation tools, because they raise political questions: Who decides how youth work is evaluated? How is evaluation shaped by structural inequality? How does it challenge and/or reproduce inequalities? How does evaluation challenge and/or enable neoliberal processes of marketisation, competition and performativity? Thus, it is important to start with a focus on policy and politics.
Evaluation in youth policy
In policy terms, youth work is often seen as a ‘Cinderella service’, lacking recognition, status, and resources. While operating ‘under the radar’ enables some flexibility, it renders youth work particularly vulnerable to economic downturns and policy trends. Both in England and internationally, youth work and related practices (such as afterschool, youth development, youth organising, and community education) rely on multiple sources of income including from local and national government, charities, schools, housing providers, philanthropy, fundraising, and revenue (e.g. room rental). This means that policy and practice on evaluation are influenced by multiple resource holders, and by an overall context of precarious and short-term funding.
Evaluation in youth work is currently shaped by three broad interlinking policy agendas. The first and most visible is austerity and its accompanying cuts. In England, where our study took place, the effects are stark, particularly for low income, working class, ethnically minoritized and rural communities. Local authority spending on youth work in England in 2018 was almost £1 billion (70%) lower than in 2010 (YMCA 2020). In London, 130 of 300 youth centres closed between 2011 and 2021; the average number of youth workers per borough fell from 48 to 15 (Berry 2021). Austerity acts discursively as a rationale for evaluation, exhorting organisations to provide stronger justifications of impact to survive. Yet cuts are also an impediment to evaluation, reducing staff time and creating an atmosphere of insecurity that militates against evaluation for learning and honest communication of impact and its limitations. Community representatives have drawn attention to the ‘data burden’ that especially affects smaller organisations (Darking et al. 2016; see also Lovell et al. 2016); it is important to note that these organisations are more likely to be embedded within working class, Black, ethnic minority, disabled, and/or LGBTQ+ communities.
The second policy agenda shaping evaluation in youth work is the ‘youth impact agenda’: a broad agreement amongst influential individuals and organisations that youth services must measure their impact, with a common (even if not universal) preference for quantitative techniques and validated psychological tools. This agenda grew in the wake of a House of Commons (2011) enquiry on services for young people that criticised the youth sector for a lack of ‘credible’ evidence of impact. Although the youth impact agenda reflects and reproduces neoliberal managerialism, its adherents are more likely to justify it in terms of science and truth (Duffy 2017). Yet there are debates here about whose truth counts; youth worker and activist Farzana Khan has pointed out that services based on Eurocentric or white methods ‘reproduced harm for marginalised communities’ (interviewed by Akram 2019) and youth programmes are too often ‘framed as sites of containment and control for Black and other minoritized youth’ (Baldridge 2019, 15). Impact measurement implies a shared understanding of what constitutes positive change, often neglecting the effects of unequal social and economic conditions (Recovery in the Bin 2016). In recent years, impact advocates have generally moved away from a focus on ‘proving’ the value of youth work towards broader concerns of practice improvement (Hill, Scanlon, and Anderton 2019); however, quantitative measurement remains dominant.
This prominence is partly incentivised by the third policy agenda influencing evaluation in youth work: decision-making based on ‘social value’ or ‘social return on investment’. In the UK, the interaction of Treasury-led policy and the evaluation practices of a large, short-term youth programme (the National Citizen Service) can be seen as a ‘social investment machine’ that creates new and experimental ways of calculating financial impact from outcomes measures (de St Croix, McGimpsey, and Owens 2020). The results are beginning to be seen in evaluation reports, where youth work’s benefits are ‘ascribed a monetary value that signals its relative value compared to other things’ (Kenley and Pritchard 2019, 5; see also McMahon 2021 on ‘value for money’ discourses in Irish youth work).
These three policy agendas are mutually reinforcing, while producing tensions and enabling contestation. Economic social value claims generally rely on standardised measurement tools to assess young people’s skills or behaviours at ‘baseline’ and on completion of a programme. Austerity is used to justify both social value and impact measurement agendas. Yet it is not as simple as impact measurement only being imposed by funders or government; organisations might measure their impact to mark themselves out as competitive (Arvidson and Lyon 2014). Technologies of measurement, comparison and control create cultures of performativity, which restrict what individuals and organisations can do, while rewarding engagement in these processes (Ball 2003; de St Croix 2018). Thus, dominant approaches to impact measurement and evaluation reproduce neoliberalism through incentivising competition, cost-unit comparisons, and individual progress narratives. This tends to devalue young people’s and practitioners’ understandings of the value of youth work, as we now go on to discuss.
Conceptualising evaluation beyond outcomes
Outcomes-based measurement has been criticised by practitioners, activists and researchers for relying on an outside imposition of what counts as ‘success’ and narrowing the complexities of human experience (e.g. Coultas 2020; Lowe 2013; Recovery in the Bin 2016; Taylor and Taylor 2013). The value of youth work is demonstrably wider and more nuanced than what can be measured. Ord et al. (2021) draw on 844 young people’s stories of youth work impact in six European countries to argue that many of the most prominent themes identified by young people – such as enjoyment, friendship, and atmosphere – are neglected in policy or seen as merely instrumental:
Friendship in policy terms is often framed as a means to an end – something that is important in order for youth work to be successful and achieve its policy priorities – whereas young people see it as an end in itself. For young people, friends and friendship are at the very heart of youth work. (Ord et al. 2021, 12–13)
To support our thinking on the intrinsic value of youth work, we are informed by Morwenna Griffiths’ (2012) philosophical discussion of social justice and joy in education. Griffiths argues that the value of education must not be limited to its instrumental outcomes (such as economic benefits), nor even to individual outcomes (such as autonomy and citizenship), but should include the intrinsic, integral purpose of education as:
… good in itself, as part of what makes a good life good, not just as part of what is needed to produce the good life. And here the experience, the process is significant. Aristotle argued convincingly that the good life is experienced as pleasurable as well as being good … . (Griffiths 2012, 656)
These central and integral aspects of youth work can be undermined by funder requirements for evaluation. For example, in Denmead’s (2021) ethnographic research at a youth arts centre in the USA, young people from marginalised communities valued ‘chillaxing’ in the centre, explaining that resting and being unproductive was necessary for them to recover from racist and classist experiences at school. Yet the centre was required to use an auditing tool that measured aspects such as whether materials were ready, activities were clearly explained, and ‘students’ completed activities without distraction. While such requirements might appear harmless, they militated against the open structure and timing that young people preferred. While we acknowledge that some funders take a more sensitive and practice-based approach, we focus in this article on the key challenges and tensions in evaluation, as they are experienced by young people and youth workers.
Methodology
This article draws on a three-year study (2018–2021) that investigated how impact measurement and evaluation tools and processes are experienced and enacted by young people and practitioners in youth work settings. The study took a qualitative approach based on 87 interviews and focus groups with 143 young people, youth workers and policy influencers in England (16 of whom took part in two or more interviews or focus groups), alongside 73 sessions of participant observation. Research took place in eight open youth work settings, purposively selected to encompass a diversity of youth work approaches, locations, and organisation types (see Table 1). This article draws on the perspectives of the 58 young people (mostly aged 13–19) and 59 youth workers who took part, and on fieldnotes from participant observation.
Table 1. Participating organisations.
The research began in the first half of 2019, when we made several visits to each of the eight youth work settings. We participated in youth work sessions, debriefs and team meetings, and undertook in-depth semi-structured interviews with 14 managers and administrators, and 22 focus groups with 29 youth workers and 37 young people. We then selected two of these settings (Melham and Seaside) for more in-depth research, enabling us to build a deeper contextualised understanding of evaluation and monitoring in these contrasting settings over time. Our longer engagement in these settings from December 2019 to October 2020 enabled greater relationship building, fluidity, collaboration and creativity. Alongside further regular participant observation in youth work sessions, and interviews and focus groups with 40 young people, youth workers and managers, we were able to use methods that emerged from young people’s needs and interests, and the rhythm of work in these organisations. This included a tour of a youth club; photograph and music elicitation, through the sharing and discussion of photographs and songs that relate to youth work (Levell 2021; Varvantakis and Nolas 2021); and a ‘paper chatterbox’ with questions selected and asked in collaboration with young people. This approach was akin to what Batsleer and Duggan (2021) have called ‘youth work as method’, drawing on principles of anti-oppressive youth work practice and youth participatory research to engage creatively and flexibly with young people and youth workers, in accordance with their wishes and interests.
From March 2020 the research was impacted by the Covid-19 pandemic; as a result, some of the participant observation, interviews and focus groups at Seaside and Melham took place online. The implications of the pandemic and lockdowns are not the focus of this article; yet it is important to note that, while online interviews and focus groups worked well where we had already built relationships pre-pandemic, it was more difficult to engage with new participants (see Arya and Henn 2021). In addition, while Seaside Youth Club quickly moved online and we were welcomed to continue our participant observation, Melham’s detached work was more seriously disrupted in the initial months of the pandemic, when even outdoor social contact was not permitted.
The research was guided by situated principles of ethical youth research and based on principles of trust, respect and informed consent (Batsleer and Duggan 2021; te Riele and Brooks 2013). While written information sheets and consent forms were provided for participants and young people’s parents and guardians, verbal discussion of the research process was of vital importance in the informal context of youth organisations, and consent was seen as an ongoing process rather than a one-off event (Hill 2017). The names of youth organisations and their participants were pseudonymised (some participants selecting their own pseudonyms), and any details likely to identify organisations or individuals were omitted. Interviews and focus groups were audio recorded and professionally transcribed. We wrote fieldnotes immediately after each session, following a loosely structured format (noting thoughts before, our role, what happened in the session, any observations on evaluation and monitoring, key moments, key words and emerging themes); fieldnote extracts have been minimally edited for clarity but without changing substantive details.
We took a collaborative approach to data analysis, aiming to be reflective, fluid, intuitive and responsive to the data whilst remaining grounded in a consistent approach (Gewirtz 2001). We used selected tools from constructivist grounded theory, including coding, theme identification, and written analytical memos (Charmaz 2006). Initial manual coding identified potential themes and questions, which we discussed in detail to generate broad baseline codes. We used coloured coding stripes and annotations on a selection of paper transcripts using these codes, reflecting on this process before transferring all fieldnotes and transcripts onto data analysis software (NVivo), where we shared out data for analysis as well as coding and comparing a selection of the same transcripts and fieldnotes to ensure consistency. Regular data analysis meetings offered a systematic way to reflect on our analysis and identify tensions, contradictions, omissions, questions and linked themes. As researchers with professional backgrounds in youth work practice, we engaged with social justice driven imperatives while remaining vigilant to our own preconceptions and perspectives, supported through a process of personal and collaborative reflection, and ongoing engagement with wider communities of youth scholarship and practice.
Findings
Throughout our study, youth workers and youth work managers expressed enthusiasm for evaluating their work, and commitment to making this evaluation meaningful for young people and for themselves as practitioners. The title of this article is inspired by part-time youth workers at Journeys, discussing an evaluation activity with young people:
Fern: … we posed these questions for a conversation, and we ended up having a really rich conversation come out of it with people really expressing a huge diversity of opinions, which if we’d just gave them a form to fill out, none of these discussions would’ve happened. And there wouldn’t have been the benefit for the young people of the conversations. It would’ve very much felt like ‘ah, but we’re doing this because the youth workers have asked us to do it, and then we’ll just forget about it’.
Zayn: Yeah. Were you saying, George, about capturing the magic of the work, yeah?
George: Yeah […] you did a write up of that, and that captured a bunch of this stuff that went on. But also it’s like, just knowing the magic is happening when you’re doing an evaluation is pretty special.
Current approaches to evaluating youth work
We start by outlining the main approaches used by the eight organisations in our study to evaluate open youth work (Table 2). The table is not exhaustive, and does not include details of frameworks or processes underpinning evaluation (e.g. theory of change; participatory design), nor the databases used to record attendance, participation, demographic data, outcomes and/or activities. It clearly shows that, while there are some commonly used methods, there is also great diversity.
Table 2. Evaluation in youth work settings.
Each organisation’s approach to evaluation was shaped by multiple considerations, including the requirements of resource providers, pragmatic aspects (resource constraints, staff confidence, or young people’s willingness), and organisational and professional principles and values, such as a concern for ethics, anti-oppressive practice, and young people’s perspectives. In post-austerity England, it is perhaps unsurprising that funding and managerial factors were highly influential. Some external funders engaged with practice realities and enabled flexibility; for example, involving grantees in developing evaluation frameworks, or enabling them to choose their own outcomes and evaluation methods. In contrast, some youth services, funders and commissioners required extensive monitoring that was regarded by youth workers as labour intensive and often inappropriate.
In each organisation, conversations played a central role in evaluation, including meetings and discussions with young people, informal chats during sessions, and staff debriefs. Every organisation used databases to record attendance and participation. Most incorporated creative and participatory approaches, case studies and/or storytelling. Questionnaires were common, but these were rarely validated impact measurement tools; the Life Effectiveness Questionnaire at Opal was an exception (and was seen by the manager, Wilson, as unsuited to an open youth work context).
As discussed earlier, we take an open and expansive view of ‘what counts’ as evaluation. Clearly, evaluation has diverse purposes – including organisational learning, practice and professional development, and accountability to resource providers. In some settings, evaluation focused on data collection for upwards accountability rather than learning and development; in others, there was an emphasis on reflective practice and little organised or formal evaluation; and in some organisations, extensive thought and effort was expended in developing creative, youth-centred approaches that enabled learning and reflection alongside providing information to funders. A common theme was that youth work is a particularly challenging arena for evaluation, as we now go on to discuss.
Challenges in evaluation practice
During our study, we experienced many versions of youth work: busy, chaotic, noisy, ‘drop in’ youth clubs; workshops where trans young people enjoyed snacks while reflecting thoughtfully on their hopes and dreams; lively discussions outside the shops after school; relaxed chats while customising old clothes. The youth work process clearly relied on a complex interaction of skilled youth workers, relationships with peers and adults, stimulating conversation, interesting activities, and the chance to chat, relax, take up space, and simply ‘be’. Given this diversity and complexity, it is hardly surprising that youth work is difficult to evaluate. Here, we highlight young people’s and youth workers’ perspectives on the main challenges inherent in evaluating youth work. We conceptualise these as: formalising the informal; intrusive and inappropriate questions; and bureaucratic burden.
Formalising the informal
Most young people and youth workers emphasised that youth work evaluation should suit the distinctively informal nature of youth work. This was commonly characterised by a preference for conversation over form-filling:
Me, I hate writing on paper. So like being able to speak, and like literally just speak thoughts straight out of your mind it’s so much easier. Because like otherwise you have to think of words. And I don’t like thinking of words. (Delilah, young person, Seaside)
… young people don’t want to fill out paper, they are like, ‘why are we doing this?’ But if we sort of had like a board where they could write their own quotes, or captures like random words, I feel that that would be more effective … (Mel, youth worker, Melham)
… school is not necessarily a hugely positive experience for all of them. […] And youth club doesn’t have to be like that for them. (Nicole, youth worker, Opal)
… there’s so many different creative ways but form filling, it does just, it wouldn’t fit with here. It would make it feel like school and it would, I dunno, take away from just what this space and what the agenda is all about. And about it being youth led … (Holly, youth worker, Seaside)
Intrusive and inappropriate questions
A related issue was the reductive and labelling nature of some forms of evaluation and monitoring that conflicted with anti-oppressive practice, especially where these were standardised and not co-designed with youth workers and young people. This led to some awkward moments:
In the ‘gender’ section, James [the manager] had indicated for people to circle M or F. One young person requested ‘prefer not to say’. This caused some discussion later as James hadn’t considered that might come up. I reflected on how brave it was for the young person to flag this omission … (Fieldnote, Melham, detached session)
During the discussion, a youth worker explained that it would help with funding if young people could provide their postcode and tick ‘yes or no’ to whether they or their family receive benefits. There was an uncharacteristic silence as young people passed the paper round. Despite it being made clear that it was optional and confidential, it felt out of kilter with the sensitive and inclusive approach in the rest of the session. (Fieldnote, Journeys, Saturday session)
… some days you don’t really want to think about that question, and it can make you think too far into it, and then you can be left thinking about it for the rest of the day […] sometimes the things that it says on those questionnaires can actually give you ideas, instead of helping you. (Luna, young person, Seaside)
Bureaucratic burden
Evaluation and monitoring created significant demands on organisations, youth workers and young people. This demand became a burden where methods appeared unfit for purpose, unwieldy, and disproportionately time-consuming:
Sheldon mentioned so many different evaluation and monitoring tools … there was almost a sense of fatigue as he listed them … I'm struck by the weight of this sort of bureaucratic burden … not only the way organisational change happens but by the pressure put on staff to embrace, respond, enact ever-changing systems of evaluation without ever being part of their design. (Fieldnote, Melham, detached session)
Yeah, so [council database] is like a swear word in the youth work world. … you’re gonna put them on this ridiculous database that takes fucking ages, and use a lot of your capacity which could be working with young people … (Dawn, youth worker, Seaside)
It used to take up a really long time. And then it would be frustrating cause you wouldn’t be able to be doing your youth work, but you’d be trying to evidence youth work that you didn’t have time to do. (Nora, youth worker, Riverpath)
… it’s very longwinded now … I think some of the questions kind of say the same thing but just in a different language, so that can take time. So yeah, you debrief and then that goes onto [database], but I don’t really think the managers really look at it. (Mathew, youth worker, Melham)
Luke (manager) listed several new forms that would be introduced and a new system in which each staff member was expected to identify two young people in advance of each session and an intended outcome, writing up what happened afterwards. This was agreed to be a bad idea … Throughout the discussion, the part-time workers (who will need to implement this new system) looked really fed up – arms folded, frowning, looking away or at the floor. (Fieldnote, Fairlight, staff meeting)
You arrange quite an informal environment … and then at some point someone steps in and goes, ‘cool, ok, how can we turn this chaotic organic madness into data?’ (Dylan, volunteer, Seaside)
It sometimes seems a shame to have to translate life and experiences and great conversations and stuff into something that answers a question. (Gareth, youth worker, Dove Street)
Navigating the challenges
In this final section, we share how the challenges of evaluation are navigated in practice:
During a chat with two young women that ranged from school to K-pop, Jenny mentioned that she needed to write up a report for the funder, and asked how they would prefer to feed in. They were unsure so she gave examples: a questionnaire, a recorded conversation, film, putting photographs on the wall and adding notes or captions, or using big paper on tables. One said a questionnaire would be ok; the other looked doubtful and suggested a conversation. They discussed how a conversation could be captured, such as by filming or audio recording. (Fieldnote, Dove Street young women’s group)
Across the settings, approaches used to navigate these challenges included:
Group conversations.
Flipchart sheets with questions, post-its, coloured pens.
Creative methods (e.g. video/audio/photography).
Fun activities to rate statements/questions.
Human thermometer (hands indicate level of agreement).
Anonymous suggestion box.
‘Speed-dating’ conversations between young people and funders.
Storytelling.
Simple, flexible questionnaires.
Staff debriefs and reflection.
… the user feedback is a list of like twelve questions asking things like, do you trust staff? Do you feel respected? […] We try and rotate the question, so we don’t ask the same question every time. So we’ll try and ask say two questions in a session. We don’t always get all of them to answer it either, so if we’re doing an activity with them there might be questions behind the stuff and we’ll go right, just do a yes or a no, do a tick or a cross and they’ll quickly fill it out. So the idea of that one, it takes thirty seconds for the young person to do. (Zara, Programme Manager, Vaults)
Aaron [manager] suggested they think about their experience of recent youth club gigs and asked them their views individually. For the question ‘how will you know if it is successful?’, Archie (who makes electronic music) suggested that at a successful gig, young people would watch all the acts, including types of music they don’t normally listen to and performers they don’t know, and that they would enjoy it. Aaron asked how we would know if young people had enjoyed it; Archie suggested they would appear to be listening, would stay in the gig space, might dance, and would encourage each other. (Fieldnote, Seaside, online youth club)
Conclusion
Throughout our study, youth workers sought to make evaluation meaningful, ‘capturing the magic’ in the sense of recording the impact of practice, as well as creating the possibility of magic through evaluation. Such evaluation is meaningful, participatory, flexible, and enables young people and youth workers – particularly those from marginalised social groups – to take part in the co-creation of knowledge. This is a critical, dialogical process; it is not based on an assumption that youth work is always and only positive in its contribution to young people’s lives, but starts from a perspective that evaluation is about reflection, mutual learning, process and practice development, rather than surveillance, top-down managerialism, categorisation and extraction. It is a process of growth, challenge and change that seeks to shape policy and funding by centring the perspectives and experiences of young people, and to shape practice by creating space for reflection and learning.
However, the potential for meaningful, participatory evaluation can be constrained by inappropriate demands for data, and through evaluation and monitoring processes that are experienced as boring, meaningless, time-consuming, triggering, intrusive, or reinforcing of inappropriate categorisations and individual progress narratives. That this happens even in youth work, with its central tenets of informality and openness, highlights the hegemonic nature of impact measurement in a neoliberal context. If evaluation in youth work is to be practice-informed, youth-centred and anti-oppressive, young people and practitioners must be at the heart of thinking not only about methods but also about the wider policy context – the conditions that enable good evaluation practice to take place.
Earlier in this article, we engaged with Griffiths’ (Citation2012) discussion of the ‘joys and delights’ of educational experiences, the integral ‘good’ of education beyond what it produces for individuals or communities. In our study, we observed and experienced joys and delights, fun and inspiration, as well as moments of dreariness, conflict and frustration – a valuable, messy range of experiences and emotions. To enable youth work that is truly magical, it is essential to listen closely and carefully to what young people and youth workers value about youth work, and how evaluation can best support this. While this article has focused on examples of youth work that are highly valued by young people, we recognise that evaluation may have a role to play in identifying and preventing problematic or harmful practice. This is outside the scope of this article; in brief, however, we are doubtful that the metrics-based systems of evaluation preferred in policy are particularly effective in harm prevention, and we continue to argue that young people and practitioners must be central to thinking about these issues. As we have argued elsewhere, evaluation should suit the setting, challenge inequalities, and capture and value the everyday and intrinsic elements of practice as well as emerging outcomes (Doherty and de St Croix Citation2019). This involves taking an expansive, holistic, flexible, and inclusive view of what ‘counts’ as impact, celebrating creative and experimental approaches, while challenging wider social injustices in policy and practice. We suggest that this way of looking at impact is relevant beyond youth work, to other practices with young people and communities more generally.
Before finishing, we return to the part-time youth workers from Journeys, who were asked what they would recommend to policy makers and funders:
George: Leave your desk and come and see us. [Others: yeah!]
Harry: Give us access to funding without the criteria. Let us make the criteria because we do know what we’re doing […] And like, we ask the young people what they want as well. Like just let us make the criteria, rather than telling us what we’re meant to be doing.
George: We’re a great bunch, we’re gonna like make some cool evaluation stuff, that would be a good idea.
Zayn: Yeah, I’d like to kind of urge everyone to kind of take a leaf out of our radical youth work and do evaluation more creatively, and in different ways, and you know immediately in the space, with the people involved afterwards, and you know in discussions and writing it down, and just capturing stories. All that stuff, it’s so rich … […] it compels us, it touches us. And that’s what makes us want to keep going, through that whole cycle of evaluation, it takes us through to the next thing.
Acknowledgements
This article is based on research funded by the ESRC, reference ES/R004773/1. The authors would like to thank Sharon Gewirtz and Sorele Cohen for commenting on an earlier draft, PALYCW and Nancy Stephenson for supportive writing retreats, anonymous peer reviewers for useful feedback, and, most of all, our research participants and the many other youth workers, young people, evaluation practitioners and others who have informed this study, whether through our Advisory Group, feedback at events, or informal conversations. A partial dataset associated with this study (excluding fieldnotes and transcripts where consent was not provided or anonymity might be compromised) is available at UK Data Service Reshare to enable wider use of the data; see https://reshare.ukdataservice.ac.uk/855316/, doi:10.5255/UKDA-SN-855316.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Notes
1 Ethics approval was granted by King’s College London’s Research Ethics Office, refs LRS-18/19-8799 and HR-19/20-13776.
2 The Journeys manager, Ira, said in an interview that this reporting was ‘really not suited to our organisation’, and that they would not reapply to this funder.
3 A practice resource accompanying this article shares some of these methods in more detail, suggesting questions for reflection.
4 For example, Journeys can assign young people a code number, enabling the organisation to track individual scores for three outcomes over time (showing ‘distance travelled’). However, young people can choose to participate in evaluations without this tracking aspect.
References
- Akram, Sophia. 2019. "The Revolution Will Not Be Funded, But That Won’t Stop Her." The New Arab. https://english.alaraby.co.uk/analysis/voices-shake-developing-creative-responses-social-injustice.
- Arvidson, Malin, and Fergus Lyon. 2014. “Social Impact Measurement and non-Profit Organisations: Compliance, Resistance and Promotion.” Voluntas 25: 869–886. doi:10.1007/s11266-013-9373-6.
- Arya, Dena, and Matt Henn. 2021. “COVID-ized Ethnography: Challenges and Opportunities for Young Environmental Activists and Researchers.” Societies 11 (2): 58. doi:10.3390/soc11020058.
- Baldridge, Bianca. 2019. Reclaiming Community: Race and the Uncertain Future of Youth Work. Stanford: Stanford University Press.
- Ball, Stephen. 2003. “The Teacher’s Soul and the Terrors of Performativity.” Journal of Education Policy 18 (2): 215–228.
- Batsleer, Janet, and James Duggan. 2021. Young and Lonely: The Social Conditions of Loneliness. Bristol: Policy Press.
- Berry, Sian. 2021. London’s Youth Service Cuts 2011-2021: A Blighted Generation. London: Green Party.
- Charmaz, Kathy. 2006. Constructing Grounded Theory. London: Sage.
- Chouhan, Jagdish. 2009. “Anti-Oppressive Practice.” In Work with Young People, edited by Jason Wood and Jean Hine. London: Sage.
- Coultas, Clare. 2020. “The Performativity of Monitoring and Evaluation in International Development Interventions: Building a Dialogical Case Study That Situates ‘The General’.” Culture and Psychology 26 (1): 96–116. doi:10.1177/1354067X19888192.
- Darking, Mary, Alison Marino, Bethan Prosser, and Carl Walker. 2016. Monitoring, Evaluation and Impact: A Call for Change. Position Statement. Brighton: Community Works.
- Davies, Bernard. 2021. "Youth Work: A Manifesto Revisited – At the Time of Covid and Beyond, Youth and Policy." https://www.youthandpolicy.org/articles/youth-work-manifesto-revisited-2021/.
- Denmead, Tyler. 2021. The Creative Underclass: Youth, Race and the Gentrifying City. Durham: Duke University Press.
- de St Croix, Tania. 2018. “Youth Work, Performativity and the New Youth Impact Agenda: Getting Paid for Numbers?” Journal of Education Policy 33 (3): 414–438. doi:10.1080/02680939.2017.1372637.
- de St Croix, Tania, Ian McGimpsey, and John Owens. 2020. “Feeding Young People to the Social Investment Machine: The Financialisation of Public Services.” Critical Social Policy 40 (3): 450–470. doi:10.1177/0261018319854890.
- Doherty, Louise, and Tania de St Croix. 2019. "The Everyday and the Remarkable: Valuing and Evaluating Youth Work." Youth and Policy. https://www.youthandpolicy.org/articles/valuing-and-evaluating-youth-work/.
- Duffy, Deirdre. 2017. Evaluation and Governing in the 21st Century: Disciplinary Measures, Transformative Possibilities. London: Palgrave Macmillan.
- Fusco, Dana, Anne Lawrence, Susan Matloff-Nieves, and Esteban Ramos. 2013. “The Accordion Effect: Is Quality in Afterschool Getting the Squeeze?” Journal of Youth Development 8 (2): 1–11.
- Gee, Ricky. 2020. “Informal Education as a Derridean Gift: A Deconstructive Reading of the Principles Guiding Youth Work Practice Within Neoliberal Policy Regimes.” Journal of Applied Youth Studies 3: 103–113. doi:10.1007/s43151-020-00021-5.
- Gewirtz, Sharon. 2001. "Post-Welfarism and the Reconstruction of Teachers’ Work: Reflections on the Analytic Process." Social Policy Dissertation Offprints, D866. Milton Keynes: Open University.
- Ginwright, Shawn, and Julio Cammarota. 2002. “New Terrain in Youth Development: The Promise of a Social Justice Approach.” Social Justice 29 (4): 82–95.
- Griffiths, Morwenna. 2012. “Why Joy in Education Is an Issue for Socially Just Policies.” Journal of Education Policy 27 (5): 655–670. doi:10.1080/02680939.2012.710019.
- Hill, Phoebe. 2017. "The Formal vs Informal Clash: The Challenges of Ethnographic Research with Young People in a Youth Drop-in Context." Youth and Policy. https://www.youthandpolicy.org/articles/the-formal-vs-informal-clash-the-challenges-of-ethnographic-research-with-young-people-in-a-youth-drop-in-context/.
- Hill, Matthew, Karen Scanlon, and Ed Anderton. 2019. Youth Investment Fund Learning and Insight Paper 1: A Shared Evaluation Framework for Open Access Youth Provision. London: NPC.
- House of Commons Education Committee. 2011. Services for Young People. London: The Stationery Office.
- In Defence of Youth Work. 2009. "Open Letter." In Defence of Youth Work Website. https://indefenceofyouthwork.com/the-in-defence-of-youth-work-letter-2/.
- Kenley, Anoushka, and David Pritchard. 2019. Youth Investment Fund: Learning and Insight Paper 2: Background to the YIF Economic Simulation Model. London: NPC.
- Levell, Jade. 2021. “Music Elicitation: Letting Research Participants Call the Tune.” The Sociological Review Magazine 9. https://thesociologicalreview.org/magazine/november-2021/methods-and-methodology/music-elicitation/. doi:10.51428/tsr.iwpb8007.
- Lovell, Alexander, Uzo Anucha, Rebecca Houwer, and Andrew Galley. 2016. Beyond Measure? The State of Evaluation and Action in Ontario’s Youth Sector. Toronto: Youth Research and Evaluation eXchange (YouthREX).
- Lowe, Toby. 2013. “New Development: The Paradox of Outcomes – The More We Measure, the Less We Understand.” Public Money and Management 33 (3): 213–216. doi:10.1080/09540962.2013.785707.
- McMahon, Sinead. 2021. "What’s the ‘Problem’ With Irish Youth Work? A WPR Analysis of Value for Money Policy Discourse and Devices." Youth and Policy. https://www.youthandpolicy.org/articles/whats-the-problem-irish-youth-work/.
- McPherson, Charlotte. 2020. “‘It’s Just So Much Better Than School’: The Redemptive Qualities of Further Education and Youth Work for Working-Class Young People in Edinburgh, Scotland.” Journal of Youth Studies 23: 307–322. doi:10.1080/13676261.2019.1599103.
- Ord, Jon. 2014. “Aristotle’s Phronesis and Youth Work: Beyond Instrumentality.” Youth and Policy 112: 56–73.
- Ord, Jon, Marc Carletti, Daniele Morciano, Lasse Siurala, Christophe Dansac, Sue Cooper, Ian Fyfe, et al. 2021. “European Youth Work Policy and Young People’s Experience of Open Access Youth Work.” Journal of Social Policy 51 (2): 303–323. doi:10.1017/S0047279421000143.
- Recovery in the Bin. 2016. "Unrecovery Star." Recovery in the Bin Website. https://recoveryinthebin.org/unrecovery-star-2/.
- Ritchie, Daisy, and Jon Ord. 2017. “The Experience of Open Access Youth Work: The Voice of Young People.” Journal of Youth Studies 20 (3): 269–282. doi:10.1080/13676261.2016.1212162.
- Spence, Jean. 2004. “Targeting, Accountability and Youth Work Practice.” Practice: Social Work in Action 16 (4): 261–272.
- Taylor, Tony, and Marilyn Taylor. 2013. “Threatening Youth Work: The Illusion of Outcomes.” In Defence of Youth Work Website. https://indefenceofyouthwork.files.wordpress.com/2009/05/threatening-yw-and-illusion-final.pdf.
- te Riele, Kitty, and Rachel Brooks, eds. 2013. Negotiating Ethical Challenges in Youth Research. New York: Routledge.
- Varvantakis, Christos, and Sevasti-Melissa Nolas. 2021. “Picturing What Really Matters: How Photo-Story Research Makes the Personal, Visible.” The Sociological Review Magazine 9. https://thesociologicalreview.org/magazine/november-2021/methods-and-methodology/picturing-what-really-matters/. doi:10.51428/tsr.mtsg8567.
- YMCA. 2020. Out of Service: A Report Examining Local Authority Expenditure on Youth Services in England and Wales. London: YMCA.