
Going beyond metric-driven responses to surveys: evaluating uses of UKES to support students’ critical reflection on their learning gain


ABSTRACT

In a time of great institutional pressure and a neo-liberal agenda for more metrics-driven practice, it has become increasingly difficult for academic staff and students to set aside time for genuine engagement in reflective practice. While more and more feedback is being requested on teaching practice, the quality, validity or reliability of this feedback is not always apparent. This case study explores a project developed at a small-to-medium-sized institution, which aimed to provide an alternative rationale for lecturers to gain richer feedback about their teaching and the student experience. The UK Engagement Survey was used in an original way as a reflective tool, to increase engagement with the survey itself and thus enhance the quality of the data. This article outlines the interactive, workshop-based nature of our approach and the effects this has had on the nature of the survey data and the response rate.

Introduction

It is hardly an original claim that universities, as individual institutions, and the higher education sector in the UK more broadly have become increasingly metric-driven in recent years. It almost goes without saying that this has had numerous negative consequences for those of us concerned with educating and developing students. With more surveys, more feedback and more data, the issue is increasingly the quality of that data, who dictates the nature of its use, and to what end. While greater prominence for the student voice is valuable, there is great diversity within it and it is often left unproblematised (Freeman, 2016). The pre-eminence of the National Student Survey (NSS) is a significant driver of the move towards a metric-focused managerialism in higher education. One of the greatest dangers of this is the enshrining of ‘satisfaction’ as a meaningful or reasonable measure of what our students are doing in (or gaining from) higher education. Leaving the dominance of the NSS unchallenged risks deepening the hegemony of satisfaction as the ‘be all and end all’ of teaching and learning. It is somewhat concerning that such a large-scale data-collection exercise is technically divorced from standard expectations of rigorous analysis (Sabri, 2013). As such, alternative measures and approaches must be championed. Unfortunately, without a strong rationale to challenge this hegemony, alternatives struggle to find traction. This case study explores a project developed at a small-to-medium-sized institution, which aimed to provide such an alternative rationale for lecturers to better understand their teaching and the student experience: separate from the greater institutional pressure of surveys like the NSS, but still evidence-based. At the heart of this is an original approach to using the UK Engagement Survey (UKES) as a reflective tool. The purpose was to commit time to supporting academic staff and students to reflect together through the lens of the concepts featured in UKES. The rationale was that more time spent reflecting on these issues would increase the reliability and usefulness of the data to staff for enhancing their practice. It was also intended that responses would be more valid, as participants would be more engaged with and familiar with the concepts being measured. This article will outline the interactive, workshop-based nature of this approach and the effects it has had on the survey data, including the response rate.

Reframing UKES

The UK Engagement Survey (UKES) was developed by the Higher Education Academy (now AdvanceHE) and is considered ‘the only national survey in the UK focusing on students’ engagement with their studies’ (Buckley, 2014, p. 5). The UKES has its roots in the National Survey of Student Engagement (NSSE) in the United States. The NSSE has been running since 1999 and has grown to be used at over 1,600 colleges and universities across the US (NSSE, 2017). NSSE was developed as a way of addressing a gap in data for oversight of higher education, with a specific focus on learning and personal development, to provide institutions with a better understanding of the quality of their teaching and the capacity to make evidence-informed decisions to improve this (Kuh, 2001). The NSSE collects data in five different categories:

  • Student Behaviours

  • Institutional Actions and Requirements

  • Reactions to College

  • Student Background Information

  • Student Learning and Development

This focus on a mixture of factors, including differentiating between what is required of students by their institution and what students are actually doing, is what distinguishes the NSSE as a measure of engagement rather than satisfaction. Kandiko Howson and Buckley (2017) describe how, in developing the UKES, dissatisfaction with the NSS was a core motivation, and the effectiveness of the NSSE as a measure of engagement was something to which they aspired. Inspired by the NSSE, the UKES is a voluntary survey that Higher Education Institutions (HEIs) in the UK can opt into. While the survey is set up and managed by AdvanceHE, it is administered locally and the data collected are analysed by individual institutions (see Note 1). The survey mainly focuses on four core areas of academic engagement: critical thinking, collaborative learning, course challenge, and staff–student interaction. These particular areas of students’ engagement with their university experience were of key interest to the researchers in looking to foster reflective, critical practice in students. Kandiko Howson and Buckley (2017) state that a key purpose of the UKES was for it to be used as a tool for enhancement and not for comparison between institutions, and as such results were kept anonymous between different universities.

The University of Winchester participated in the UKES for the first time in 2015/16. The decision to participate came out of a general sense that we wanted to measure our students’ learning experience beyond satisfaction, instead building a picture of how our students engage with their studies and their own sense of developing knowledge and skills (and how we were supporting them to do so). With this in mind, the University approached the UKES as a tool for fostering students’ critical reflection and, as such, used it to create an environment for students to explore the learning gain they had experienced since entry into higher education, as advocated by Neves and Stoakes (2017). A key aim of this methodology for engaging students with UKES was to emphasise and highlight to students their changes in knowledge, skills, values and attributes, and to enable them to reflect on these areas to see what they have gained and what they feel they can work towards improving. It is important to reiterate that the use of UKES for learning gain in this study was to encourage and enable the students’ self-reflection, not to work towards metric-driven comparison of programmes.

A key part of this rationale was an attempt to increase the reliability and validity of the data collected, with findings from the survey being disseminated to, and discussed by, teaching staff and senior managers via central committees, as well as evaluated in institutional and programme-level survey reports. Furthermore, programme teams were encouraged to utilise data from the UKES alongside other data sources (such as the NSS), feeding it into annual programme reports and drawing upon it at programme committee meetings and in staff-student liaison committees. Given our institutional approach and that those staff who supported UKES were keen to use the data to make genuine improvements, it was paramount that the data collected were meaningful. Two key barriers to the validity and reliability of data from the UKES were identified which will be discussed in turn:

  • A low response rate to the survey

  • Low levels of pedagogic literacy among students

Response rate

In 2015/16 we engaged with the UKES using a standard survey approach. This involved promoting it across our campuses, via social media and in emails to students, as well as incentivising participation by offering students who took part the opportunity to enter a prize draw. We faced the challenge of trying to capture student feedback at the end of the academic year and only during our assessment weeks. This timing was decided in an effort not to run the survey at the same time as the NSS, amid concerns that final-year students might complete the UKES instead of the NSS. Our response rate was just 10%, which is considered a ‘low response’ by the HEA (HEA Surveys, 2016, p. 3) and in the student surveys literature, although Nair et al. (2008) argue that surveys with a 10% response rate may still be considered viable. We believe there were a number of important factors behind this low response rate. It is well understood in the survey literature that web surveys, such as the UKES, get lower response rates than other survey methods, with Porter and Umbach (2006), Nair et al. (2008) and Fan and Yan (2010), among others, all arguing that paper surveys typically get much higher response rates than web surveys due to the nature of the contact between survey participants and survey administrators (Porter & Umbach, 2006, p. 243). Given that all promotion of the UKES at the University was remote and involved no form of staff–student or student–student interaction, students were not necessarily given the opportunity to understand or discuss the purpose of the survey and, in turn, may not have appreciated the benefits of participating. As explored by Porter and Umbach (2006), prospective survey participants are understood to weigh up the costs and benefits of their participation and engagement with surveys. Understanding the survey to be beneficial to them (for example, the opportunity to win a prize), to be a meaningful experience (a useful tool for them in some capacity), or to be impactful in its outcomes (such as providing the opportunity to feed into or elicit change) are key drivers for increasing survey participation rates. AdvanceHE’s guidance on increasing UKES response rates is in line with the findings from the literature outlined above, suggesting that some form of personal interaction with students during the administering of the survey – such as personalised participation invitations – and incentives are important methods through which higher response rates can be achieved (2016, p. 4). Such drivers are well understood by both government and universities in the UK in their running and strategic backing of the NSS. While students tend to understand the benefits of participating in the NSS, with participation often incentivised by institutions and framed as an opportunity for students to ‘have a say’ in the future development of their programme and wider university processes and facilities, the UKES lacks the strategic backing needed to engage students on the same scale. It is, therefore, unable to counter the lower response rates experienced in the running of web surveys, with sector response rates generally hovering around the 13-14% mark; by contrast, the NSS does not publish results for institutions with a response rate below 50% (Office for Students, 2018).
With this critical context considered, we may reasonably conclude that the lack of strategic backing and the absence of contact time between staff and students during their participation in the UKES at the University of Winchester in 2015/16 were barriers to students recognising the benefits of their participation in the survey, in turn resulting in a low response rate.

In addition, the low response rate may in part have been a result of survey fatigue – what Porter et al. define as ‘one component of respondent burden’ (2004, p. 64). Given that we ran the UKES following the NSS and other annual institutional surveys, it is plausible that both our students and staff were suffering from survey fatigue and as such were less likely to participate. Furthermore, the fact that students were invited to participate in the survey during their assessment weeks may also have fed into a general sense of fatigue, as well as students’ disengagement from non-essential tasks. In line with the literature, it is plausible that the costs of completing the survey (namely time) outweighed the benefits of doing so for many of our students – particularly if the benefits of participating were not well understood by students in the first place, due to the lack of contact time involved in the administering of the survey.

Our low response rate in 2015/16 was a challenge that we needed to overcome. Low response rates – although considered viable by some – are inevitably less representative of the whole student body, and thus the data produced are less valid (Nair et al., 2008), in this case as a measure of student engagement and learning gain. In turn, lower response rates are also likely to result in biased data. As Porter and Umbach (2006) note in relation to student surveys, students who are already the most engaged in their university studies and/or who have a higher academic ability are almost certainly more likely to participate in surveys. This is also true for students from certain backgrounds or demographics, with an individual’s social environment and personal attributes understood to influence response rates and response bias (Porter & Umbach, 2006). For example, female students are more likely to participate in surveys than their male peers. It was therefore essential for us to grow our response rate to increase the validity of our data and in turn make it more meaningful for staff looking to make enhancements to their programmes at the University.

Given the low response rate in 2015/16, we changed our approach in 2016/17. The survey was open from February to the end of our assessment weeks in May, and we moved away from standard promotion, instead encouraging staff to give students time in class to discuss and complete the survey (with students properly briefed that doing so was optional). It may seem common sense that committing time and space for completing a survey would lead to a greater response rate. However, as the UKES was not an institutional priority, this relied on the goodwill of academic staff to allow students the time to complete the survey during their taught sessions. As such, making the case that the data would be valuable on a local level, and not for comparison purposes, was paramount for encouraging staff engagement. To do this, we aimed to utilise the survey as a critically evaluative and reflective tool for students (and tutors) by encouraging and supporting programme teams to take ownership of the survey and tailor students’ participation to best suit their academic development and progress. As noted above, we hoped this approach would ensure that the survey had purpose (that students would understand the benefits of participating) and that findings from the survey would be more insightful and could be utilised for learning and teaching enhancement rather than as a performance measurement.

Despite this shift in our approach, our response rate was still lower than the sector average (although up on the previous year, at around 12%). While some programme teams embraced this approach, building the survey into taught sessions and using this to shape reflexive activities for students, they were unfortunately in the minority. We found that without strategic backing and the staff resource for our learning and teaching team to run these kinds of sessions, programme teams found it challenging to engage with the survey in this way. In addition, some programme teams simply did not buy into the survey, seeing it as ‘just another survey’ and not aligning it with learning and teaching enhancement as we had hoped they would.

Pedagogic literacy

While we refer to it as pedagogic literacy, in a simpler sense this could be described as comprehension. We felt it was necessary to ensure that students reflected on the concepts being measured, as well as on what those concepts meant to them in the context of their own study. We also felt that it was important for students to be able to clarify any uncertainties they had with regard to these concepts. Core to the idea of increasing pedagogic literacy was clarifying misconceptions. McKenna (2010) identifies that students are often faced with an academic ‘code’ when encountering subject-specific terms or more generic academic language for the first time. This is equally true for specific teaching and learning terminology, and can also apply to staff.

The need to develop our students’ engagement with the survey’s themes and language was another central factor which led us to change our approach to the survey from 2016/17 onwards. Rather than relying on remote promotion, we aimed to give students the opportunity to work through the survey themes and concepts interactively and to contextualise them within their individual discipline areas in the class environment. We hoped that this would ensure that all students had a greater comprehension of the key concepts from the survey, in turn allowing them to understand the survey questions more thoroughly and thus increasing the reliability of the survey data. Similarly, the intent was that a deeper understanding of the concepts being explored in the survey would lead to more detailed responses to the open-ended, free-text questions that feature in the survey, thus enhancing their usefulness. Finally, in changing our approach to the survey we aimed to present it as something that was meaningful not only to students, but to teaching staff and the wider university too (to the extent that it was worth investing teaching time in it). By embedding the survey as a reflective class activity and aiming to make it part of the learning and teaching culture of the institution, it was hoped that more students would complete the survey and thus improve the validity of the findings.

Response

Despite the continued low response rate and lack of engagement (from both staff and students) in 2016/17, those staff who did engage with the UKES found it valuable as both a reflective tool for their students and as a tool from which they could make enhancements to their own practice. One lecturer commented:

‘What is particularly good about this process is the way it hails and engages students into the realm of their studies holistically. They responded not just with the sense of knowledge or skills attained but also justification as to why they were pursuing certain subjects … This is not about the tutor, though as a terrific side effect the tutor can find out some fantastic information and potentials for change.’

In line with the above feedback, and rather more anecdotally, we found that staff who engaged with the UKES responded well to it not being used to hold them to account, but rather to provide genuine insights into the student experience. Despite this tacit support from staff, those of us involved in administering the UKES have frequently found that teaching staff lack the time, opportunity and sometimes the skills to use it effectively. To address these issues, we developed and piloted a supportive and interactive, workshop-based approach to using the UKES as an enhancement tool in 2017/18. This involved a team of educational developers going into teaching sessions to work with students to unpack the core concepts underpinning the UKES and relate them to their own study and experiences. This had the following core aims:

  • Enable students’ critical reflection on their learning gain

  • Increase the pedagogic literacy of our staff and students

  • Increase the response rate of the survey

  • Enhance the quality of responses (e.g. more in-depth qualitative comments)

Reimagining UKES

The reimagining of how the UKES was conducted at the institution took the shape of a three-tiered workshop, in which the levels of time and input from the Learning and Teaching Development team (LTD) increased with each tier. These workshops were offered to all programme teams and required up to an hour of class time during the UKES survey period, between February and May. The three-tiered approach was an attempt to offer staff flexibility with the workshops, depending on what time they felt they could give or how much they wanted the students to reflect on their experience of the programme through the workshops. The LTD team were there to unpack and unpick the questions in a manner that enabled greater understanding of what each question was asking. It is important to note that we were not trying to ‘coach’ students to respond to the survey in particular ways. We safeguarded against this in two ways. Firstly, despite the workshop approach, the survey was still completed anonymously. Secondly, when asking students to think of examples relevant to their course, these were never critiqued for their effectiveness, so as not to prejudice responses. A key consideration for this methodology was providing context for the student participants, as students will have had differing levels of exposure to the types of critical thinking, reflection, connection-making and skills development with which these activities were designed to engage. This could be seen as a limitation of the study, but it also provided students who were perhaps unfamiliar with these types of activities with an opportunity to apply these skills in new contexts. For programmes that did not wish to engage with the survey via one of the approaches outlined below, the survey ran in the usual manner and all students were invited to participate via their university email. Programmes were invited to participate in the workshops, and we received interest from three programmes, as seen in Table 1.

Table 1. Table of programme engagement.

Data analysis

Response rates

Due to the nature of the data we were collecting, we cannot identify direct causal links between workshop participation and response rates. In fact, the response rate benefits arguably may not be related to the nature of the workshops themselves, but rather to the principle of having a set time and location within which students could respond to the survey. That being said, each cohort showed a change in response rate, as outlined in Table 2.

Table 2. Response rate by workshop type and cohort.

The only group with a declining response rate was Humanities 1. While the absolute number of respondents increased, the programme had grown in size, so this represented a smaller proportion of the wider population. For both of the other programmes with participating cohorts, there were marked increases in the response rate, with it nearly doubling in Humanities 2 and increasing by 37.7% in Social Science, a level considered high by AdvanceHE (2016). As stated above, these increases may not be due to the workshops directly, but to the convenience of completing the survey at a given time and place. This is still a useful finding, as many institutions engage in activities to increase response rates in institutional surveys, yet there is little publicly shared information about the success of such activities. This also speaks to the importance of engaging staff in order to gain access to a classroom setting, which may be a benefit of the nature of the workshops and the approach taken.
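As a brief illustration of how such figures can be read, the sketch below distinguishes a change in the absolute number of respondents from a change in the response rate relative to cohort size. The counts used are hypothetical, not the Table 2 data, and are chosen only to show how a cohort can gain respondents while its response rate falls.

```python
def response_rate(respondents: int, cohort_size: int) -> float:
    """Response rate as a percentage of the students invited."""
    return 100 * respondents / cohort_size

# Hypothetical counts for illustration only (not the actual Table 2 figures):
# the cohort grows faster than the number of respondents.
rate_before = response_rate(respondents=10, cohort_size=50)  # 20.0%
rate_after = response_rate(respondents=12, cohort_size=80)   # 15.0%

point_change = rate_after - rate_before                           # -5.0 percentage points
relative_change = 100 * (rate_after - rate_before) / rate_before  # -25.0% relative change

print(f"Respondents rose from 10 to 12, yet the rate fell from "
      f"{rate_before:.1f}% to {rate_after:.1f}% "
      f"({point_change:+.1f} points, {relative_change:+.1f}% relative).")
```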

Open-text comments

In order to explore the effectiveness of this new approach to the UKES, we undertook a comparative analysis of the open-text comments in the survey. This was intended as a proxy measure of increased engagement with the survey, increased reflection by students and increased pedagogic literacy. As such, the analysis compared the following criteria:

  • The direction or subject of the comments (i.e. were they about the students’ learning or the teaching they received)

  • Number of comments made (i.e. whether a greater number of comments were made by students after the intervention)

A limitation of this method is, of course, the subjective nature of the assessment being made. Example comparison quotes will be presented where possible to illustrate the judgements being made, and the extent to which these are representative of the wider cohort will be discussed.
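A minimal sketch of the kind of comparison involved is given below, assuming comments have already been hand-coded by subject (‘learning’ for reflection on the student’s own learning, ‘teaching’ for feedback on teaching received or satisfaction). The groups, categories and counts are invented for illustration and are not the study’s coding scheme or data.

```python
from collections import Counter

# Hypothetical hand-coded comments: (group, subject of comment).
coded_comments = [
    ("workshop", "learning"),
    ("workshop", "learning"),
    ("workshop", "teaching"),
    ("non-workshop", "teaching"),
    ("non-workshop", "teaching"),
]

counts = Counter(coded_comments)
for group in ("workshop", "non-workshop"):
    total = sum(n for (g, _), n in counts.items() if g == group)
    learning = counts[(group, "learning")]
    share = 100 * learning / total if total else 0.0
    print(f"{group}: {total} comments, "
          f"{learning} about the student's own learning ({share:.0f}%)")
```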

Overall, there was not an increase in the number of open-text comments made by students, although the researchers had expected that the number of comments would increase following the intervention activities. More important in terms of the pedagogic literacy of the students was the ‘quality’ of their responses. The open-text comments received from those students who participated were analysed to explore the use of more specific or ‘technical’ pedagogic language, or of reflections on their own learning. Typically, such comments have been vague or focused on specific issues (e.g. complaints about individual teaching staff). This section will discuss each participating cohort in turn and, where possible, compare them to survey respondents who did not participate in the interventions, drawing either on survey responses from previous years or on respondents from the wider student body on the programme.

Humanities 1

In line with the overall picture, free-text comments did not increase as a result of the workshop; in fact, only two of the students who participated (out of 13) gave comments, and between them they left only five comments (students have the option to leave comments at the end of every section of the survey). In terms of the quality of the responses, those who took part in the workshop tended to adopt a more reflective tone, whereas non-participants gave feedback reflecting their broad view of the quality of the programme or their own satisfaction. The following quotes are representative of this divide:

I felt that my work improved generally when I began the module as it encouraged layers of understanding. On reflection, my modules prior to that felt vague and I struggled to grasp the right method for assignments.

Level Six Workshop participant

My lecturers have been brilliant and I feel I have really expanded my knowledge in my field of study.

Level Four Student (non-participant)

The teaching, the content, and the way the lessons are delivered. All the staff are very friendly and helpful. Very pleased.

Level Five Student (non-participant)

While the survey respondents who participated in the workshop did not necessarily go into much more detail than the rest of the respondents, there is a clearly different tone. Specifically, this participant is explicitly reflective about both the course and their progress within it, showing clear development that is both driven by teaching and self-directed.

It is important to note that this was the only workshop where the module tutor was actively engaged in the workshop discussion around the survey questions. This likely influenced the students’ engagement with the survey and increased the levels of self-reflection, as the more abstract discussion was contextualised by the lecturer in the subject matter. A potential explanation, besides the effectiveness of the reflective workshop approach, is that these examples compare a Level Six student with students from lower levels, which may influence the nature of the responses.

Humanities 2

Diverging from the wider picture, the participants in this workshop group provided more open-text comments than those students who responded to the survey but did not attend. For the cohort generally, the Level Four students made the most comments. While the Level Five workshop group provided more free-text comments than the non-workshop Level Six group, they had slightly fewer than the non-workshop Level Four group. Similar to the Humanities 1 cohort, the comments from students who did not participate in the workshop were more focused on feedback and satisfaction. However, there was less consistency in the type of comments from those who did participate: a mixture of more reflective responses and comments that, like those of the wider cohort, focused on general feedback and satisfaction, as these quotes show:

Some modules in particular (depending on the different teaching styles of the lecturers) led me to questions & philosophical thoughts of my own. I like that my course is not always just relevant to itself, but also to life & my personal development.

Level Five Workshop participant 1

Good standard of teaching.

Level Five Workshop participant 2

I am really enjoying the academics, some modules better than others and overall I am happy with University.

Level Four Student (non-participant)

An important difference between these workshops and those for the Humanities 1 cohort was the involvement of staff members. In this case, the module tutor was completely absent from the workshop and asked us to run it in place of teaching. This may have meant that the students did not have the same support and guidance to be able to reflect on and evaluate their development and progress within their own context. Additionally, as these were Level Five students, this may also have contributed to their responses differing from those of the Humanities 1 workshop students. However, it warrants further investigation that no responses from outside the workshop group contained an element of reflection.

Social science

Whereas the two Humanities examples involved an intervention in a single workshop for one year group, the Social Science programme ran workshops across all three levels. As might be expected, this led to an increase across the board in the number of free-text comments made. However, each level had a different workshop approach, which potentially led to some differences in the nature of the responses. Notably, none of the Level Six students left any open-text responses. This group only participated in the short, Tier 1 workshop, which does not unpack the questions and potentially does not give enough time for meaningful engagement. As this largely involved the survey team making a brief appearance in a lecture, it may have had a negative effect, particularly with a group towards the end of their studies who would already have completed the NSS. Unlike the two Humanities workshop cohorts, the responses by Level Four and Level Five students tended to be more about the quality of the course and their satisfaction:

The quality of lectures could be better and more informative.

Level Four Workshop participant

Every module has been great fun, enjoyable; informative and every lecturer great as well

Level Five Workshop participant

We are not suggesting that the only useful comments are reflective ones, but increasing the extent of students’ reflection on their learning was a key aspect of this study. Based on these responses, this would not appear to have been effective, and the responses still tend towards a basis in satisfaction. One possible reason is that the discourse surrounding the higher education sector is so deeply steeped in the language of satisfaction that students are immersed in this dialogue far more frequently than in pedagogic reflection, and so feel more comfortable aligning their experience of higher education to this frame of reference. While the staff members did not actively participate in the workshops at any level in Social Science, they were present to introduce the activities and occasionally contributed when relevant.

Reflections for the future

As the case studies above suggest, the UKES workshops have had a mixed reception amongst students and mixed results. It is important to note here that the reflections on the effectiveness of these workshops come from the researchers implementing the methodology and are thus from their perspective. From the LTD team’s observations, there were multiple factors that influenced the degree to which the workshops were successful. Firstly, the module tutor’s engagement and input in the workshop proved to be essential, as it had a visible impact on the students’ engagement and interest in the content. If the tutor seemed disinterested or appeared as though they had other (better) things to do while the team ran the workshop, the students saw this as a reflection on the UKES workshop and followed the tutor’s example. Where the tutor showed interest and attentiveness to the session, the students emulated this behaviour and engaged with the workshop. The most significant positive impact on students’ engagement came where the tutor was wholly engaged with the workshop and co-ran the activities, contextualising each stage with the content of their course. In future iterations of these workshops, the team will ask tutors to engage with the workshop if possible, and if not, the team will ask for course/module content before running the workshop in order to contextualise the activities and integrate them more effectively with what the students have been learning.

The activities were designed to help students decode the language of, and engage more deeply with, the UK Engagement Survey. For future workshops, the team has observed that it would be more effective if each session were tailored specifically to the cohort. This would involve exploring module content, as previously discussed, and adjusting the length and number of activities to suit the individual cohort. There were points at which the activities seemed too long for certain groups yet worked well with others. We would mitigate this through a more in-depth discussion with the module tutor prior to the workshop, to gauge the atmosphere of the cohort and see which activities would be better suited to them and for how long. Integrating more of the module content going forward will help to contextualise the workshop within students’ learning. However, the workshops will retain at their core the use of the UKES as the guiding set of themes for students to reflect on their studies. This periscope effect of contextualising the workshop in the module content whilst thinking more broadly about the whole learning experience on the programme seemed a favourable approach, as one tutor noted: ‘Even with students who are working with a high degree of critical engagement there is sometimes an understandable reluctance to see how such ideas are linked or have effect on their own selves. What is particularly good about this process is the ways it hails and engages students in the realm of their studies holistically.’ As the UKES is not specific to disciplines, the questions engage holistically with the students’ learning, which provides time for reflection on the broader dimensions of their programme experience, making connections between modules that might otherwise have appeared to be discrete activities with no overarching connection. We intend to keep this sense of a holistic engagement with the programme, but equally to find ways to contextualise the workshop within the module, as sometimes it is difficult to see the programmatic design wood for all of the modular trees.

The free-text comments did not dramatically increase, as had been our aim, and perhaps this is because the students required more space or time to fill these in. By shortening the activities and tailoring them to the cohort, we would be able to provide more time for the students to fill out the survey and the free-text comments. The quality of the comments made did not necessarily indicate an increase in pedagogic literacy; however, as one tutor commented, ‘[the workshops] made space for discussions and for students to have differing views on types of modules, assessments and their participation in them in a language that they could feel connected to. I.e. not “pedagogy speak” not the language of assessment literacy.’ Whilst the students may not necessarily have adopted the ‘pedagogy speak’, or shown a directly correlative effect in their responses, the tutors highlighted the value of the workshops as a translation activity, one that engaged students with the language in an accessible format. This tutor felt the workshops were particularly useful for their cohort, stating, ‘I would also like all staff to engage with it as a matter of course. To have to actively listen to students is a rare opportunity and should be encouraged’, which emphasises the key role the UKES can play as a student-centred reflective learning tool. It suggests the UKES questions provided an opportunity to listen to students in a more effective way than what the same feedback later called the ‘blunt tools’ that do not pick up on the rich nuances of discussion facilitated by the workshops.

Finally, timing is always a consideration when conducting any feedback or engagement activity in higher education, and it is a key consideration for the future of the UKES workshops. The timing could have meant that students taking the UKES had recently filled out another institutional survey, be that the NSS or a module feedback form. As previously discussed, the timing of when the University runs the UKES has changed since its inception. The survey is open to students from February until May, but this has also meant the workshops were equally open for module tutors to book in throughout this period, giving us no control over when other surveys might have been conducted with the cohorts. If a UKES workshop was conducted close to another survey, the students could be less responsive or engaged through no fault of the team or tutor, but due to survey fatigue or an attitude that their feedback had already been given. Beyond other surveys, the timing of the workshop could have conflicted with an approaching assessment deadline, leaving students impatient with the workshop encroaching on their course content and class time. Again, as the timing of the workshop was at the tutor’s discretion, we were unaware of this being an issue, but it is certainly something to avoid through discussion in the future.

Conclusion

In an attempt to tackle the low response rate for the UK Engagement Survey, the University of Winchester developed a three-tiered approach to deepen understanding of, and engagement with, the survey. This was based on the belief that the UKES provided a tool for critical reflection and pedagogical enhancement rather than serving a neo-liberal agenda of performance measurement. The UKES workshops have proven to be an effective model for engaging students with the survey, but the experience has highlighted that the tutor’s engagement with the session is vital to this success. Where tutors have been engaged in the process, we have seen increased student participation in UKES; where tutors have shown detachment and disinterest during the session, we have seen the opposite. The qualitative data around effectiveness are inconclusive but suggest the need for further analysis in future, which will also need to account more fully for students’ level of study when comparing the reflective nature of their responses. Engaging directly with staff and students did seem to influence the response rate, but we cannot show any causal links here. As the UK Engagement Survey provides an alternative approach to understanding and improving students’ teaching and learning experience, one that remains within an empirical framework but without the institutional pressure, the team will continue to adapt the workshops in an attempt to improve staff and student engagement with it.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. Although institutional data are analysed by the participating institutions themselves, AdvanceHE do collect national data from the UKES and offer participating HEIs benchmarking reports. In addition, AdvanceHE report on national data in the annual UKES report.

References