Engineering Education
a Journal of the Higher Education Academy
Volume 6, 2011 - Issue 1
Original Articles

The meaning of prompt feedback and other student perceptions of feedback: should National Student Survey scores be taken at face value?

Pages 31-39 | Published online: 15 Dec 2015

Abstract

Regardless of the institution or discipline, the UK’s National Student Survey (NSS) has consistently highlighted that the level of student satisfaction about the feedback they receive is notably lower than for other aspects of their learning experience. This study explored how students understand concepts and practices rated through NSS questions evaluating feedback practices in higher education. Drawing on questionnaires completed by first, third and fourth year Chemical Engineering students, the study calls into particular question the reliability of NSS data on promptness of feedback. In conclusion, it calls for greater attention to be paid at institutional level to the identification and management of students’ perceptions and expectations of the process, content and outcomes of feedback.

Introduction

Improving the quality of the student learning experience is a key issue in higher education and it has been widely recognised that feedback on assignments can contribute to this (Kolb, 1982; Brockbank and McGill, 1998; Ramsden, 2003; Irons, 2007; Norton, 2007). Ramsden (2003: p.187), for example, highlights that ‘It is impossible to overstate the role of effective comments on students’ progress in any discussion of effective teaching and assessment’. Yet in the National Student Survey (NSS) conducted by the Higher Education Funding Council for England (2005–2009), the questions asking students to rate the process, content and outcomes of feedback have consistently received the lowest scores when compared with other aspects of their higher education experience. The NSS asks final year undergraduate students to respond on a five-point scale, running from ‘definitely agree’ to ‘definitely disagree’, to the following statements about feedback:

  1. Feedback on my work has been prompt

  2. I have received detailed comments on my work

  3. Feedback on my work has helped me clarify things I did not understand.

Whilst the NSS undoubtedly plays its part in measuring institutional performance and student satisfaction (Ramsden et al., 2010), further investigative research into the design of the survey itself is still needed. One notable exception (Yorke, 2009) considers the use (or non-use) of negatively stated items in the survey and highlights the importance of cultural differences in the interpretation of such items. Students’ cultural and linguistic backgrounds have also been shown to influence their responses to a teaching evaluation questionnaire (Al-Issa and Sulieman, 2007). Since UK universities attract students from a range of cultures and nationalities, some items in the NSS may be interpreted differently by these groups. Bearing this in mind, this study explored Chemical Engineering students’ perceptions of feedback and aimed to identify whether NSS scores on feedback may be influenced by non-instructional factors.

Research methods

The study sample comprised first, third and fourth year undergraduate Chemical Engineering students. A considerable number of third year and all fourth year Chemical Engineering students are final year degree students who will be eligible to take part in the 2011 NSS survey. The first year students were included in order to investigate the possibility of changes in perception during the course of their degree programmes.

In order to understand students’ perception of feedback, a questionnaire was designed and circulated at the beginning of the first semester in the academic years 2009/10 and 2010/11. The questionnaire was delivered on paper and was completed by 74 first year students (from a cohort of 93 in 2009/10), 88 third years (from a cohort of 112 students in 2010/11) and 44 fourth years (from a cohort of 83 students in 2010/11).

The questionnaire was anonymous and sought both quantitative and qualitative data, following the lead of previous research on students’ perception of feedback (Weaver, 2006; Burke, 2009). This strategy made it possible to obtain students’ points of view in a non-controlling and open way as well as to collect some data that lent itself to statistical analysis (Patton, 2002). The students were asked to identify their origin (home or overseas, referred to here as international students), gender and age (≤20 or ≥21) and to provide anonymous information about their perception of feedback. Third and fourth year students were also asked to identify whether they were direct entry into Year 2 or Year 3; this admission route is open to certain overseas students whose earlier studies at overseas partner universities are accredited as prior learning. It was considered possible that such students would have a different attitude to feedback from international students who had entered in the first year.

Great care was taken to relate the questionnaire to the research aim and to ask the minimum number of questions necessary so that students did not feel overwhelmed. Five questions were designed in order to capture students’ perceptions of five key aspects of feedback. These are shown in Table 1.

For the multiple choice question on preferred method of receiving feedback, students were encouraged to tick as many options as they wished and to explain their choice/choices in a free text response box. No verbal explanations were provided to clarify any terms in the questionnaire. Responses to the four open questions were read closely and common themes were drawn out to build a coding frame. These codes then provided the basis for the analysis of the open question data. The closed question was analysed using descriptive statistics.
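The coding-and-tallying procedure described above can be sketched in a few lines of Python. This is purely illustrative: the code labels and responses below are hypothetical, not the study’s actual coding frame or data.

```python
from collections import Counter

# Hypothetical coded responses to one open question; each respondent's
# answer may carry one or more codes drawn from the coding frame.
coded_responses = [
    ["identify_strengths_weaknesses"],
    ["guidance_to_improve", "identify_strengths_weaknesses"],
    ["guidance_to_improve"],
    ["no_answer"],
]

# Descriptive statistics: how often each code occurs, expressed as a
# percentage of all respondents (codes are not mutually exclusive, so
# the percentages need not sum to 100).
counts = Counter(code for resp in coded_responses for code in resp)
n = len(coded_responses)
percentages = {code: 100 * c / n for code, c in counts.items()}
print(percentages)
```

The same tally, run per cohort and per student origin, would yield the kind of percentage breakdowns reported in the results below.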

Results

The anonymous student feedback questionnaire indicated that of the 74 first year responding students, 55 were home (UK) students and 19 were international students (see Table 2). Students’ origin (home or overseas) was self-declared. The third year cohort comprised 45 home and 43 international students, including 2 Year 2 and 26 Year 3 direct entry students. The fourth year cohort consisted of 15 home and 29 international students, including 6 Year 2 and 15 Year 3 direct entry students.

As can be seen from the breakdown in Table 2, all the cohorts contained more male than female students. Similar distributions are observed within the institution across other engineering and physical sciences programmes, as opposed to a female majority in other disciplines.

The first step in understanding students’ views of feedback was to ask them for their understanding of the term ‘feedback’. The general consensus was that feedback should help them to gain insight into the strengths and weaknesses of their work. Between a third and a half of the students (45% of first years, 43% of third years and 37% of fourth years) went on to say that feedback should provide them with guidance on how to improve their performance. Students were then asked to describe why feedback is important to them. Responses to this question were similar to those given to the previous question (i.e. that feedback should help them identify their strengths and weaknesses), with the exception that a higher percentage (78% of first years, 88% of third years and 77% of fourth years) stated that feedback offers them opportunities to improve.

Table 1 Questions used in the students’ questionnaire covering five specific aspects of feedback

Table 2 Number of students by year, origin and gender

Following the general agreement amongst the students on the definition and importance of feedback, the question ‘What, in your opinion, constitutes prompt feedback?’ generated a high number of non-responses (19% of first years, 24% of third years and 19% of fourth years) and a wider range of answers. These answers were placed in different categories and then ranked for each year of study (see Figures 1, 2 and 3). Although a significant number of students specified a numeric time frame for prompt feedback (38% of first years, 52% of third years and 27% of fourth years), others (15% of first years, 7% of third years and 23% of fourth years) identified prompt feedback as ‘feedback that is given quickly’ without further specifying a time period. Some students also stated that prompt feedback should be given in a time period in which they could still remember the work, while a few others said that the size of the coursework and cohort should be taken into consideration. Of the first year students that gave a time frame for prompt feedback (38%), 15% would prefer to receive feedback within 1–3 days of submitting the assignment, 21% within one week and only one student out of 74 was prepared to receive feedback within one month. Half of the third year students also gave a time frame for prompt feedback, with 52% of those expecting feedback within two weeks. The responses from fourth year students were more broadly distributed, but of those that provided a time frame for prompt feedback (27%), 50% would prefer to receive feedback between 1–3 days and 2 weeks.

A considerable number (21%) of first, third and fourth year students (mainly international) appeared not to understand the temporal nature of the adjective ‘prompt’ in the question: ‘what, in your opinion, constitutes prompt feedback?’ Instead of relating it to the timeliness of feedback, these respondents associated it with either the mode of feedback (e.g. by email) or its quality. For example, one international student wrote that prompt feedback ‘is when a person writes a long feedback’.

Responses to this question show that expectations differ regarding the timescale of ‘prompt’ feedback (58%) and that the adjective ‘prompt’ can be interpreted in different ways (21%). Furthermore, 21% of the students did not attempt to answer this question. Of the 42% of students that either did not answer the question or did not understand the word ‘prompt’ to mean the timeliness of feedback, 71% were international students (13 out of 19 first year international students, 33 out of 43 third year international students and 16 out of 29 fourth year international students) and 29% were home students (21 out of 55 first year home students, one out of 45 third year home students and three out of 15 fourth year home students). Considering only the first years, a significant number of home students did not answer the question or related the word ‘prompt’ to either the mode or quality of feedback, although very few had this problem by the third year. Considering third and fourth years only, 40% of students fell into this group, and the large majority of them (92%) were international students, most of whom were direct entry students.
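As a quick arithmetic check, the reported 71%/29% split between international and home students can be reproduced directly from the per-cohort counts given above (a sketch using only those figures):

```python
# Counts, taken from the text, of students who either did not answer the
# 'prompt feedback' question or did not read 'prompt' as temporal.
international = 13 + 33 + 16   # first, third and fourth year international
home = 21 + 1 + 3              # first, third and fourth year home

total = international + home   # 87 students overall

print(round(100 * international / total))  # 71 (% international)
print(round(100 * home / total))           # 29 (% home)
```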

Figure 1 Categories identified based on 74 responses from first year Chemical Engineering students to the question ‘What, in your opinion, constitutes prompt feedback?’ and their associated distributions among home and international students

Figure 2 Categories identified based on 88 responses from third year Chemical Engineering students to the question ‘What, in your opinion, constitutes prompt feedback?’ and their associated distributions among home and international students

Figure 3 Categories identified based on 44 responses from fourth year Chemical Engineering students to the question ‘What, in your opinion, constitutes prompt feedback?’ and their associated distributions among home and international students

The year of entry of the international students appears to have an effect on the percentages mentioned above. For instance, in the third year cohort lack of understanding was found in 84% of international students that entered into the first year but increased to 100% for direct entry Year 2 students and 88% for direct entry Year 3 students. A more pronounced trend was observed for the fourth year cohort: 38% of the international students that entered into the first year, 67% of the direct entry Year 2 students and 60% of the direct entry Year 3 students showed a lack of understanding of the meaning of ‘prompt’. The implications of this finding are explored in the discussion section of this paper.

Interestingly, these results were not affected by gender; the percentages of male and female across the different years of study that appeared not to understand the meaning of ‘prompt’ were as follows:

  • first year — 46% of the male cohort and 45% of the female cohort

  • third year — 40% of the male cohort and 30% of the female cohort

  • fourth year — 43% of the male cohort and 44% of the female cohort.

The research also considered how students prefer to receive feedback. As discussed in the methods section of this paper, the question asking ‘How do you prefer feedback?’ required students to choose one or more responses from five possible options. The questionnaire also asked students to give the reasons for their response. The responses for first, third and fourth year students are summarised in Figures 4, 5 and 6, respectively. The results clearly reveal that both UK and international students prefer individual rather than class feedback. Students that chose class oral feedback were also interested in having individual oral and/or written feedback. One respondent clarified this choice by stating that: ‘Individual written feedback helps me to work through each improvement step by step and class oral feedback may highlight key points others have included which I haven’t’.

Of the students stating a preference for individual feedback only (77%), 20% had a preference for a combination of individual written and oral feedback, 14% for individual oral feedback and 43% for individual written feedback. The percentages are evenly distributed between male and female (i.e. 76% male and 79% female preferred individual feedback only). In the first year cohort a total of 89% of the male cohort preferred individual feedback only, comparable with the percentage of females (80%) stating the same preference. Overall, these percentages are lower for the third year cohort (65% male and 68% female) but increase again in the fourth year (75% male and 94% female).

Reasons provided for choosing individual written feedback included: ‘I prefer for the feedback to be individualised and to be able to refer back to it later on’. Students that just opted for individual oral feedback stated that ‘words expressed orally to you tend to become more useful’ and ‘[I] like to ask questions about feedback on work so prefer to do it face to face’. Students that stated a preference for both individual oral and written feedback mentioned the benefits of having a written record to return to: ‘I feel that sometimes oral feedback is too much to take in all at once, and would prefer both oral and written feedback so that I can refer back to the written feedback later on’.

The questionnaire also contained an open question asking students what they want from feedback. Most of the responses from both UK and international students reinforce the ideas generated by responses to the question about the importance of feedback (i.e. that students are expecting identification of strengths and weaknesses and ways to improve subsequent pieces of submitted work). A significant number of students (38% of first years, 24% of third years and 16% of fourth years) would like to see the inclusion of positive comments and encouragement as well as information on ‘key points I have not covered properly’ and ‘constructive criticism [of] where [I am] going wrong’. Some also asserted the need for detailed feedback.

Figure 4 Responses of 74 first year Chemical Engineering students regarding feedback preferences and their associated distributions among home and international students

Figure 5 Responses of 88 third year Chemical Engineering students regarding feedback preferences and their associated distributions among home and international students

Figure 6 Responses of 44 fourth year Chemical Engineering students regarding feedback preferences and their associated distributions among home and international students

Discussion

This study examined students’ perceptions of feedback in order to determine whether NSS scores may be biased by non-instructional factors. The results show that the majority of the third and fourth year student cohorts do have a clear idea of the meaning of feedback and how important it can be in helping them to improve subsequent work. These results are corroborated by the existing literature on feedback (Higgins et al., 2002; Weaver, 2006; Brown, 2007; Rae and Cochrane, 2008; Burke, 2009), which highlights the importance of academic staff providing clear, meaningful and constructive feedback so that students are able both to make sense of it and, perhaps more importantly, to use it effectively. More specifically, Weaver (2006) argues that, while feedback is valued by students, it is often perceived as not containing enough information to guide or to motivate them. Moreover, Weaver highlights the pitfalls of using academic discourse in feedback, leading to ‘insufficient understanding’ for students to ‘interpret comments accurately’ (p. 391) (see also Brown, 2007).

Perhaps the most surprising finding is how students perceive the meaning of the adjective ‘prompt’ in the question: what, in your opinion, constitutes prompt feedback? Results indicate that, while 58% of our sample expected ‘quick’ feedback or feedback provided within at most two weeks, almost half of the third and fourth year students either did not answer this question or provided an answer that showed a misunderstanding of the temporal nature of the adjective ‘prompt’. This lack of understanding may be related to the student’s origin (home or international) and entry route (direct entry to a later year of the programme or progression through the programme from the first year at the institution). In short, over half of the third and fourth year cohorts were international students, and most of the non-responses or responses relating prompt feedback to the quality of feedback came from these students, in particular direct entry students. Interestingly, although a considerable number of first year UK students also appear not to have a full understanding of the meaning of ‘prompt’, this appears to reduce over the programme of study, with misunderstanding being less common amongst third and fourth year UK students. As UK universities attract a significant number of international students, these results call into question the validity of particular student groups’ ratings of feedback promptness in the annual National Student Survey. In the university where this study was conducted, 34% of the engineering and physical science students eligible to take part in the 2011 NSS are international and 66% are home students. For the university as a whole, the proportion of eligible international students is lower (8% international and 92% home).
If a particular cohort states low satisfaction in relation to the NSS question ‘Feedback on my work has been prompt’, our analysis suggests that further investigation of the meaning and understanding of the adjective ‘prompt’ amongst the cohort and its constituent groups may be advisable before other actions are taken.

In agreement with Higgins et al. (2002), the results clearly reveal that both UK and international students prefer individual rather than class feedback. Results also indicate that one-off oral feedback may be difficult for some students to process, and that technological innovations such as podcasting (see Ribchester et al., 2007) and the growing availability of open source software for capturing audio feedback (which students can download and replay) may provide new opportunities for meeting this need. Although there is no single type of feedback that will meet all students’ specific needs, there is strong evidence from students in the sample that personalised feedback is preferred, supporting the findings of previous research (see Brown, 2007).

Some students clearly identified their need for detailed feedback and a balance between positive and negative comments. These comments also clearly support the findings of other studies which identify praise, encouragement and other comments designed to motivate the student as aspects of feedback that can positively influence their confidence and engagement (Lizzio and Wilson, 2008; Walker, 2009). Yet researchers such as Butler (1987), Kluger and DeNisi (1996) and Brookhart (2007) call for caution in the use of ‘too general’ praise, identifying it as a factor that can distract attention from the task and consequently undermine learning. Finally, given the predominance of males in our sample, it is worth mentioning that our analysis shows that gender had no significant influence on the responses given to the five questions.

Conclusions

This study contributes to a better understanding of the way in which Chemical Engineering students understand feedback. In support of previous research findings, there is a general consensus amongst students that feedback should inform them of their strengths and weaknesses and help them to identify what and how to improve. Students overwhelmingly asserted a desire for individual feedback, with some expressing the need for detailed feedback and positive comments. However, perhaps of most relevance to this study is our finding that students expressed varying opinions on the meaning of ‘prompt feedback’. While we cannot at this stage ascertain how far our findings directly relate to the scores on the NSS question ‘Feedback on my work has been prompt’, this study certainly suggests that a link can potentially be made. Moreover, it highlights some of the benefits of pursuing this area of research further, not least the benefit of ensuring that students are being asked evaluation questions that they fully understand and, in turn, that evaluation data are valid. This study has revealed some important findings which can provide useful guidelines by which institutions can address and improve aspects of feedback in higher education. Future research should investigate whether these findings relate to student populations in other disciplinary areas and, more broadly, to other key subject matters on which students are surveyed. Furthermore, it is our intention to continue to study the group surveyed as first year students in order to monitor changes in perception over the course of their programme of study.

Acknowledgements

We are very grateful to the following colleagues who facilitated engagement with the various student groups: Regina Santos, Phil Robbins and Andrzej Pacek. Thank you also to the three anonymous referees who provided extremely helpful feedback on an earlier version of this paper.

References

  • Al-Issa, A. and Sulieman, H. (2007) Student evaluations of teaching: perceptions and biasing factors. Quality Assurance in Education, 15 (3), 302-317.
  • Brockbank, A. and McGill, I. (1998) Facilitating reflective learning in higher education. Berkshire: Society for Research into Higher Education and the Open University.
  • Brookhart, S. M. (2007) Feedback that fits. Educational Leadership, 65 (4), 54-59.
  • Brown, J. (2007) Feedback: the student perspective. Research in Post-Compulsory Education, 12 (1), 33-51.
  • Burke, D. (2009) Strategies for using feedback students bring to higher education. Assessment & Evaluation in Higher Education, 34 (1), 41-50.
  • Butler, R. (1987) Task-involving and ego-involving properties of evaluation: effects of different feedback conditions on motivational perceptions, interest, and performance. Journal of Educational Psychology, 79 (4), 474-482.
  • Higgins, R., Hartley, P. and Skelton, A. (2002) The conscientious consumer: reconsidering the role of assessment feedback in student learning. Studies in Higher Education, 27 (1), 53-64.
  • Irons, A. (2007) Enhancing learning through formative assessment and feedback. New York: Routledge.
  • Kluger, A. N. and DeNisi, A. (1996) The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119 (2), 254-284.
  • Kolb, D. A. (1982) Experiential learning: experience as the source of learning and development. 1st edition. New Jersey: Prentice Hall.
  • Lizzio, A. and Wilson, K. (2008) Feedback on assessment: students’ perceptions of quality and effectiveness. Assessment & Evaluation in Higher Education, 33 (3), 263-275.
  • National Student Survey (2009) National student survey. Available from http://www.thestudentsurvey.com/ [accessed 27 April, 2009].
  • Norton, L. (2007) Using assessment to promote quality learning in higher education. In: Campbell, A. and Norton, L. (eds.) Learning, teaching and assessing in higher education. Exeter: Learning Matters Ltd, 92-101.
  • Patton, M. Q. (2002) Qualitative research and evaluation methods. 3rd edition. London: Sage.
  • Rae, A. M. and Cochrane, D. K. (2008) Listening to students: how to make written assessment feedback useful. Active Learning in Higher Education, 9 (3), 217-230.
  • Ramsden, P. (2003) Learning to teach in higher education. 2nd edition. London: RoutledgeFalmer.
  • Ramsden, P., Batchelor, D., Temple, P. and Watson, D. (2010) Enhancing and developing the National Student Survey: report to HEFCE by the Centre for Higher Education Studies at the Institute of Education. London: HEFCE.
  • Ribchester, C., France, D. and Wheeler, A. (2007) Podcasting: a tool for enhancing assessment feedback? Education in a Changing Environment. 4th International Conference, 12–14 September 2007, Salford, UK.
  • Walker, M. (2009) An investigation into written comments on assignments: do students find them usable? Assessment & Evaluation in Higher Education, 34 (1), 67-78.
  • Weaver, M. R. (2006) Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31 (3), 379-394.
  • Yorke, M. (2009) ‘Student experience’ surveys: some methodological considerations and an empirical investigation. Assessment & Evaluation in Higher Education, 34 (6), 721-739.
