Research Article

‘I finally understand my mistakes’ – the benefits of screencast feedback

Pages 43-55 | Received 26 May 2021, Accepted 03 Apr 2023, Published online: 20 Sep 2023

ABSTRACT

This study explores the impact of screencast feedback on maths tests for junior and high school students. While prior research emphasises the influence of feedback on learning, its effectiveness varies with type and delivery. Although studies in higher education observe improved precision and emotional connections through screencast feedback, its applicability in secondary education remains understudied. The authors surveyed 99 students, collecting responses via an 11-item questionnaire after the students had received feedback. Through thematic analysis, they found 72% favoured video feedback due to its clarity, depth and personal touch. Conversely, 17% preferred written feedback for efficiency. These findings underscore the benefits of screencasting feedback, highlighting its comprehensibility and individualised nature. Given the rise of digital learning, educators are encouraged to adopt screencasting as a valuable tool for enhancing feedback in academic settings.

Introduction

In a sociocultural learning tradition, much of the learning occurs through dialogue, collaboration and scaffolding with peers and/or teachers (Sawyer, Citation2016). Learning benefits immensely from strategically regulated repetition and practice, especially when it is accompanied by reliable and timely feedback (Nathan & Sawyer, Citation2016). However, the goal of feedback should not only be to assess the student, but to ensure that the student learns from the feedback (Hattie, Citation2015; Wiliam, Citation2011). Among all factors that can influence students’ learning, feedback is one of the most powerful influences on learning and achievement (Hattie & Timperley, Citation2007). Yet, providing thorough, constructive feedback is time-consuming for teachers. Providing only grades takes far less time than providing thorough written comments, but considerable evidence has indicated that written comments are more effective (Black & Wiliam, Citation1998; Crooks, Citation1988; Hattie & Clarke, Citation2019). Still, providing feedback in written form may also be challenging, as teachers must ensure that they are understood correctly. Various authors have suggested that feedback should be a dialogue between the student and the teacher to prevent misunderstandings and miscommunication (Hyland, Citation1998; Killingback et al., Citation2019, Citation2020; Kinchin, Citation2003; Skidmore, Citation2006). The drawback of this approach lies in the fact that such dialogue may take up time otherwise spent on classroom activities. Screencasts offer a potential solution for efficiently providing students with individual quality feedback. A screencast is a digital video in which the setting is partly or wholly a computer screen, with audio narration that describes the on-screen action. In a feedback situation, the on-screen action includes the text provided by the student, and the audio is the teacher’s verbal response to the text.
The teacher may also include screen actions from other programs, such as Geogebra in mathematics, or programs relevant to other subjects. Providing screencast feedback is also time-consuming, but it does not necessarily take much more time than providing written feedback (Denton, Citation2014; Edwards et al., Citation2012; Zhai et al., Citation2002). Screencast feedback may be used in all modes of delivery, such as eLearning, blended learning, or physical face-to-face teaching, and in all taught subjects.

Screencast feedback has mainly been used and studied in higher education contexts (e.g. Borup et al., Citation2015; Haxton & McGarvey, Citation2011; Lowenthal & Dunlap, Citation2018; Mahoney et al., Citation2019; Mathisen, Citation2012; O’Malley, Citation2011; Robinson et al., Citation2015). These studies have demonstrated that such feedback strengthens the student–teacher relationship and provides a promising alternative to written feedback because it allows the teacher to ‘show and tell’ on the screencast while giving rich feedback in the form of relevant information and personal communication. Other studies suggest, similarly, that those providing screencast feedback change their feedback by paying more attention to detailed comments instead of brief suggestions and corrective comments (Mahoney et al., Citation2019). Mahoney et al. (Citation2019) refer to several studies that have found that video feedback addresses more positive aspects of students’ work (Lamey, Citation2015; Parton et al., Citation2010; Thomas et al., Citation2017). These are findings from research in higher education which present positive implications for students’ learning. Such positive influence would be valuable for students in secondary education too. Recently, screencasting has become more relevant and present in secondary education (Lowenthal et al., Citation2020). Yet, little is known about how students at this level perceive this form of feedback. Including students’ perceptions of teaching practices and teaching quality is important, as students are the main stakeholders within education. Moreover, students’ perceptions contribute to their educational experience and learning process (Shuell, Citation1993) and influence their learning outcomes (Allen & Fraser, Citation2007; Fauth et al., Citation2014). Making these perceptions heard enables the co-construction of teaching and may support teachers in their professional development (Mayes et al., Citation2020). 
Furthermore, it may help teachers to align their teaching to students’ preferences (Bakx et al., Citation2015). Therefore, we set out to investigate the following research question: How do students in secondary education experience screencast feedback?

We investigate this question within the context of a mathematics test through the responses of 99 students from four schools. All students learned mathematics by being physically present in class. In the following, we first elaborate on the concept of feedback and explore research on screencasting. Then, we provide a description of the methods used before presenting and discussing our findings.

Theoretical background

Feedback

Although feedback has a significant influence on learning, the type of feedback and the way it is communicated can be differentially effective (Frymier & Houser, Citation2009; Hattie & Clarke, Citation2019; Hattie & Timperley, Citation2007). When given incorrectly, feedback is not very effective and may erode someone’s self-worth (Hattie & Timperley, Citation2007; Kluger & DeNisi, Citation1996). Feedback needs to provide information specifically relating to the task or process of learning that fills the gap between what is understood and what is aimed to be understood (Sadler, Citation1989).

Hattie and Timperley (Citation2007) suggested a feedback model to effectively impact learning processes. Following this model, feedback should address three questions: (1) Where am I going? (2) How am I going? and (3) Where to next? The last two questions focus mainly on the learning process, whereas the first focuses on the goal. Studies have shown that commenting on students’ learning process has a strong impact on students’ learning and motivation (Dweck, Citation1986; Dweck & Leggett, Citation1988; Hattie & Clarke, Citation2019).

For mathematics, typical written feedback on a formal written coursework is given in multiple ways but generally contains (1) short comments on scripts, (2) model answers, (3) review of common errors in class, (4) written summary of common errors, and/or (5) follow-up one-to-one discussion in practical classes following the return of work (Robinson et al., Citation2015). Model solutions can be highly valued by students (Robinson et al., Citation2015); however, students may struggle to understand the difference between the model solution and their own work (Robinson et al., Citation2015). The thinking behind the model is usually omitted in written feedback, and there is a risk that the student copies the model in similar problems without understanding why (Robinson et al., Citation2015). Hence, it is likely that screencast feedback in mathematics, also in secondary school, will support learning through better and more detailed feedback.

Screencast feedback

The COVID-19 pandemic has changed education and teaching, requiring teachers to consider aspects such as loneliness. Recent research has suggested that screencast teaching and feedback may maintain and even strengthen the teacher–student relationship (Lowenthal & Dunlap, Citation2018) and student performance (Loch et al., Citation2014) when teaching online. Lowenthal and Dunlap (Citation2018) suggested that screencast feedback may be one of the most effective strategies to connect students to their teachers. Loch et al. (Citation2014) found that students performed better when solving mathematical problems after watching screencasts. Their students reported that being able to replay sections, fast-forward and pause videos when studying for assignments or exams was important (Loch et al., Citation2014). In the field of educational psychology, Atkinson (Citation2002) and Mayer (Citation2003) have shown that learning from video with animation and verbal communication is more efficient than learning from on-screen text with narration, or narration alone. Focusing on screencast feedback, Borup et al. (Citation2015) discovered that while students found written feedback more efficient, better organised to read and more specific in its critiques, they viewed screencast feedback as more supportive. Other studies from higher education have reported that teachers’ use of screencast feedback leads to increased precision and quality, as well as stronger emotional bonds between students and teachers (Brick & Holmes, Citation2008; Denton, Citation2014; Henderson & Phillips, Citation2015; Mathisen, Citation2012; Robinson et al., Citation2015; West & Turner, Citation2016). However, research has also shown that students do not want their lectures replaced entirely by screencasts (Mullamphy et al., Citation2010). Robinson et al. (Citation2015) concluded that screencast feedback added another dimension to feedback in mathematics.
The students commented that they not only learned mathematical skills, but also learned to reflect on their own work and solutions. They learned to communicate mathematics like a mathematician (Robinson et al., Citation2015). Hence, screencast feedback proved effective, enabling students to close the gap between their work and the expected standard.

Due to time limitations in many classroom settings, it is almost impossible for teachers to give individual students immediate and detailed feedback. However, screencast feedback does not need to be time-consuming (Edwards et al., Citation2012). For example, Zhai et al. (Citation2002) calculated that typing efficiency, among six participants, on a QWERTY keyboard was about 30 words per minute, whereas Denton (Citation2014) found that the average word count for one minute of screencast feedback was 135. Comparing these numbers suggests that teachers can create and deliver more information to students in less time using screencasts. Nevertheless, spoken communication has its own set of challenges, such as increasing cognitive load over a shorter period, whereas typing may promote systematic thought due to frequent pausing and the opportunity to revise (Karat et al., Citation1999). Giving feedback is not a question of choosing either written or screencast feedback, but rather of how teachers may vary their feedback procedures (Ryan et al., Citation2019). In higher education, Ryan et al. (Citation2019) found that combining various forms of feedback results in students experiencing higher levels of detail, more personalisation and higher levels of usability compared to single-mode feedback.
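The delivery-rate comparison above can be made concrete with a rough back-of-the-envelope calculation. The two rates are the averages cited from Zhai et al. (2002) and Denton (2014); the 405-word feedback message is a hypothetical example, not a figure from the study:

```python
# Rough comparison of feedback delivery rates, using the figures cited
# above: ~30 words/min typed (Zhai et al., 2002) vs ~135 words/min of
# screencast narration (Denton, 2014). Illustrative only; real rates
# vary by teacher and task.

TYPED_WPM = 30
SPOKEN_WPM = 135

def minutes_to_deliver(words: int, wpm: int) -> float:
    """Minutes needed to deliver `words` words at `wpm` words per minute."""
    return words / wpm

feedback_words = 405  # hypothetical: a three-minute screencast's worth of words

print(minutes_to_deliver(feedback_words, TYPED_WPM))   # 13.5 minutes typed
print(minutes_to_deliver(feedback_words, SPOKEN_WPM))  # 3.0 minutes spoken
print(SPOKEN_WPM / TYPED_WPM)                          # 4.5x rate ratio
```

The 4.5x ratio is the same figure the Discussion later cites when comparing oral and written feedback density.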

Method

Participants

A total of 103 students in six classes were given screencast feedback and were subsequently asked to complete a survey. Four students chose not to reply to the survey. Thus, our final sample consists of 99 students, who ranged in age from 12 to 18 years. They were not asked to provide any background information, to ensure anonymity. These students were recruited through a convenience sample (Robinson, Citation2013) of nine teachers who were invited to participate in this study. These teachers were either former colleagues or personal contacts of the first author or enrolled in a digital teaching programme within the institution of author1 and author2. Six teachers agreed to participate; among them, one was from the digital teaching programme, two were personal contacts and the remaining three were former colleagues (see Table 1 for their background information). None of the participating teachers had previously provided their class with screencast feedback. Further background information was not collected, to ensure anonymity. All students and teachers were informed about the project beforehand. They were informed that they had the right to withdraw their participation at any time without consequences. Students were also informed that (non)participation would not influence their grades.

Table 1. Participants in the study.

Procedure

We asked all teachers to evaluate students’ written mathematics tests through screencast feedback recorded in the software program Screencast-O-Matic. All students had been taught a mathematical theme (usually one to two chapters, such as geometry, algebra or equations) over a period of eight weeks. This was followed by a written assessment requiring the students to solve different mathematical problems. Because of the different age groups included and the different mathematical courses provided to these groups (e.g. theoretical and practical mathematics in high school classes), the mathematical themes taught differed across classes.

To ensure that the students experienced similar retrieval of the feedback, we asked the teachers to follow the feedback process suggested by Hattie and Timperley (Citation2007). Specifically, we requested that all feedback should tell the students: (1) where they were (what were they supposed to do, and what did they do?), (2) how they were doing (to what degree did they solve the problems correctly?), and (3) what they should do and how they should work to improve. These steps also align with the recommendation by the Norwegian Directorate of Education (Udir) on how to provide solid feedback (Udir, Citation2020). Students received the feedback first; their test grade was given at the end of the screencast. Immediately after receiving their screencast feedback from the teacher, students voluntarily and anonymously completed a paper-and-pencil or digital questionnaire. The anonymous paper-and-pencil versions of the questionnaire were scanned by the teachers and emailed to author1.

Questionnaire

The questionnaire consisted of 11 open-ended questions asking students about their perception of the content quality of the feedback. For example, one question asked, ‘What in the feedback made you understand how to … ?’ We also asked the students about their perceptions of how the feedback was given through questions such as ‘How did you experience receiving screencast feedback?’ Lastly, we asked students which feedback method they preferred and why.

Several of the questions were very similar, resulting in similar answers across these questions, or comments such as ‘see above’. As such, we believe the answers reflected the totality of possible experiences while simultaneously strengthening the reliability of our findings. Although some students skipped questions they thought were too similar, most students replied to all or most of the questions in the questionnaire.

Text analysis

We used an inductive thematic analysis approach to establish the main categories in the material (Braun & Clarke, Citation2006). First, we worked through the data from one class to establish three broad initial codes: (1) how the student experienced the feedback, (2) how the student experienced properties of the feedback, and (3) the student’s preferred feedback type. Using these initial codes, we split the data from the remaining classes between author1 and author2 for coding. Next, we divided the comments into either positive or negative responses (the primary theme). Lastly, we took a closer look at each category and fine-tuned the existing category ‘how the student experienced the feedback’ to better reflect the data (the secondary themes). We divided the positive responses into three secondary themes: (a) easier to understand, (b) detailed information, and (c) personal feedback. The negative comments were subdivided into two categories: time and structure. See Table 2 for an overview of the categories.
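The three-layer scheme described above (initial codes, primary theme, secondary themes) can be summarised as a nested structure. The labels below are paraphrased from the text; this is an illustration of the scheme's shape, not the authors' actual analysis artefact:

```python
# A sketch of the coding scheme described above. Labels are paraphrased
# from the article; hypothetical, for illustration only.
from collections import Counter

coding_scheme = {
    "initial_codes": [
        "how the student experienced the feedback",
        "how the student experienced properties of the feedback",
        "the student's preferred feedback type",
    ],
    "primary_themes": ["positive", "negative"],
    "secondary_themes": {
        "positive": ["easier to understand", "detailed information", "personal feedback"],
        "negative": ["time", "structure"],
    },
}

# Tallying one coded response as a (primary, secondary) pair:
tally = Counter()
tally[("positive", "easier to understand")] += 1
print(tally[("positive", "easier to understand")])  # 1
```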

Table 2. Examples of extracts of data.

Results

Most students (n = 68, 68.7%) reported positive experiences of the screencast feedback. An additional 17 students (17.2%) were neutral to the type of feedback, while 15 students (15.2%) described more negative experiences of the feedback.
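The percentages above follow directly from the raw counts out of the 99 respondents; a minimal recomputation (counts taken from the text) as a check:

```python
# Response counts reported above, out of n = 99 respondents; percentages
# recomputed and rounded to one decimal place, as in the text.
n_total = 99
counts = {"positive": 68, "neutral": 17, "negative": 15}

percentages = {label: round(100 * n / n_total, 1) for label, n in counts.items()}
print(percentages)  # {'positive': 68.7, 'neutral': 17.2, 'negative': 15.2}
```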

Most of the students (n = 74, 74.8%) preferred screencast feedback to written feedback. The comments that explained these students’ preferences were divided into three categories. First, 52 students (52.5%) responded that it was easier to understand the feedback on a screencast. One student wrote: ‘the video feedback really helped, and it was much easier to understand … in comparison to when I get a written feedback. I feel that such a feedback gives more motivation’, and another wrote ‘I finally understood my mistakes, I never do that on a written feedback’.

Some students explained that they liked that the teacher demonstrated how to solve a mathematical problem, for example by working through it with the cursor while explaining the thinking behind the different steps in the calculation. Secondly, 36 students (36.4%) responded that the screencast feedback was more detailed and provided a more elaborate explanation of the problem at hand. One student wrote: ‘the screencast feedback was more thoroughly and detailed as the teacher also showed the conclusion’.

Lastly, nine students (9.1%) responded that the feedback seemed more personal. One student wrote: ‘… I feel the screencast feedback was more personal …’. They felt they were being spoken to individually, instead of as a whole class. Another student wrote: ‘the feedback felt more personal, and “we” looked at it together’. The responses showed that the students not only appreciated that the teacher showed and talked through their test on a personal screencast, but also that they felt that the teacher was nice to them by saying ‘hello’ and including small talk, which is not common in written feedback.

A total of 17 students (17.2%) reported that they preferred written feedback, for reasons divided into two categories. First, some of these students (n = 8, 8.1%) commented on the structure of the feedback, for example ‘the written feedback is more precise and concise’, suggesting that they found the screencast feedback messy. Another student explained that it was easier to go back to written feedback and reflect upon specific parts of it. Second, time was an issue for some students who preferred written feedback. Some (n = 5, 5.1%) stated that ‘it took longer time to watch the feedback’. Additionally, four students (4.0%) showed empathy for the teacher and commented that it took longer for the teacher to give feedback to the class. Two students gave a different kind of response regarding why they preferred written feedback; specifically, one reported feeling the disappointment more with the screencast feedback, and one responded that the screencast feedback was too personal.

Some students (n = 13, 13.1%) reported that they were indifferent to the type of feedback. Typical answers in this category suggested that any kind of feedback was better than no feedback or that a combination of feedback types would be the best.

Discussion

We set out to investigate how students in secondary education experience screencast feedback in mathematics. In doing so, our study adds to the limited knowledge base on students’ experiences with screencast feedback in secondary education and expands what is known from previous studies in higher education (e.g. Borup et al., Citation2015; Mahoney et al., Citation2019; Robinson et al., Citation2015). As shown in the results, almost three quarters of the students were positive about receiving screencast feedback. This finding is positive, especially considering the current COVID-19 pandemic that has forced teachers and students into online teaching and learning. In the following, we will discuss insights from the students’ responses and shed light on the increased detail and understanding screencast feedback provided. Additionally, we provide insight into how screencast feedback adds a personal touch in the feedback process and discuss how combinations of feedback processes may be helpful for the students.

More detail and understanding

The students’ responses in our study indicated that they understood screencast feedback better than written feedback. This is in line with Robinson et al. (Citation2015), who concluded that many students preferred screencast feedback because it was more personal, provided a richer experience and developed mathematical skills. A piece of oral feedback contains, in general, 4.5 times more words per minute than written feedback (Denton, Citation2014; Zhai et al., Citation2002). Thus, it is not surprising that many of the students in our study commented that the screencast feedback contained more details and was therefore easier to understand. Still, this finding contradicts the results of Borup et al. (Citation2015), who reviewed studies done in higher education and determined that both students and lecturers reported that although they felt videos with comments were longer and more supportive, text feedback contained more specific critiques. The difference in educational level between the two studies might account for some of the discrepancies in these findings. An additional possible explanation for the difference could be that Borup et al. (Citation2015) included multiple subjects, whereas we solely focused on the topic of mathematics. Instruction in mathematics involves an assortment of programs and equipment, in addition to a language mainly containing numbers rather than words. Many students struggle with learning and liking mathematics (DiMartino & Zan, Citation2010; Rojo Robas et al., Citation2020). In a screencast the teacher can give more detailed feedback with a more thorough explanation. Students in our study explicitly expressed that they appreciated that the teacher instructed them in the feedback. In mathematics, it is common to use programs such as Geogebra, which may be more beneficial when used during screencast feedback, where the teacher can show and explain. 
It may be that the very positive attitude that we found towards the screencast feedback in this study was due to the experience of understanding mathematics better through this form of feedback.

Screencast feedback may also reduce the risk of misunderstanding the feedback. Not only can teachers include more words, but they can also play with their oral contribution, for instance by varying the pitch of their voice, by varying their pace of speaking or by including more words used in students’ daily lives (Killingback et al., Citation2019). This way, the teacher can, for instance, stress important steps in a mathematical procedure. Furthermore, teachers can address the student by name and refer to previous classroom discussions, thus bridging the feedback to the broader learning context and activities in school while at the same time maintaining the personal connection to the student (Henderson & Phillips, Citation2015).

Based on our findings, we believe teachers should strive for and continue with high levels of detail in their screencast feedback. Additionally, the use of modelling in teaching, where the teacher shows and explains how mathematical problems can be solved, should be a recurring aspect in screencast feedback. This way, students will learn through observation, which in its turn increases students’ capability to solve problems later on.

The personal touch

Our participating students may have expressed mostly positive experiences in their answers because of the personal touch in the screencast feedback, which several students explicitly mentioned. Written feedback is designed to carry a heavy informational load, offering comments on the content of a text or assignment to encourage students to consolidate their learning. Such feedback has often been purely informational, channelling reactions and advice to facilitate improvement in the subject at hand. Often, when teachers write feedback, their comments may be general, vague, and cryptic (e.g. ‘poor effort – more critical interpretation’; Hounsell, Citation2003). Research in language learning has shown that criticism and critical suggestions tend to be mitigated through praise that is often used to tone down negative effects on comments (Hyland & Hyland, Citation2001). Such comments may make it difficult for students to understand where they went wrong or how to improve their situation (Hounsell, Citation2003). Moreover, as shown in language learning, it may be easier to misunderstand the content of written feedback (Hyland & Hyland, Citation2001), as we also mentioned above. Feedback is a key factor in the learning process, yet it is effective only when it engages the student and gives the student a sense that the feedback is a response to a person, rather than a script or an assignment (Hyland & Hyland, Citation2006). The key is to balance the positive feedback with feedback relating to opportunities for improvement (Paterson & McColl, Citation2009).

Our finding that the students appreciated the personal touch corroborates results from higher education (Brick & Holmes, Citation2008; Denton, Citation2014; Mathisen, Citation2012; Robinson et al., Citation2015) that have highlighted the importance of the personal touch and the level of detail in screencast feedback. Good interpersonal relationships between students and teachers are important for students in their learning process (Frymier & Houser, Citation2009). Particularly at a time when students might feel lonely or may be in difficult situations, as with the COVID-19 pandemic, maintaining and strengthening safe and supportive relationships is important.

Thus, based on these findings, we would propose to continue with or start incorporating at least some screencast feedback to maintain or strengthen the relationship between the students and the teacher. This could be particularly relevant for teachers in new classes or at the beginning of the year when this relationship is still developing. It is yet to be investigated, but as a next step, screencast peer feedback could also be considered and studied in relation to the sociocultural learning tradition. As such, peers might vary their ways of collaboration and scaffolding and through this strengthen their relationships.

Combining types of feedback

Although most students preferred screencast feedback and a few preferred written feedback, some students commented that a combination of both written and screencast feedback might be helpful. This result aligns with Ryan et al. (Citation2019), who found that various forms of feedback result in students experiencing higher levels of detail, more personalisation and higher levels of usability compared to single-mode feedback. Particularly combinations of types of feedback including digital modes were rated high on these aspects (Ryan et al., Citation2019).

Screencast feedback across subjects

It is likely that motivation, interest and achievement in a subject affect students’ feedback preferences (Mensink & King, Citation2020; Robinson et al., Citation2015). Including these kinds of psychological variables as covariates in a design regarding screencast feedback will enhance and extend our knowledge of students’ preferences regarding screencast feedback. Here, we focused on mathematics. Loch et al. (Citation2014) found that their students performed better when solving mathematical problems after watching revision screencasts. Traditionally, feedback in mathematics often consists of marks indicating whether the answer is correct, provided with brief comments accompanied by short or complete solutions in written form. To address the most common mistakes, teachers sometimes work through assignments step by step in class (Robinson et al., Citation2015). Our data imply that more detailed feedback, as experienced in this study, may help to increase mathematical understanding among the students or increase their self-confidence in the learning process. The setup and content of the screencast made such detailed feedback possible. However, we cannot say whether screencast feedback is more effective in improving student performance. Brick and Holmes (Citation2008) suggested the need for more extensive trials of screencast feedback to establish ‘whether learners respond equally well, irrespective of individual learning style or other factors’ (p. 339). Additionally, we think it is wise to study students’ benefits when receiving screencast feedback in different subjects. Replicating our study in additional subjects will shed light on whether certain subjects are better suited for screencast feedback than others.

Limitations and avenues for further research

Our study has extended the knowledge base on students’ perceptions of and experiences with screencast feedback within secondary education. Still, these findings should be considered in light of the limitations of our study. First, receiving screencast feedback may have been a novel experience for the participating students. This novelty may have overshadowed their previous experience with written feedback and may have acted as a bonus (Krebs et al., Citation2009). Novelty is a potent learning signal that attracts attention and causes rapid orienting reactions (Knight, Citation1996; Lisman & Grace, Citation2005; Mesulam, Citation1998). Not only did the students experience novelty in the feedback; the screencast may also have added variation to their feedback process. Exposure to variation is critical for the possibility to learn, and what is learned reflects the pattern of variation that was present in the learning situation (Marton & Morris, Citation2002; Marton et al., Citation2004). Hence, screencast feedback may also be a tool for teachers to enhance students’ learning and motivation through variation and novelty. Investigating students’ perceptions of screencast feedback over a longer period and/or including participating students with varying amounts of experience with screencast feedback would be a valuable extension of the present study.

A second limitation lies in the structure of the provided feedback. The Norwegian Directorate of Education (Udir, Citation2020) recommends that teachers give feedback in a form that resembles Hattie and Timperley’s model (Citation2007). The participating teachers were asked to follow this structure during this project; however, we do not know whether the students previously had experienced feedback that followed this structure. If students had not experienced this structure before, the possible change in the structure of the feedback focusing on three areas (i.e. Where are you now? Where are you going? What is your next step?) might also have influenced students’ perceptions of the received feedback. We acknowledge that the teachers might have used this structure in their written feedback as well. Still, we believe that giving the type of detailed feedback recommended by Hattie and Timperley (Citation2007) might be easier with screencast feedback, as teachers can avoid misunderstandings more easily by playing with their language and pitch of their voice, as well as demonstrating what they are talking about on the screen. Again, it could be that this approach is what the students appreciated even more than the screencast feedback. Setting up a study with a quasi-experimental design in which it is ensured that students obtain feedback with a similar structure but in a different form would be a strong addition to our finding that students in secondary education seem to prefer screencast feedback over written feedback.

Conclusion

We have shown that secondary school students express mainly positive experiences with screencast feedback in mathematics and tend to prefer it over written feedback. They experienced the feedback as more informative, detailed and personal. Screencasts can be an important tool for teachers when giving students feedback. In mathematics, they may be especially valuable, as the format makes it easier to show students the details that the subject often requires. Moreover, we believe screencast feedback can be an important tool for feedback variation in all subjects.

Even when considering the limitations of this study, our findings are a step forward in understanding students’ experiences with digital feedback, a topic that is highly relevant at present and expected to remain so. Considering the importance of the student–teacher relationship for secondary school students’ learning (Cornelius-White, Citation2007; Roorda et al., Citation2011), and considering that the students in our study reported experiencing screencast feedback as personal and supportive, we believe that teachers in secondary schools, like those in higher education settings, should consider using screencast feedback.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Anne-Line Bjerknes

Anne-Line Bjerknes works as an associate professor (Dr, PhD) at the Department of Mathematics and Science Education of the University of South-Eastern Norway. Her research field is interdisciplinary pedagogy in teacher education, urban planning with young children’s involvement, and STEM pedagogy in teacher education within early childhood education and school. She is the head of the research group ‘Holistic Learning in Early Childhood Education’.

Lars Opdal

Lars Opdal works as associate professor at the Department of Educational Science of the University of South-Eastern Norway. He is co-leading the research group ‘Mentoring in Profession and Education’. His research focuses on mentoring and teacher training, including use of technology in mentoring. He teaches pedagogy in teacher education and is head of a mentoring programme for continuing education.

Esther T. Canrinus

Esther T. Canrinus works as a professor at the Department of Education of the University of Agder where she is the head of the research group ‘Professions and Professionals in Cooperation’. Previously, she worked at the Knowledge Centre for Education as a part of the Research Council of Norway, where she collaborated on writing review studies commissioned by the Norwegian Government. Her research focuses on the coherence and quality of teacher education, teachers’ professional development and their professional identity. She is, furthermore, interested in teachers’ social networks, classroom behaviour, and teachers’ and students’ motivation.

References

  • Allen, D., & Fraser, B. J. (2007). Parent and student perceptions of classroom learning environment and its association with student outcomes. Learning Environments Research, 10(1), 67–82. https://doi.org/10.1007/s10984-007-9018-z
  • Atkinson, R. K. (2002). Optimizing learning from examples using animated pedagogical agents. Journal of Educational Psychology, 94(2), 416. https://doi.org/10.1037/0022-0663.94.2.416
  • Bakx, A., Koopman, M., de Kruijf, J., & den Brok, P. (2015). Primary school pupils’ views of characteristics of good primary school teachers: An exploratory, open approach for investigating pupils’ perceptions. Teachers & Teaching, 21(5), 543–564. https://doi.org/10.1080/13540602.2014.995477
  • Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/0969595980050102
  • Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research & Development, 63(2), 161–184. https://doi.org/10.1007/s11423-015-9367-8
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Brick, B., & Holmes, J. (2008). Using screen capture software for student feedback. In Kinshuk, G. Sampson, J. M. Spector, P. Isaias, & D. Ifenthaler (Eds.), Cognition and Exploratory Learning in Digital Age: Proceedings of the IADIS CELDA 2008 conference, ‘IADIS CELDA 2008’ (pp. 339–342). IADIS. http://www.iadis.org
  • Cornelius-White, J. (2007). Learner-centered teacher-student relationships are effective: A meta-analysis. Review of Educational Research, 77(1), 113–143. https://doi.org/10.3102/003465430298563
  • Crooks, T. J. (1988). The impact of classroom evaluation practices on students. Review of Educational Research, 58(4), 438–481. https://doi.org/10.3102/00346543058004438
  • Denton, D. W. (2014). Using screen capture feedback to improve academic performance. TechTrends, 58(6), 51–56. https://doi.org/10.1007/s11528-014-0803-0
  • DiMartino, P., & Zan, R. (2010). ‘Me and maths’: Towards a definition of attitude grounded on students’ narratives. Journal of Mathematics Teacher Education, 13(1), 27–48. https://doi.org/10.1007/s10857-009-9134-z
  • Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040. https://doi.org/10.1037/0003-066X.41.10.1040
  • Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95(2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256
  • Edwards, K., Dujardin, A. F., & Williams, N. (2012). Screencast feedback for essays on a distance learning MA in professional communication. Journal of Academic Writing, 2(1), 95–126. https://doi.org/10.18552/joaw.v2i1.62
  • Fauth, B., Decristan, J., Rieser, S., Klieme, E., & Büttner, G. (2014). Student ratings of teaching quality in primary school: Dimensions and prediction of student outcomes. Learning and Instruction, 29, 1–9. https://doi.org/10.1016/j.learninstruc.2013.07.001
  • Frymier, A. B., & Houser, M. L. (2009). The teacher‐student relationship as an interpersonal relationship. Communication Education, 49(3), 207–219. https://doi.org/10.1080/03634520009379209
  • Hattie, J. (2015). The applicability of visible learning to higher education. Scholarship of Teaching and Learning in Psychology, 1(1), 79–91. https://doi.org/10.1037/stl0000021
  • Hattie, J., & Clarke, S. (2019). Visible learning: Feedback (1st ed.). Routledge. https://doi.org/10.4324/9780429485480-1
  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
  • Haxton, K. J., & McGarvey, D. J. (2011). Screencasting as a means of providing timely, general feedback on assessment. New Directions in the Teaching of Physical Sciences, 7(7), 18–21. https://doi.org/10.29311/ndtps.v0i7.462
  • Henderson, M., & Phillips, M. (2015). Video-based feedback on student assessment: Scarily personal. Australasian Journal of Educational Technology, 31(1). https://doi.org/10.14742/ajet.1878
  • Hounsell, D. (2003). Student feedback, learning and development higher education and the lifecourse. In M. Slowey & D. Watson (Eds.), Higher education and the lifecourse (pp. 67–78). The Society for Research into Higher Education & Open University Press.
  • Hyland, F. (1998). The impact of teacher written feedback on individual writers. Journal of Second Language Writing, 7(3), 255–286. https://doi.org/10.1016/S1060-3743(98)90017-0
  • Hyland, F., & Hyland, K. (2001). Sugaring the pill: Praise and criticism in written feedback. Journal of Second Language Writing, 10(3), 185–212. https://doi.org/10.1016/S1060-3743(01)00038-8
  • Hyland, K., & Hyland, F. (2006). Interpersonal aspects of response: Constructing and interpreting teacher written feedback. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 206–224). Cambridge University Press.
  • Karat, C.-M., Halverson, C., Horn, D., & Karat, J. (1999). Patterns of entry and correction in large vocabulary continuous speech recognition systems. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (CHI ’99). Association for Computing Machinery, New York, NY, USA, 568–575. https://doi.org/10.1145/302979.303160
  • Killingback, C., Ahmed, O., & Williams, J. (2019). ‘It was all in your voice’: Tertiary student perceptions of alternative feedback modes (audio, video, podcast, and screencast): A qualitative literature review. Nurse Education Today, 72, 32–39. https://doi.org/10.1016/j.nedt.2018.10.012
  • Killingback, C., Drury, D., Mahato, P., & Williams, J. (2020). Student feedback delivery modes: A qualitative study of student and lecturer views. Nurse Education Today, 84, 104237. https://doi.org/10.1016/j.nedt.2019.104237
  • Kinchin, I. M. (2003). Effective teacher ↔ student dialogue: A model from biological education. Journal of Biological Education, 37(3), 110–113. https://doi.org/10.1080/00219266.2003.9655864
  • Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254. https://doi.org/10.1037/0033-2909.119.2.254
  • Knight, R. T. (1996). Contribution of human hippocampal region to novelty detection. Nature, 383(6597), 256–259. https://doi.org/10.1038/383256a0
  • Krebs, R. M., Schott, B. H., Schütze, H., & Düzel, E. (2009). The novelty exploration bonus and its attentional modulation. Neuropsychologia, 47(11), 2272–2281. https://doi.org/10.1016/j.neuropsychologia.2009.01.015
  • Lamey, A. (2015). Video feedback in philosophy. Metaphilosophy, 46(4–5), 691–702. https://doi.org/10.1111/meta.12155
  • Lisman, J. E., & Grace, A. A. (2005). The hippocampal-VTA loop: Controlling the entry of information into long-term memory. Neuron, 46(5), 703–713. https://doi.org/10.1016/j.neuron.2005.05.002
  • Loch, B., Jordan, C. R., Lowe, T. W., & Mestel, B. D. (2014). Do screencasts help to revise prerequisite mathematics? An investigation of student performance and perception. International Journal of Mathematical Education in Science and Technology, 45(2), 256–268. https://doi.org/10.1080/0020739X.2013.822581
  • Lowenthal, P. R. (2020). Video feedback: Is it worth the effort? A response to Borup et al. Educational Technology Research & Development. https://doi.org/10.1007/s11423-020-09872-4
  • Lowenthal, P. R., & Dunlap, J. C. (2018). Investigating students’ perceptions of instructional strategies to establish social presence. Distance Education, 39(3), 281–298. https://doi.org/10.1080/01587919.2018.1476844
  • Mahoney, P., Macfarlane, S., & Ajjawi, R. (2019). A qualitative synthesis of video feedback in higher education. Teaching in Higher Education, 24(2), 157–179. https://doi.org/10.1080/13562517.2018.1471457
  • Marton, F., & Morris, P. J. T. F. (2002). What matters? Discovering critical conditions of classroom learning. Acta Universitatis Gothoburgensis.
  • Marton, F., Tsui, A. B., Chik, P. P., Ko, P. Y., & Lo, M. L. (2004). Classroom discourse and the space of learning. Routledge.
  • Mathisen, P. (2012). Video feedback in higher education – A contribution to improving the quality of written feedback. Nordic Journal of Digital Literacy, 7(2), 97–116. https://doi.org/10.18261/ISSN1891-943X-2012-02-02
  • Mayer, R. E. (2003). Elements of a science of e-learning. Journal of Educational Computing Research, 29(3), 297–313. https://doi.org/10.2190/YJLG-09F9-XKAX-753D
  • Mayes, E., Black, R., & Finneran, R. (2020). The possibilities and problematics of student voice for teacher professional learning: Lessons from an evaluation study. Cambridge Journal of Education, 51(2), 195–212. https://doi.org/10.1080/0305764X.2020.1806988
  • Mensink, P. J., & King, K. (2020). Student access of online feedback is modified by the availability of assessment marks, gender and academic performance. British Journal of Educational Technology, 51(1), 10–22. https://doi.org/10.1111/bjet.12752
  • Mesulam, M. M. (1998). From sensation to cognition. Brain: A Journal of Neurology, 121(6), 1013–1052. https://doi.org/10.1093/brain/121.6.1013
  • Mullamphy, D. F., Higgins, P. J., Belward, S. R., & Ward, L. M. (2010). To screencast or not to screencast. Anziam Journal, 51, C446–C460. https://doi.org/10.21914/anziamj.v51i0.2657
  • Nathan, M. J., & Sawyer, K. (2016). The Cambridge handbook of the learning sciences (Sawyer, R. K., Ed.). Cambridge University Press.
  • O’Malley, P. J. (2011). Combining screencasting and a tablet PC to deliver personalised student feedback. New Directions in the Teaching of Physical Sciences, 7(7), 27–30. https://doi.org/10.29311/ndtps.v0i7.464
  • Parton, B. S., Crain-Dorough, M., & Hancock, R. (2010). Using flip camcorders to create video feedback: Is it realistic for professors and beneficial to students? International Journal of Instructional Technology & Distance Learning, 7(1), 15–21. http://www.itdl.org/Journal/Jan_10/Jan_10.pdf
  • Paterson, K., & McColl, J. H. (2009). Giving useful feedback to students in statistics courses. In CETL-MSOR Conference 2009, Milton Keynes, UK (p. 52). sigma Network.
  • Robinson, O. C. (2013). Sampling in interview-based qualitative research: A theoretical and practical guide. Qualitative Research in Psychology, 11(1), 25–41. https://doi.org/10.1080/14780887.2013.801543
  • Robinson, M., Loch, B., & Croft, T. (2015). Student perceptions of screencast feedback on mathematics assessment. International Journal of Research in Undergraduate Mathematics Education, 1(3), 363–385. https://doi.org/10.1007/s40753-015-0018-6
  • Rojo Robas, V., Madariaga, J. M., & Villarroel, J. D. (2020). Secondary education students’ beliefs about mathematics and their repercussions on motivation. Mathematics, 8(3), 368. https://doi.org/10.3390/math8030368
  • Roorda, D. L., Koomen, H. M. Y., Spilt, J. L., & Oort, F. J. (2011). The influence of affective teacher–student relationships on students’ school engagement and achievement: A meta-analytic approach. Review of Educational Research, 81(4), 493–529. https://doi.org/10.3102/0034654311421793
  • Ryan, T., Henderson, M., & Phillips, M. (2019). Feedback modes matter: Comparing student perceptions of digital and non‐digital feedback modes in higher education. British Journal of Educational Technology, 50(3), 1507–1523. https://doi.org/10.1111/bjet.12749
  • Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. https://doi.org/10.1007/BF00117714
  • Sawyer, R. K. (Ed.). (2016). The Cambridge handbook of the learning sciences. Cambridge University Press.
  • Shuell, T. J. (1993). Toward an integrated theory of teaching and learning. Educational Psychologist, 28(4), 291–311. https://doi.org/10.1207/s15326985ep2804_1
  • Skidmore, D. (2006). Pedagogy and dialogue. Cambridge Journal of Education, 36(4), 503–514. https://doi.org/10.1080/03057640601048407
  • Thomas, R. A., West, R. E., & Borup, J. (2017). An analysis of Instructor social presence in online text and asynchronous video feedback comments. The Internet and Higher Education, 33, 61–73. https://doi.org/10.1016/j.iheduc.2017.01.003
  • Udir. (2020). Gi Gode Faglige Tilbakemeldinger [Give Good Professional Feedback]. Retrieved November 30, 2020, from https://www.udir.no/laring-og-trivsel/vurdering/underveisvurdering/tilbakemeldinger/
  • West, J., & Turner, W. (2016). Enhancing the assessment experience: Improving student perceptions, engagement and understanding using online video feedback. Innovations in Education and Teaching International, 53(4), 400–410. https://doi.org/10.1080/14703297.2014.1003954
  • Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3–14. https://doi.org/10.1016/j.stueduc.2011.03.001
  • Zhai, S., Sue, A., & Accot, J. (2002). Movement model, hits distribution and learning in virtual keyboarding. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’02), 4(1), 17–24. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/503376.503381