
Enhancing graduate employability skills and student engagement through group video assessment


Abstract

Universities are under increasing pressure to equip graduates with a broader set of competencies, such as communication, teamwork and leadership skills, that go beyond subject-specific knowledge. This, alongside growing student numbers in higher education, creates pedagogic challenges, especially with regard to assessment design. Conventional assessment modalities, such as individual essay writing, are costly to scale up and poorly suited to developing these broader competencies. To address these challenges in the context of a first-year economics module, we replaced a 1,000-word individual written assignment with a group video assignment, in which students were required to work in small teams to create a three-minute video on a contemporary economic issue. Focus groups and module evaluation questionnaires were used to elicit students’ perceptions of how the group video assessment contributed to their learning experience and skill development, how it compares with other modes of assessment, and their suggestions for improved implementation. Our analysis generates insights on all these aspects. Students reported a preference for diversity in assessment methods, and found the video assignment to be a positive, engaging but also challenging experience, which provided the opportunity for collaboration and the development of diverse skills.

Introduction

Besides subject-specific skills, universities are under increasing pressure to equip graduates with broader transferable and digital skills that are highly valued by employers. The QS Global Employer Survey (QS Higher Education Report 2018) reports that problem solving, teamwork and communication skills are most important to employers. Yet employers also report that graduates often lack preparedness for the modern workplace. The UK Employer Skills Survey (Department for Education 2018) reports that skills found to be lacking ranged from technical (e.g. analysis, problem solving and digital) to inter-personal (e.g. self-management, leadership and teamwork). Group work within undergraduate courses has been found to better prepare students for the graduate job market as it provides an opportunity for students to develop teamwork skills (e.g. schedule team meetings, allocate tasks among members of the group), as well as time management and communication skills (see Ettington and Camp 2002; Mills 2003; Carver and Stickley 2012; Shah 2013).

There is increasing demand for skills that embrace new technologies, while the coronavirus pandemic has necessitated wider technology adoption and a rapid shift to blended, and in some cases purely online, learning. Sim and Hew (2010) provide a detailed review of empirical studies on blogging in higher education settings. For example, Xie et al. (2008) report that blogging increased reflective thinking for a sample of college political science students in the USA. Carrasco-Gallego (2017) explores the use of short YouTube movie clips projected during lectures in a first-year economics course and finds they positively contribute to students’ learning process. Despite the increasing use of videos and capture-content by university academics to enhance students’ learning experience, video-based student assessment in higher education remains limited (Jorm et al. 2019).

More generally, as Timmis et al. (2016) point out, there is a need to identify opportunities for incorporating new technologies into assessment practices. Using videos as a method of assessment diversifies assessment practice beyond traditional modalities like essay and report writing, enabling the development of graduate employability skills while stimulating student engagement. It can also motivate a relatively more digital-savvy generation to channel their technological skills towards educational purposes and to enhance these skills in anticipation of a prospective job search.

Group work assessment in higher education typically involves students working in teams to produce a written report and/or deliver an oral presentation of their findings (Gatfield 1999; Springer, Stanne, and Donovan 1999; Mills 2003; Mills and Woodall 2004; Shah 2013). The use of video-based assessment is novel in higher education, with most of the evidence coming from undergraduate science programmes, like health care and veterinary professions (see, for example, Seddon 2008; Hay et al. 2013; de Lange, Møystad, and Torgersen 2020); very little is known in the context of business and economics education.

To our knowledge, this is the first article providing evidence of learners’ experience with a group video assessment in an undergraduate economics programme. We consider the creation of short video clips as a method of assessment within a higher education setting in the UK. Specifically, a cohort of first-year economics students worked in small groups to explore a contemporary economic issue of their choice and present their analysis through a three-minute video clip. This method of assessment is expected to be relevant far more broadly to the social sciences, humanities and even the natural sciences.

Various motivations underlie our approach. First, students tend to use mobile technologies (like instant messaging and video streaming) extensively and display an aptitude for tasks such as using presentation and video/audio editing software (Jones et al. 2010). By embracing such technologies, this assessment is expected to enhance student engagement with their studies compared to more traditional assessment tools. Second, video creation tasks promote creativity, as the same topic can be conceptualised and visualised in multiple ways. Garrison and Kanuka (2004) argue for a more critical and creative approach to designing and delivering higher education. Sheridan-Rabideau (2010) advocates for creativity as a key feature of twenty-first century pedagogy, while Livingston (2010) argues that to spur creativity in higher education we have to turn to students’ technological expertise. The group video assignment discussed in this article therefore represents an innovative mode of assessment in this direction. Third, students are required to engage in research on their chosen topic, thereby nurturing their independence and giving them ownership of their learning (Gatfield 1999). Fourth, students are given the opportunity to develop communication skills, a key competency valued by employers of economics graduates (Economics Network 2019). Finally, from an instructor’s perspective, marking and feedback are greatly facilitated, in itself an important consideration as undergraduate enrolments in the UK grow (Higher Education Statistics Agency 2017).

Our research aims to gauge student perceptions of their learning experience and skill development through video group assessment, as well as whether and to what degree students value diversity of assessment methods. To achieve our objectives, and with ethical approval, students were surveyed both in groups and on an individual basis generating two complementary samples. We designed a questionnaire to collect information on students’ perceptions of the video assessment experience, and their perceptions of alternative methods of assessment (i.e. multiple choice questions, essays, exercises and written group work). The 265 students on the course were randomly split into 45 groups of five to six students. We conducted focus groups with the (voluntary) participation of the representatives of 22 groups. The same questionnaire was sent to the 23 groups that opted not to participate in the focus group; of these, 15 responded, bringing the total group-based response rate to 37 out of 45. Through these surveys we find that students prefer diverse methods of assessment and consider the group video assignment a positive experience that is engaging yet not overly difficult, through which they collaborate and develop teamwork and communication skills.

To enhance our sampling process, we incorporated three additional questions into the module’s online evaluation questionnaire, which also aimed to capture students’ perceptions of the video assessment experience. Unlike the focus groups, responses were submitted on an individual basis. The results from these questions were in line with those arising from the focus groups, except that the video group assignment was not perceived as the most engaging method of assessment amongst individual respondents.

Assessment context

‘Contemporary Issues in Economics’ is a compulsory module taken in the first term of the first year of study by students enrolled in the undergraduate programmes of the School of Economics at the University of Surrey, UK. Enrolment is around 270 students per year, with students generally not knowing their peers in advance as they are newly arrived. The module offers students economic insights into a range of topics that dominate public discourse in the UK and worldwide (e.g. income inequality, government economic policy and the gig economy). It also allows students to develop transferable competencies, such as research, teamwork and communication skills, which are crucial to academic progression and employability.

The module’s summative assessment strategy comprises coursework weighted at 30% and an end-of-term unseen written examination weighted at 70%. The coursework requires students to work in groups to create a three-minute video clip in which each group introduces and analyses a contemporary economic issue of their choice. Students receive general guidance to facilitate their choice of topic, which requires formal approval. Module leaders suggest broad topics and make themselves available to discuss ideas, recommend readings and suggest suitable economic frameworks. Written guidance is also provided through the University’s virtual learning environment, including on open-access technology for creating and editing a video, as well as on submission and detailed marking criteria.

In the year of analysis students were randomly allocated by the module leaders to groups of five or six students. Groups were announced in the second teaching week (out of a total of 11 weeks). Groups were encouraged to discuss their chosen topics with module leaders during lectures, office hours, via email communication or by posting messages on the module discussion forum. Groups worked on their project for seven weeks and submitted their videos through the virtual learning environment at the start of week 9. Students were (purposefully) left to coordinate amongst themselves and to determine how to delegate tasks leading to the creation of the video. Most groups discussed their progress with module leaders between weeks 2 and 8. Module leaders and the Deputy Head of School marked and moderated the video group submissions against pre-announced criteria; final coursework marks and feedback were published at the start of the final week.

The assessment was designed to evaluate students’ ability to understand, analyse and convey information in a meaningful and interesting way, and to collaborate effectively as part of a team. As with any group work, there are concerns about ‘free-riding’ or student ‘passengers’. To mitigate this as much as possible we introduced an element of peer-assessment. The final coursework mark for each individual student (weighted 30% in the overall assessment) was determined by two elements: (a) the group video mark (weighted 75%) and (b) peer assessment of each group member’s overall contribution of effort (weighted 25%).

The module leaders and Deputy Head of School evaluated the videos and arrived at the group video marks. Students were expected to apply their acquired knowledge of their chosen subject and to analyse and present it concisely, using accessible language. A key learning outcome was the ability to articulate economics ideas, alongside the rationale for economic policies, an ability sought after by employers of economics graduates.

A strong video should capture the viewer’s attention through tools of economic discourse, such as data and graphs, and be creative in its approach and illustration. Specifically, each video was marked according to the following pre-announced criteria: (a) knowledge and understanding (20%), which captures the topic’s foundations by presenting basic facts; (b) analysis (40%), which involves the application of the discipline’s tools to explain the issue of interest; (c) structure and presentation (30%), which involves coherence, clarity and appealing visualisations; and (d) references (10%) used to inform the topic’s discussion, a list of which was required to be submitted separately.
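Expressed programmatically, the video mark is a weighted average of the four criterion scores. The sketch below is illustrative only: the criterion names are paraphrased, the scores are hypothetical, and it assumes each criterion is marked out of 100 before weighting, which the article does not specify.

```python
# Illustrative weighting of the four pre-announced criteria.
# Assumption: each criterion is scored out of 100 (not stated in the article).
CRITERIA_WEIGHTS = {
    "knowledge_and_understanding": 0.20,
    "analysis": 0.40,
    "structure_and_presentation": 0.30,
    "references": 0.10,
}

def group_video_mark(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-100) into the overall video mark."""
    return sum(weight * scores[criterion]
               for criterion, weight in CRITERIA_WEIGHTS.items())

# Hypothetical example: 0.2*70 + 0.4*65 + 0.3*80 + 0.1*90 = 73.0
print(group_video_mark({
    "knowledge_and_understanding": 70,
    "analysis": 65,
    "structure_and_presentation": 80,
    "references": 90,
}))
```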

Each group was asked to submit a form alongside the video file declaring the individual contribution of each student member, expressed as a percentage share. Students were instructed to discuss within their groups and agree on each group member’s contribution. In the case of ‘even’ contribution of effort by group members, each group member received the group video mark as their final coursework mark. In the case of ‘uneven’ contribution of effort, the group video mark was adjusted on the basis of a pre-announced formula to arrive at the final individual coursework mark of each group member; students who contributed less than an even share received an individual mark lower than the group video mark, while those students who contributed more than their share received an individual mark higher than the group video mark. Given the importance of the individual contribution shares, groups were required to justify their decisions. In the case of a failure to agree on individual shares, or in the event of disputes, the module leaders adjudicated based on evidence of effort. In the year of analysis, out of 45 groups, around half declared uneven shares, while intervention by module leaders to clarify individual contributions was required for only one group.
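The article refers to this pre-announced adjustment formula without reproducing it, so the following is an assumption for illustration rather than the module’s actual rule. It sketches one simple proportional adjustment consistent with the stated properties: the peer-assessed element carries 25% of the weight, an even share leaves the individual mark equal to the group video mark, and above- or below-even shares move the individual mark above or below it.

```python
# Hypothetical proportional adjustment, NOT the module's published formula:
# the 25% peer-assessed component scales the group mark by how far a member's
# declared contribution share departs from an even split.
def individual_mark(group_mark: float, share: float, group_size: int) -> float:
    even_share = 1.0 / group_size
    peer_component = group_mark * (share / even_share)  # equals group_mark at an even share
    return 0.75 * group_mark + 0.25 * peer_component

# Example for a group of five with a video mark of 70: an even contributor
# (20%) keeps 70.0; a 25% contributor gets 74.4; a 10% contributor gets 61.3.
for share in (0.20, 0.25, 0.10):
    print(round(individual_mark(70, share, 5), 1))
```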

The online coursework submission included three files: a video file in .mp4 format, a document with the list of references used to create the video, as well as the individual contributions form.

Methodology and data

Following ethical approval from the University of Surrey, data on student perceptions were collected from three sources: (a) a survey based on a focus-group session, (b) a group survey conducted after the focus-group session and (c) responses drawn from the module evaluation questionnaires (MEQs) completed by students at the end of the term. These sources offer diversity in perspectives, enhancing the sampling process. A questionnaire (see Appendix) was designed by the authors to collect information on students’ perceptions of the video assessment experience. It consisted of four sections. Section A captured students’ overall perception of the video learning experience; students were asked to indicate their views on a Likert scale and justify their scores. Section B concerned the diversity of assessment methods within the School of Economics; specifically, it included a question on students’ preferences regarding assessment variety and how the video assignment compared with standard assessment methods (i.e. multiple-choice questions, essays, exercises and group written assignments) in terms of difficulty, engagement and skills developed. Section C aimed to pin down the specific skills that a group video assignment is expected to help students develop; in particular, students were asked to evaluate on a Likert scale the extent to which the video assignment had improved their communication and teamwork skills, and collegiality. Finally, Section D asked respondents to elaborate on positive aspects of the video assessment and to highlight ways to improve the experience.

All quantitative questions required students to rate their experience of different aspects of the video assignment on a 0–10 scale (where 10 indicates the best outcome). This permits the data collected to be sufficiently granular for comparisons to be drawn. In contrast, where students were required to express agreement with a statement, we followed a five-point Likert scale ranging from 1 = strongly agree to 5 = strongly disagree.

The 45 student groups were invited to select a representative to participate in the focus-group session conducted in the second hour of the final lecture of the term, scheduled before marks were released to avoid perceptions of the video assessment being influenced by performance. In total, 22 representatives participated and were randomly assigned to five focus groups of four or five students each. The questionnaire was distributed amongst the five focus groups and students were given time to discuss and complete it within their focus groups.

The day after the focus-group session, the same questionnaire was emailed to the 23 groups that did not participate; 15 of these responded within two weeks (a 65.2% response rate). To distinguish the two sampling procedures, we refer to the groups that completed the survey after the focus-group session as the ‘late groups’. The difference in sampling between the focus-group participants and the late groups allows us to make between-groups and within-groups comparisons. Importantly, the focus-group session was conducted before the coursework marks were released, while the late groups provided their responses after the publication of the marks. This allows us to examine whether students’ perceptions of their learning experience were shaped by knowledge of their coursework performance.

The third source of information is the set of individual responses from the end-of-term MEQ. Module leaders included three further questions designed to elicit individual-level information on students’ learning experiences with the group video assessment. Students were asked to complete the MEQs during the first hour of the final lecture, directly before the focus-group session.

The organisation of our surveys is summarised in Table 1. The total group-based response rate was 82% (37/45). Both the individual and group response rates are comparable to, or higher than, values reported in studies that assess group-work learning experiences using questionnaires (e.g. Bourner, Hughes, and Bourner 2001; Shah 2013). Our samples are thus representative of the study’s target population.

Table 1. Summary of students’ participation.

Analysis and findings

Data analysis: group-based results

Group video assessment: students’ perception of their learning experience

In terms of learning experience, the mean score reported by the focus-group sample is six out of 10 (ranging from 5 to 8). Although the late groups rated the learning experience somewhat higher at 6.5 on average, the difference is not statistically significant. Interestingly, Table 2 shows that although the focus groups and late groups responded to the questionnaire before and after receiving coursework marks, respectively, the distribution of students’ perception scores between groups and within groups is very similar (i.e. the standard deviation, minimum and maximum values are almost identical).
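The article does not state which test underlies the claim of no statistical significance. As a hedged illustration, the comparison could be run as a Welch two-sample t-test on the group-level scores, as sketched below with hypothetical data.

```python
# Sketch of a Welch two-sample t-test comparing mean learning-experience
# scores of the focus groups and the late groups. The scores are hypothetical
# stand-ins (means near the reported 6 and 6.5); this is not the authors' code.
from scipy import stats

focus_scores = [5, 5, 6, 6, 6, 7, 8]   # hypothetical group-level scores (0-10)
late_scores = [5, 6, 6, 7, 7, 7, 8]    # hypothetical group-level scores (0-10)

t_stat, p_value = stats.ttest_ind(focus_scores, late_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 here: no significant difference
```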

Table 2. Video assessment contribution to students’ learning experience: overall perception.

To explore how the group video assessment contributed to the development of key transferable skills, students were asked to rate their experience of developing communication, teamwork and engagement (see Table 3). The first important finding is that students reported their learning experience to be better, on average, when focusing on specific skills than when assessing the experience overall: the mean values for each of the specific skills in Table 3 are higher than the overall perception of the learning experience reported in Table 2. One possible explanation for this discrepancy is that when students are asked about their learning experience in a more general way, they do not necessarily pin down specific aspects of the assessment that worked well or not; rather, they provide an overall impression of their experience, presumably factoring in challenges they had to face beyond those already accounted for.

Table 3. Group work and video assessment contribution to the development of students’ skills.

The second important finding is that perceptions differ between the focus groups and the late groups. For the focus groups, communication and engagement scored a mean of 6.8, whereas teamwork scored slightly higher at 7.2. Notably, some groups rated teamwork and engagement as high as 9, clearly acknowledging the development of these collaborative work skills. The late groups rated specific skills somewhat differently to the focus groups. Engagement with other students was the highest rated aspect at 8.4 on average, with several groups assigning a value of 10. The mean score for teamwork was 7.6, lower than engagement but higher than the corresponding score from the focus groups (7.2). Again, a few groups rated this aspect as high as 10. Finally, communicating economics appears to be the least developed skill according to this sample, rated at 7.2 on average. Once again, the discrepancy observed between the overall learning experience and its specific aspects is more pronounced in this sample. Overall, however, the late groups appear to appreciate the value of this assessment more. This might be attributed to the sampling method: in the focus groups, discussions among students from different teams might have tempered the more positive views held by representatives of certain groups.

Finally, the groups were asked about aspects of the video group assessment that worked well (Question 6). Mostly, students cited aspects of effective collaboration, such as ‘time management’, ‘role assignment’ and ‘making a schedule for meetings’. In this context, they also cited research, discussion and decision making, which they had, paradoxically, failed to acknowledge when asked about skills development. Students from the late groups raised very similar points.

Alternative methods of assessment: students’ perception of assessment variety

We also surveyed students about their preferences regarding the range of assessment methods employed by the School of Economics, that is, whether they would prefer a variety of methods or just one or two (Question 2). Four of the five groups in the focus-group session (80%) showed a preference for a variety of methods and only one group was ‘neutral’. Interestingly, no group ‘strongly agreed’, indicating that while students appreciate the benefits of alternative methods of assessment, they are perhaps not entirely convinced of the value of diverse assessment methods. This suggests the video assessment may have coloured student perceptions of alternative methods, an effect likely amplified by the relative inexperience of level four students, new to a higher education environment. The late groups show a similar attitude, with 73% agreeing on more diverse methods of assessment, very close to the corresponding rate from the focus groups (80%).

Our different sampling methods allow us to draw interesting comparisons. First, we can compare individual types of assessment, namely multiple-choice questions (MCQs), essays and exercises (e.g. solving numerical problems), with group types of assessment (the group written project and the group video project). Table 4 reports that students perceived the group assessments as more difficult yet more engaging than individual assessments. MCQs are considered the easiest, least challenging and least engaging form of assessment by most groups. On the other hand, the group written assignment is considered the most difficult and challenging, but not necessarily the most engaging. In contrast, the video assessment is rated highest in terms of engagement and ranks second in terms of difficulty and challenge. All in all, students perceive the group-based assessments as the most demanding but, at the same time, the most interesting. As the literature on assessment methods (e.g. Bourner, Hughes, and Bourner 2001) has pointed out, first-year undergraduate students lack experience in group work and group assessment, making teamwork naturally more challenging. Perceptions emerging from the late groups show both similarities to and differences from the focus groups. MCQs are considered by far the easiest type of assessment, the written project the most difficult and the video somewhere in between. There is thus a discrepancy, with the video’s perceived level of difficulty being slightly lower according to this sample. Surprisingly, the video is not considered very challenging (the second least challenging after the MCQs), but it is still the most engaging. Summing up, across both samples there is agreement that assessment through group video is the most engaging, but the late groups regarded it as less demanding than the focus groups did.

Table 4. Group video assessment in comparison with other forms of assessment: students’ ranking (mean values).

Finally, students were asked to identify specific skills that the video assessment helped them develop, as compared to those developed by alternative methods of assessment (Question 4). As expected, almost all focus groups pinned down ‘communication’, but there were also references to ‘organisation’ and ‘creativity’. It is surprising, however, that students did not realise that this assessment helped them improve key employability skills such as research, collaboration and effective presentation of their research. This may be because students in their first term at university do not yet have work experience and are unaware of the main skills employers look for. Limited awareness of added value in this dimension may have contributed to the relatively low overall learning experience scores reported. The late groups provided some insight regarding the skills they felt this assignment helped them develop. ‘Teamwork’ is a term that is repeatedly mentioned amongst the late groups (9/15 groups mention it), as is ‘communication’, though at a much lower frequency (4/15 groups). Some groups identify development of more technical skills such as ‘use of new software’ and ‘filming’. Once again, however, important skills such as research and presentation of the research outcome, are only occasionally stated; ‘research’ is only mentioned once, despite it being an integral part of the assignment.

Summarising the findings from the group-based samples, we find that most students prefer diverse methods of assessment and perceive the group video assignment as a positive, engaging and manageable experience, through which they collaborate and develop teamwork and communication skills.

Some of our results corroborate previous findings in the literature, yet others differ. Group work was a positive experience for our sample of first-year undergraduate economics students, as it was for first-year undergraduate students in the biosciences at Queen’s University Belfast (Garvin et al. 1995), first-year accounting students at the University of Brighton (Bourner, Hughes, and Bourner 2001) and first- and second-year students in veterinary science at the University of Queensland (Mills 2003; Mills and Woodall 2004; Seddon 2008). Moreover, as in Mills and Woodall (2004), our students welcome variety in assessment. Regarding communication skills, Bourner, Hughes, and Bourner (2001) find that ‘oral presentation’ and ‘presenting information in written form’ received relatively low scores. In our study, although ‘communicating economics’ received the lowest score only in the late groups, its score was the same as that for ‘engage with other students’ in the focus groups. The discrepancy may be due to Bourner, Hughes, and Bourner (2001) focusing on a written group project with an oral presentation, whereas our focus is on a video assignment.

Data analysis: individual-based findings

We also designed questions to collect information on an individual basis as part of the module’s evaluation. The questions asked students to state their level of (dis)agreement with the following statements:

  1. The video assessment has positively contributed to my learning in this module.

  2. The video assessment was a stimulating and effective way for me to demonstrate my understanding of economics, as compared to other forms of assessment (e.g. multiple-choice questions, essays and exercises).

  3. The video assessment has helped me develop my ability to communicate economics ideas.

All questions offered five possible answers: ‘strongly agree’, ‘agree’, ‘neutral’, ‘disagree’ and ‘strongly disagree’. The sample size was 71 students. Table 5 reports the percentages of students who answered negatively (disagree or strongly disagree), positively (agree or strongly agree) and neutrally.
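As a hedged illustration of how such percentages can be tabulated, the sketch below collapses the five Likert options into the negative, neutral and positive categories reported; the response list is a hypothetical stand-in for the 71 actual MEQ responses.

```python
# Collapse five Likert options into the negative/neutral/positive shares
# reported in Table 5. The responses below are hypothetical placeholders.
from collections import Counter

responses = ["agree", "neutral", "disagree", "strongly agree", "agree"]  # 71 in the real sample

CATEGORY = {
    "strongly agree": "positive", "agree": "positive",
    "neutral": "neutral",
    "disagree": "negative", "strongly disagree": "negative",
}

counts = Counter(CATEGORY[r] for r in responses)
for label in ("negative", "neutral", "positive"):
    print(f"{label}: {100 * counts[label] / len(responses):.0f}%")
```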

Table 5. MEQ-based video evaluation.

The results from Questions A and C paint a similar picture to that from the group-based samples (Tables 2 and 3). When comparing the results of Question A with those of Question C, we see that students’ perception of the contribution of the video assessment to their overall learning is somewhat lower than its specific contribution to communication. This is similar to the result from the group-based sample; it appears that specific contributions are valued more than the overall contribution from both the individual and the group point of view. Furthermore, although a direct comparison between Question A from the MEQ and Question 1 from the group-based questionnaire is not possible (as choices were provided on different scales), the results of Question A (76% neutral or agree) are qualitatively similar to those of Question 1 (with means of 6 and 6.5 for the focus groups and late groups, respectively).

In contrast, students’ views on how the video assignment compares with other assessments are almost equally split between negative and positive, which contradicts the earlier finding from the group-based samples that it is the most engaging method of assessment. More specifically, the results of Question B show that more than 40% of students do not agree that the video assessment is a stimulating and effective way of demonstrating their understanding, which is at odds with the more positive results in terms of developing communication, teamwork and engagement skills (Table 3) and its rating as the relatively most engaging method of assessment (Table 4). This discrepancy may perhaps be explained by the different sampling method: students made quick personal assessments within a few minutes during the lecture when completing the MEQs, while the earlier results were generated after discussions between (focus) or within (late) groups.

Further discussion for teaching practice

Here we focus on both the successful aspects and the key implementation challenges of the group video assessment. To this end, we draw on the group-based student feedback. Specifically, Section D of the questionnaire asked what aspects went well and, similar to Bourner, Hughes, and Bourner (2001), the final question gave students the opportunity to offer ideas on avenues for improvement. The most common positive aspects were the discussion of alternative ideas, coordination of meetings and delegation of tasks within groups.

Students identified three key challenges. The first was the three-minute time limit, which some felt was insufficient to communicate their ideas and analysis adequately. They reported a considerable editing burden in satisfying the length constraint. Module leaders felt the three-minute duration was adequate for explaining the crux of an economic issue, while the skills gained in structuring and focusing the analysis to satisfy the constraint are valuable learning outcomes in themselves. Effective, concise communication skills are highly valued by employers, and this assignment provides a unique opportunity for students on these degree programmes to develop such skills using digital technology.

Second, some groups found coordination and allocation of tasks difficult and recommended reducing the group size. This relates to the broader problem of uneven workloads. One of the main challenges in group-work assessment is the presence of passengers (poor contributors to group work) or the uneven allocation of tasks among group members, which may lead to the group mark inaccurately reflecting individual effort. Gatfield (1999) and Mills (2003) discuss how peer-review assessment might deal with this issue. We introduced a peer-assessment element to mitigate the passenger effect and uneven workload allocation (25% of the assessment grade arises from individual contribution). Many groups did encounter passenger or workload issues: 20 of the 45 groups reported equal shares, whereas the remaining 25 reported uneven contribution shares. In most cases groups were able to agree on and allocate individual contribution shares. In some cases, one or two students contributed slightly more than the rest; in others, one student was allocated a relatively low contribution and the rest equal contributions.

To address these issues, also drawing on students’ feedback, in a subsequent implementation of the video assignment we increased the weight of the individual evaluation from peer assessment from 25% to 40%. As extant research has pointed out, although the inclusion of intra-group peer assessment as a component of the individual mark is open to manipulation due to friendship, peer-group pressure or assessment based on criteria other than individual performance (Kruck and Reif 2001; Willcoxson 2006), it also gives students the opportunity to deter or limit ‘free-riding’ and motivates group members to contribute more to their collective assignment (Kruck and Reif 2001). We thus argue that the increased weight of the intra-group peer assessment can help minimise the passenger issue. Second, we reduced the number of students per group (from six to five). The literature emphasises that larger group sizes exacerbate the passenger issue, though restricting groups to four or five members (Garvin et al. 1995; Bourner, Hughes, and Bourner 2001) cannot guarantee an equal division of labour.

The third challenge related to video examples for students to use as inspiration for their own work. While students tend to prefer as many examples as possible, there is a fine balance between instruction and creativity, as too much of the former can inhibit the latter. We opted for external sources of short videos, for example The Economist, the Financial Times and The Guardian. Such sources can provide inspiration without being overly prescriptive.

A final concern related to students’ allocation to groups. The literature (e.g. Huxham and Land 2000; Mills 2003; Mills and Woodall 2004; Seethamraju and Borman 2009) reports that the most common methods are random allocation or student self-selection (i.e. tutor-led vs. student-led), though the relative benefits of each approach remain unclear. We randomly allocated students to groups to minimise coordination problems, which are amplified in large cohorts such as ours. However, students’ learning experience can be negatively affected by random group allocation, which may not reflect their preferences; this was observed to some extent in students’ feedback. Hence, in a subsequent implementation students were free to form their own groups within a specified timeframe, with assistance from module leaders to ensure all students were assigned. This added flexibility served to lower within-group coordination costs and enabled students to better cope with the challenges of the group video assignment.

A final reflection concerns the challenge higher education institutions continue to face in delivering teaching and a good student experience amidst the COVID-19 pandemic. A video is an inherently digital output and so can be produced, disseminated and viewed remotely, using freely available technology. Hence a video assessment requires minimal adaptation as universities shift to online assessment. Whilst social distancing and remotely situated students pose a challenge to collaborative group work, there is an increasing number of free, flexible and effective online tools designed to facilitate remote collaboration. With working from home a rapidly growing phenomenon, such collaboration skills are likely to be appreciated by employers. Moreover, this assessment creates opportunities for students to interact, which is particularly valuable at a time when face-to-face interaction is restricted.

Conclusion

This study investigates perceptions of a novel group video assignment among a diverse sample of undergraduate economics students in a UK university. Notwithstanding the challenges that implementing such an assessment entails for both educators and learners, the study reports positive feedback overall from students. Specifically, the group video coursework contributed positively to students’ learning experience by developing essential competencies such as communication and teamwork. Furthermore, we report evidence that students regard the group video assignment as a more engaging and challenging method of assessment when compared to more traditional methods, such as multiple-choice questions and individual essays. Undergraduate programmes are likely to benefit from plurality in assessment modes, a consideration that should inform the design of assessment strategies across entire programmes of study.

While our study reports some noteworthy findings, it has its limitations. Our sample size is inherently small, as it is constrained by the cohort size. This prevents us from being able to draw statistically robust inferences. Also, new students have limited opportunity to experience different forms of assessment, so their perceptions of these as captured within focus group discussion are likely shaped by their experiences prior to arriving at university.

Future research could investigate how group video assessments contribute to students’ learning experience and skills development in final year or postgraduate modules or study the relationship between video assessments and graduate employability. More generally, this article raises awareness of the benefits of introducing assessments that draw on technological innovations.

Acknowledgements

We would like to thank Dr Marion Heron, Senior Lecturer in Higher Education at the University of Surrey, for her valuable comments and suggestions. We are also grateful to participants of the 2019 Developments in Economics Education conference for their comments. The authors remain responsible for any possible errors.

Appendix

Feedback sheet 1

Section A (approximate time for discussion and answer: 10 min)

Q1. To what extent has the video assessment positively contributed to your learning experience in this module?

Rate your group experience from 0 (worst experience) to 10 (best experience):

0 1 2 3 4 5 6 7 8 9 10

List three reasons for your answer:

  • 1

  • 2

  • 3

Section B (approximate time for discussion and answer: 15 min)

Q2: I prefer the School of Economics to use a variety of methods to assess my learning rather than using one or two methods.

Options: Strongly Agree Agree Neutral Disagree Strongly disagree

Q3. How does the group video assessment compare with other forms of midterm assessment?

Rank the following assessments according to the criteria in the first column from the most preferred (1) to the least preferred (5):

Q4. What skills do you feel the video assessment helped you develop, as compared to the other forms of assessment?

1

2

3

Feedback sheet 2

Section C (approximate time for discussion and answer: 10 min)

Q5. In the following rate your group experience from 0 (worst experience) to 10 (best experience)

To what extent has the video assessment helped you develop …

(a) the ability to communicate economics: 0 1 2 3 4 5 6 7 8 9 10

(b) teamwork skills: 0 1 2 3 4 5 6 7 8 9 10

(c) engage with other students: 0 1 2 3 4 5 6 7 8 9 10

Section D (approximate time for discussion and answer: 10 min)

Q6. What aspects went well in the video assessment (e.g. coordination, group discussion)?

1

2

3

Q7. Do you have any suggestions for improving the video assessment as a student learning experience?

1

2

3

Q8. Which video groups are represented by your focus group?