
Podcasts and Mobile Assessment Enhance Student Learning Experience and Academic Performance

Pages 1-7 | Received 24 Apr 2010, Accepted 12 Aug 2010, Published online: 14 Dec 2015

Abstract

The aim of this study was to combine podcasts of lectures with mobile assessments (completed via SMS on mobile telephones) to assess the effect on examination performance. Students (n = 100) on a final-year, research-led module were randomly divided into equal-sized control and trial groups. The trial group were given access to podcasts and mobile formative assessments for lectures on the module. Towards the end of the module, all students completed a ‘mock’ examination on the material covered in the lectures. Students in the trial group who listened to podcasts of the lectures and completed mobile assessments (n = 31) performed significantly better in the formative assessment (58.1 ± 1, mean ± S.E.M.; p < 0.05, Student’s t-test) than the other students on the module (52.2 ± 2; n = 54). Students accessed the podcasts via iTunes (or similar software; 38%), from the institutional virtual learning environment (31%), or using a combination of the two (31%). Interestingly, only around 21% of students listened to the majority of their podcasts away from a computer. The results of this study indicate that providing supporting resources has a positive impact on student performance.

Introduction

Mobile technologies have been used with learners in a number of contexts, including delivery of resources, timetable planning and collaborative learning (for review see Naismith et al., 2004). Interestingly, healthcare students in a number of UK HE institutions have recently been offered mobile solutions, as they are frequently on placements away from their host institution and therefore benefit greatly from regular communication and the ability to interact with tutors (Zhang et al., 2008). However, the vast majority of bioscience students are registered on full-time, institution-based courses with a large number of contact hours in lectures, tutorials and practical classes. Therefore, the role of mobile communication, and in particular mobile assessment, will be different from that for healthcare students. In this study, mobile assessment was used in conjunction with podcasts of lectures, which were accessed at the learner’s convenience (e.g. via their MP3 player). Podcasting is rapidly gaining a foothold in UK HEIs (Harris and Park, 2008), and will soon be commonplace in bioscience courses. Podcasts of full lectures are used by many higher education institutions in the United States and are available freely through podcatching software (Campbell, 2005). Internal evaluations at some US institutions (including the University of Washington) have encouraged faculties to podcast all courses (Lane, 2006). A number of studies have found that students value podcasts greatly as an additional resource to support their learning (Clark et al., 2007; Ralph et al., 2010; Evans, 2008). Students supplement their lecture attendance with podcasts, and previous research has suggested that podcasting lectures does not significantly reduce attendance (Brittain et al., 2006; Lane, 2006; Malan, 2007). Students commonly perceive that podcasts have contributed to their exam marks (Brittain et al., 2006), and whilst there is increasing literature on the methodology for producing educational podcasts, there is little pedagogic evidence that podcasts are useful for improving learning (Falzon and Brown, 2005; Grabe and Christopherson, 2008). However, a recent study has found that students given access to a podcast of a lecture, instead of the face-to-face lecture, performed better in an assessment (McKinney et al., 2009).

The current study highlights one way that mobile assessment may be employed to improve the learning experience and educational outcome for students who make use of mobile blended learning resources. Whilst there appears to be no literature quantifying the impact of combining podcasts and mobile assessment tools, it makes sound pedagogical sense as the learner is able to listen to the podcast and then immediately assess their understanding of the material by completing an SMS quiz, via their mobile device, which gives instant feedback. Learners can then either revisit the podcast material or move on to the next stage of their learning, based on the score obtained and the feedback received. Integrating formative assessments into mobile learning resources will stimulate students to pay attention to the material, and the instant SMS response and feedback will provide an immediate measure of their level of understanding. The aim of this study was to investigate the effect of providing mobile blended learning resources on student performance in an examination setting.

Methods

Student group

A final-year, research-led module on cognitive neuroscience was chosen for this study. There were 100 students enrolled on the module, and in the first session the students were informed of the study and ethical consent was obtained. All students completed a pre-study questionnaire about their use of mobile devices and podcasts, which consisted of Yes/No, multiple-choice and Likert-style questions. The questionnaire was produced using QuestionMark Perception and was presented within the University of Leeds virtual learning environment. Students were informed that they would be randomly allocated to either a control or a trial group for the duration of the study, but that all resources would be available to all students prior to the revision period. All students on the module had access to lecture handouts and the PowerPoint slides used in lectures, and all had extensive experience of answering multiple choice questions in both formative and summative settings. Students allocated to the trial group (n = 50) received detailed instructions on how to access the podcasts and how to answer questions using their mobile telephones. Students in the control group were not able to access podcasts or mobile assessments until completion of the study.
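For illustration, the random allocation described above can be sketched in a few lines of Python (the student identifiers, the fixed seed and the use of random.sample are assumptions made for this example; the paper does not state how the randomisation was implemented):

```python
import random

# Hypothetical anonymised identifiers for the 100 enrolled students (placeholder values).
student_ids = [f"S{i:03d}" for i in range(1, 101)]

random.seed(2010)                                   # fixed seed only so the example is reproducible
trial_group = set(random.sample(student_ids, 50))   # 50 students: podcasts + mobile assessments
control_group = [s for s in student_ids if s not in trial_group]

print(len(trial_group), len(control_group))         # -> 50 50
```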

Production of podcasts and mobile assessments

All lectures on the module were audio recorded using a MicroMemo® recording device attached to a 16 GB iPod (Apple, US). The module was team taught, and instructions were provided for staff delivering lectures. Audio files were downloaded to an eMac for editing. Audio files were edited using GarageBand software (version 3.0.4, Apple) and a 15-minute segment of each lecture was selected for the podcast. The selected segment typically involved either a description of a difficult concept or discussion of research evidence. Objective assessment tests (multiple choice question, MCQ, format) were recorded using the internal microphone on the eMac and were edited into the end of the audio file. MCQs were composed of a statement and five options, one of which was correct, and were related to the material in the podcast. Each podcast contained five MCQs. Instructions on how to answer the MCQs using SMS on a mobile telephone were provided in each podcast. Audio files were exported as MP4 files and hosted on personal web space as podcast episodes. A hand-coded XML file was used to create the Really Simple Syndication (RSS) feed. Students were able to access the podcast episodes via the RSS feed or directly as audio files from within the institutional virtual learning environment. Only students in the trial group had access to the podcast episodes, which were typically published within 2–3 days of the lecture. Mobile SMS assessments were coded using software developed by Cambridge Training and Development (CTAD). Students used a code word for each podcast episode and sent an SMS containing the code word followed by their answers to the MCQs. Feedback was provided to students immediately by SMS and consisted of the score obtained and the correct answers. Students were reimbursed with printer credits for SMS costs incurred during the study.
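As an illustration of what such a hand-coded feed might contain, the following minimal sketch writes an RSS 2.0 file from a Python script; the channel title, URLs, file size and publication date are hypothetical placeholders rather than the values used in the study:

```python
# Minimal sketch of a hand-coded RSS 2.0 feed describing one podcast episode.
# All names, URLs and metadata below are placeholders for illustration only.
feed = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Cognitive Neuroscience Lecture Podcasts</title>
    <link>https://example.ac.uk/podcasts/</link>
    <description>15-minute lecture extracts with embedded MCQs</description>
    <item>
      <title>Episode 1: lecture extract and MCQs</title>
      <enclosure url="https://example.ac.uk/podcasts/episode01.mp4"
                 length="14000000" type="audio/mp4"/>
      <guid>https://example.ac.uk/podcasts/episode01.mp4</guid>
      <pubDate>Mon, 05 Oct 2009 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
"""

# Write the feed to disk; the file would then be uploaded to the web space hosting
# the audio files, and its URL given to students as the subscription address.
with open("podcast_feed.xml", "w", encoding="utf-8") as f:
    f.write(feed)
```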

‘Mock’ examination

Students in the trial group had access to 11 podcasts during the course of the study, and they were sent regular email reminders to encourage them to engage with the additional learning resources. Towards the end of the module, all students enrolled on the module (control and trial groups) were invited to take a ‘mock’ examination under examination conditions; students were given two weeks’ notice of the assessment. The formative assessment consisted of 30 MCQs on topics covered in the module up to that point, and students had 45 minutes to complete the examination. All multiple choice questions in the mock examination came from a previously used bank of questions. Students were given a paper copy of the examination paper and recorded their answers on a computer-readable marking sheet (Speedwell, UK). Following the assessment, all students were given the correct answers to the MCQs, and students in the control group were given access to all podcast episodes for revision purposes. Students in the trial group completed a post-study questionnaire immediately after the ‘mock’ examination; it was produced on a printed computer-readable marking sheet (Speedwell, UK) and consisted of Likert-style questions evaluating their satisfaction with the resources.

Data analysis

The computer-readable marking sheets used for the mock examination answers and the post-study questionnaire were analysed automatically using a Speedwell OMR scanner (Speedwell, UK), and the results were analysed in Microsoft Excel. Responses received by SMS were stored using the CTAD software for offline analysis in Microsoft Excel. Students were asked to record their mobile telephone number in the pre-study questionnaire, so that SMS responses could be linked to individual students; this also ensured that only students in the trial group who had submitted SMS answers to the mobile assessments were included in that analysis. Statistical analysis was carried out using SPSS (version 16.0, SPSS Inc.) to compare the ‘mock’ examination results for students in the trial versus control groups; an unpaired Student’s t-test was used, with a p value of less than 0.05 considered to indicate a statistically significant result. The same test was used to compare the mock examination results for students who had submitted SMS answers (from within the trial group) with those of all other students in the cohort. Data are presented as mean ± standard error of the mean, and n is the number of subjects.
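As a worked illustration of this comparison, the sketch below performs an unpaired Student’s t-test in Python rather than SPSS; the per-student marks are hypothetical placeholders (the raw data are not published), and only the test and the mean ± S.E.M. style of reporting mirror the Methods:

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder marks (%) for the two groups; not the study's real data.
trial_marks   = np.array([62, 55, 58, 60, 57, 61, 54, 59])
control_marks = np.array([50, 53, 49, 55, 52, 51, 56, 48])

# Unpaired Student's t-test (equal variances assumed), significance threshold 0.05.
t_stat, p_value = stats.ttest_ind(trial_marks, control_marks, equal_var=True)

def mean_sem(x):
    """Mean and standard error of the mean, as reported in the paper."""
    return x.mean(), x.std(ddof=1) / np.sqrt(len(x))

print("trial:   %.1f ± %.1f" % mean_sem(trial_marks))
print("control: %.1f ± %.1f" % mean_sem(control_marks))
print("p = %.3f -> %s" % (p_value, "significant" if p_value < 0.05 else "not significant"))
```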

Results

A pre-study questionnaire was conducted (n = 92) to evaluate students’ use of podcasts and mobile devices. All students reported owning a mobile telephone, and the majority (90%) had an MP3 player for listening to music, podcasts, etc. Over 50% owned an iPod, 21% had another brand of MP3 player, and 14% used their mobile telephone as an MP3 player. However, less than half of the sample (47%) had downloaded podcasts, and of those the majority (67%) did so less than once a month, with only 16% downloading and/or listening to podcasts more frequently than once a week. Students had mostly downloaded and/or subscribed to radio broadcast podcasts (67%), with only 7% having subscribed to academic podcasts.

During the course of the trial, 11 podcast episodes were published to students between weeks 1 and 8 of the module. Of the 50 students who had access to the podcasts, 34 attempted at least one formative assessment using their mobile telephone, and students attempted 6.2 quizzes on average during the trial. Over the study period, the number of users completing the SMS quiz fell from 32 to 8, and the majority of SMS answers were recorded in the week following release of each podcast. The average number of students completing the SMS assessments throughout the study was 19.2 ± 2.2. The average score on the SMS quizzes varied between 2.4 and 3.8 (out of 5; mean score 3.3, n = 211). It was not possible to quantify the number of users accessing the podcasts, either via the RSS feed or via the institutional VLE. Students accessed the podcasts via iTunes (or similar software; 38%), from the institutional virtual learning environment (31%), or using a combination of the two (31%). Interestingly, only around 21% of students listened to the majority of their podcasts away from a computer, and only 5% of students listened to podcasts on a personal computer in an Information Systems Service-owned cluster on the university campus.

Eighty-five students took the ‘mock’ examination, which consisted of 30 multiple choice questions and was conducted under examination conditions in a lecture theatre. Forty-three students from the trial group and 42 from the control group attended the session and took the examination. Students in the trial group scored 56.4 ± 2% (n = 43) and performed significantly better than those in the control group (52.1 ± 2%; n = 42; p < 0.05, Student’s t-test). Students in the trial group who had completed at least one SMS assessment (n = 31) and took the exam performed significantly better (58.1 ± 1%; p < 0.05, Student’s t-test) than all other students (52.2 ± 2%; n = 54). Therefore, students who used the podcasts and mobile formative assessments as a learning aid showed an improvement of approximately 6 percentage points in examination performance.

Analysis of the students’ previous marks (all level 1 and level 2 marks) was undertaken to control for potential sampling bias in prior academic performance. Students in the trial group who had completed at least one SMS assessment and took the examination (n = 31) had an average mark of 61.4 ± 0.5 (n = 31 students; n = 319 marks), which was not significantly different from that of the other students who took the exam (average mark: 60.9 ± 0.4; n = 54 students; n = 719 marks; p > 0.05, Student’s t-test).

Thirty-six students from the trial group completed a questionnaire at the end of the study, designed to evaluate their opinion of the learning resources and the use of technology enhanced learning tools. Overall, the respondents were very satisfied with the enhanced learning resources provided to support the module. Students were comfortable with using their mobile telephone to send SMS answers to the formative questions (86% agreed that the technology was easy to use) and valued the feedback received. Seventy-six percent of students agreed that the MCQs accompanying the podcasts reinforced their understanding of the audio material, and the majority of students (86%) agreed that having podcasts to accompany all lectures would enhance their learning opportunities. In written comments accompanying the questionnaire, several themes emerged about the podcasts. On the positive side, students commented that the podcasts were a useful “Reinforcement of lecture”, an “Excellent revision tool” and a useful way to “Revisit material not understood during lecture”. On the negative side, students commented that podcasts “Should be consistent (shorter) length” and that they should have access to “Podcasts with slides”. A number of students also felt that “More help [was] needed to use the technology”. With regard to the mobile assessments accompanying the podcasts, students were in favour of “Instant feedback on understanding and progress” and the fact that the assessments “reinforced the podcast material”. Overall, students felt that better SMS feedback would have improved the learning experience.

Discussion

Improvement in academic performance

The results of this study show that providing students with podcasts and integrated formative assessment answered using mobile technology significantly improved examination performance. Moreover, students who engaged with mobile assessment exercises during the study performed around 6% better in an examination, albeit a formative exercise designed for this study. Interestingly, the students in the trial group showed this significant improvement in performance despite having a similar academic record to students who did not engage with, or have access to, podcasts and mobile assessments. These data add weight to the hypothesis that providing students with blended learning resources can improve academic performance.

The podcasts provided to students in the trial group were extracts of lectures delivered to the full group, and contained conceptually challenging aspects of the lecture. The podcasts were accompanied by narrated formative assessments designed to test the students’ understanding of the lecture extract, which were answered using a mobile telephone. Students in the trial group who engaged with the podcasted formative assessments performed better in the whole-group ‘mock’ examination used to assess the results of the study. This suggests that these students had engaged more deeply with the material and learnt from the podcasts and formative assessments, which helped them to perform better in the examination. Obviously, it would not have been ethical to continue this study through to the formal module examination, which is why a ‘mock’ examination was provided during the course of the module. Students in the trial and control groups attended this examination opportunity in roughly equal numbers, which made the comparison robust and quantifiable. Students in the control group were given access to the podcasts following the mock examination, and all students were encouraged to make use of the resources for revision purposes.

The results of this study are supported by the recent study by McKinney et al. (2009), in which a similar increase in examination performance was observed when students were divided into groups that accessed either a podcast or a face-to-face lecture. The results are also supported by a ‘podcast-like’ study in which students performed better in assessments when they made significant use of PowerPoint resources with embedded audio (Cramer et al., 2007). The results of this study may be explained by a number of factors. Firstly, students in the trial group had continued access to extracts of lectures from the course, and could therefore revisit lecture material repeatedly, thus undertaking active experimentation and gaining a learning advantage (Kolb, 1984). Secondly, the formative assessments in the podcasts, which were provided to test students’ understanding of the lecture extract, should have reinforced their factual and conceptual knowledge (Biggs, 1999; Kolb, 1984), and the feedback provided should have corrected any factual errors. Finally, there may have been some practice effect for the students in the trial group, as they had access to more formative testing opportunities than the other students, and this could have influenced their overall performance in the ‘mock’ examination. However, given that most modules offer some form of formative assessment (commonly in the form of multiple choice questions), this is unlikely to have contributed significantly to the large difference in performance observed between the trial and control groups in this study.

The results of this study may help to inform future curriculum design principles as they provide evidence that providing enhanced learning resources during a module can significantly improve students’ performance in examinations. Importantly, producing podcasts of lectures and accompanying mobile assessments is not technically challenging for the teacher, and could be incorporated into course design relatively easily.

Student perceptions of blended learning

As participants in this research study, and as undergraduates pursuing a BSc degree, the students’ views on the study were extremely important, as were the data collected on their technological habits. As expected, the students valued the additional learning resources which accompanied this study, seeing clear learning opportunities in having repeated access to lecture material and formative testing opportunities. In line with other studies, it was found that students are digital consumers, using MP3 players for music (Clark et al., 2007; McKinney et al., 2009; Ralph et al., 2010). Surprisingly, the majority of students did not regularly subscribe to podcasts, although most were familiar with the concept of doing so. Whilst most commentators agree that this generation of undergraduate students are “digital natives” (Prensky, 2001) who expect technology to be integrated into their HE experience, we may be assuming too much about our current students’ expertise and familiarity with these tools. A consistent negative comment received from participants in this study was that they would have benefitted from more help with subscribing to a podcast, despite the fact that detailed instructions were provided, suggesting a lack of confidence or experience with this form of technology enhanced learning tool.

As with other studies, it was found that the majority of students accessed podcasts from their personal computer, despite receiving instructions on how to subscribe to podcasts on their MP3 players (Lane, 2006; Lee and Chan, 2007; Malan, 2007; Ralph et al., 2010; Sandars and Schroter, 2007). This is an interesting, though perhaps not surprising, observation, which suggests that the current generation of students are still traditional learners and have not yet adopted learning “on the move”. From comments received it is clear that most students listened to the podcasts whilst reviewing the lecture slides or research articles on a PC, thus limiting mobile learning opportunities. This suggests that future studies could usefully evaluate the impact of providing richer mobile learning resources (e.g. PowerPoint slides with audio/video narration) on students’ academic performance, to establish whether students value mobile learning opportunities when provided with a complete set of resources. Alternatively, podcasts of lectures could be produced with embedded images (such as PowerPoint slides) and supplied via RSS. The disadvantage of both of these solutions is the increased technological demand placed on the learner, who would need a compatible mobile device to access the resources. As technological advances continue, it is likely that mobile devices will deliver learning resources in a more accessible and useful format, so that students are able to study in a truly anytime, anywhere fashion. The challenge will be to ensure that the depth of the learning experience is maintained, and students may have to adapt their study skills to ensure deep learning occurs in these environments.

Conclusions

The results of this study provide quantitative, statistically significant evidence that providing students with enhanced learning resources and opportunities to undertake mobile formative assessments improves their academic performance. The design of the study allowed robust testing of the hypothesis without compromising fairness and equity for students studying a module contributing to their degree classification. This study provides an example of how mobile technology can be usefully integrated into the curriculum for higher education students engaged in a campus-based degree scheme, to improve both the student experience and academic performance.

Acknowledgements

This study was funded by the UK Centre for Bioscience, The Higher Education Academy, under a call for mobile assessment projects as part of the JISC/Higher Education Academy Distributed e-Learning (DEL) programme. The author gratefully acknowledges the students enrolled on the module for participating in the research alongside their studies.

References

  • Biggs, J. (1999) What the student does: teaching for enhanced learning. Higher Education Research & Development, 18, 57-75
  • Brittain, S., Glowacki, P., Van Ittersum, J. and Johnson, L. (2006) Podcasting Lectures. EDUCAUSE Quarterly, 29 (3), 24-31
  • Campbell, G. (2005) There is something in the air: Podcasting in Education. EDUCAUSE Review, 40, 6
  • Clark, S., Westcott, M. and Taylor, L. (2007) Using short podcasts to reinforce lectures. Symposium presentation at the National UniServe Conference, 2007, The University of Sydney, pp 22-27
  • Cramer, K.M., Collins, K.R., Snider, D. and Fawcett, G. (2007) The virtual lecture hall: Utilisation, effectiveness and student perceptions. British Journal of Educational Technology, 38, 106-115
  • Evans, C. (2008) The effectiveness of m-learning in the form of podcast revision lectures in higher education. Computers and Education, 50, 491-498
  • Falzon, B.G. and Brown, C.J. (2005) Web-Assisted First-Year Undergraduate Teaching in Engineering. Computer Applications in Engineering Education, 13 (2), 125-132
  • Grabe, M. and Christopherson, K. (2008) Optional student use of online lecture resources: resource preferences, performance and lecture attendance. Journal of Computer Assisted Learning, 24 (1), 1-10
  • Harris, H. and Park, S. (2008) Educational usages of podcasting. British Journal of Educational Technology, 39, 548-551
  • Kolb, D.A. (1984) Experiential Learning: experience as the source of learning and development. New Jersey: Prentice-Hall
  • Lane, C. (2006) UW Podcasting: Evaluation of Year One. University of Washington. Available at http://catalyst.washington.edu/research_development/papers/2006/podcasting_year1.pdf (accessed 15 March 2009)
  • Malan, D.J. (2007) Podcasting computer science E-1. In Proceedings of the 38th SIGCSE Technical Symposium on Computer Science Education, pp 389-393
  • McKinney, D., Dyck, J.L. and Luber, E.S. (2009) iTunes University and the classroom: Can podcasts replace Professors? Computers and Education, 52 (3), 617-623
  • Naismith, L., Lonsdale, P., Vavoula, G. and Sharples, M. (2004) Mobile Technology and Learning. Futurelab. Available at www.futurelab.org.uk/resources/publications-reports-articles/literature-reviews/Literature-Review203 (accessed 20 March 2009)
  • Prensky, M. (2001) Digital Natives, Digital Immigrants. On the Horizon, NCB University Press, 9 (5), 1-16
  • Ralph, J., Head, N. and Lightfoot, S. (2010) Pol-Casting: The Use of Podcasting in the Teaching and Learning of Politics and International Relations. European Political Science, 9, 13-24
  • Sandars, J. and Schroter, S. (2007) Web 2.0 technologies for undergraduate and postgraduate medical education: an online survey. Postgraduate Medical Journal, 83, 759-762
  • Zhang, P., Millard, D., Wills, G., Howard, Y., Faulds, S., Gilbert, L. and Sparks, D. (2008) A Mobile Toolkit for Placement Learning. In: The 8th IEEE International Conference on Advanced Learning Technologies (ICALT 2008), July 1st-July 5th, 2008, Santander, Cantabria, Spain, eds Díaz, R., Kinshuk, Aedo, I. and Mora, E., pp 92-96
