Research Article

Students’ motives for using online formative assessments when preparing for summative assessments

Pages e1644-e1650 | Published online: 20 Sep 2013

Abstract

Background: Formative assessments are intended to provide feedback on student performance in order to improve and accelerate learning. Several studies have indicated that students using online formative assessments (OFAs) obtain better results on their exams.

Aims: The present study aims to provide insight into students’ reasons for using or not using available OFAs.

Method: Three OFAs with feedback were available in a second-year undergraduate course in physiology for biomedical sciences students (N = 147). First, students received an open questionnaire about why they did (not) complete the first two OFAs. Based on these data, a closed questionnaire was developed and distributed among students. Exploratory factor analysis (EFA) was applied.

Results: The results indicate reasons why students do (not) use the OFAs. The EFA on reasons for using the OFAs indicated three factors, which were interpreted as collecting (1) feed up, (2) feed forward and (3) feed back information. The main reasons for not using the OFAs were lack of time and having completed the questions before.

Conclusions: Students’ reasons for using OFAs can be described in terms of collecting feed up, feed forward and feed back information, and students’ reasons for not using OFAs can be student-, teaching- or mode-related.

Introduction

Self-directed learning is one of the most promising types of learning advocated in medical education to prepare medical students for continuing professional education (Collins & Hammond 1987; Candy 1995; Davis et al. 1995; Murad et al. 2010). Self-directed learners are able to self-appraise their work and to seek, accept and use feedback from others in order to improve their performance (Sargeant et al. 2008). A tool that can be used to support students’ self-directed learning is an online formative assessment (OFA) in which students can voluntarily take part while preparing for a summative exam (Henley 2003; Kibble 2007; Gikandi et al. 2011; Wang 2011). Formative assessment refers to an assessment that is specifically intended to provide feedback on performance in order to improve and accelerate learning (Sadler 1998; Rushton 2005), as opposed to a summative assessment, which summarises the achievement of a student, often in the form of a grade (Sadler 1989). The results of a formative assessment have value not in terms of completing a course, but rather in providing students with feedback on the extent of their understanding of the course material. Thus, it can help them in planning their next learning activities (Carrillo-de-la-Pena et al. 2009).

Several studies have investigated the effects of OFAs (e.g. Buchanan 2000; Velan et al. 2002; Olsen & McDonald 2004; Kibble 2007; Dobson 2008; Velan et al. 2008; Angus & Watson 2009; Kibble et al. 2011; Bouwmeester et al. accepted), all indicating that students participating in an OFA obtained higher scores on the subsequent summative assessment. Several mechanisms have been proposed to explain this effect: increasing student engagement (Peat & Franklin 2002; Gikandi et al. 2011); increasing time on task (Cook et al. 2010); preventing procrastination (Peat & Franklin 2002); and providing formative and informative feedback (Velan et al. 2002; Carrillo-de-la-Pena et al. 2009; Gikandi et al. 2011). Interestingly, however, despite the positive effects of OFAs, not all students use them (Sly 1999; Kibble 2007). This raises the question of why students do or do not use OFAs and how their reasons relate to the proposed mechanisms; so far, little is known about this issue (Sinclair & Cleland 2007). Therefore, the present study seeks to better understand the effect of OFAs by asking students why they chose to use or not use OFAs, addressing the following research question: What reasons do students have for using online formative assessments?

Insight into students’ reasons for using OFAs can (a) contribute to our understanding of the mechanisms that explain the relation between participation in OFAs and higher scores on summative assessments, (b) establish guidelines for the design and implementation of OFAs in medical education that are aligned with students’ reasons for using them and (c) provide suggestions for making OFAs more appealing to students who do not use them spontaneously.

Methods

This study was conducted in a second-year undergraduate course in physiology for biomedical sciences students. In total, 147 students took the course, which was made up of three blocks addressing the respiratory, circulatory and urinary systems. At the end of each block, students took a summative multiple-choice test assessing their knowledge and skills with respect to the specific organ system. In addition, at the end of the whole course, students took a final summative test with essay questions covering the knowledge and skills of all three organ systems.

For each of the three organ systems, an OFA was available in the e-learning environment, to which all students had access. Students received feedback on their answers once they had finished the complete assessment. For a correct answer, the feedback confirmed and elaborated on why the answer was correct; for a wrong answer, the feedback indicated the mistake the student most likely made. The OFA questions were also available beneath the corresponding micro lectures, so that students could answer the test items right after watching them. If they did so, this was not considered using the complete OFA.

After the summative assessment of the second block, using tracking data from the online learning environment, we divided the students into four groups: (a) students who completed both OFA 1 (respiratory system) and OFA 2 (circulatory system) (N = 88), (b) students who completed only OFA 1 (N = 15), (c) students who completed only OFA 2 (N = 20) and (d) students who completed neither OFA 1 nor OFA 2 (N = 24). Each group received a specific email with a link to an online questionnaire with open questions about the formative assessments, so that the questions could be adapted to whether or not they had completed one or both of the available OFAs (see Table 1). Of the four groups, 43, 6, 7 and 4 students, respectively, completed the questionnaire, giving an overall response rate of 40.8%. More specifically, the response rates were 49% for group (a), 40% for group (b), 35% for group (c) and 17% for group (d), indicating a trend of students who used the OFAs being more likely to fill out the questionnaire. Two independent researchers deductively constructed a coding scheme for the short answers. The two coding schemes were compared and differences were discussed until consensus was reached (Creswell & Miller 2000). This led to the final coding scheme presented in Table 2. All student answers were coded (Cohen's kappa for reasons for completion: 0.76; Cohen's kappa for reasons for non-completion: 0.78).

Table 1.  Questions of the first student questionnaire

Table 2.  Final coding scheme for reasons for completing and not completing OFA
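
For readers who wish to replicate the agreement analysis, the following minimal sketch (not the authors’ actual pipeline) shows how Cohen's kappa between two independent coders could be computed with scikit-learn; the category labels are hypothetical stand-ins for the codes in Table 2.

from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by two independent raters to the same answers
rater_a = ["feed_up", "rehearsal", "gap_check", "feed_up", "no_time", "rehearsal"]
rater_b = ["feed_up", "rehearsal", "gap_check", "rehearsal", "no_time", "rehearsal"]

# Kappa corrects raw percentage agreement for chance agreement; values around
# 0.76-0.78 (as reported above) indicate substantial agreement.
print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")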

Based on the final coding scheme, a questionnaire was developed containing 32 questions that had to be answered on a five-point scale. In order to achieve a higher response rate than that of the online questionnaire with open questions, this questionnaire was administered on paper right after the students took the summative assessment of the third block (urinary system), so that the students could fill out the questionnaire after completing the summative assessment and hand in both at the same time. In total, 134 students returned the completed questionnaire, a response rate of 91.1%. These data were analysed by inspecting the means and standard deviations and by factor analyses, in order to explore whether latent factors underlying students’ reasons for completing and not completing the OFAs could be found. For the factor analyses, following the advice of Costello and Osborne (2005), we used maximum likelihood estimation in combination with a direct oblimin rotation. The number of factors was determined by inspecting the scree plot.
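
For illustration, an analysis of this kind could be run in Python with the factor_analyzer package, as in the sketch below. This is an assumption-laden stand-in (the response matrix is simulated, and the authors’ actual software is not specified), not the study's own code.

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(0)
# Hypothetical: 134 respondents x 17 completion-reason items on a 5-point scale
items = pd.DataFrame(rng.integers(1, 6, size=(134, 17)),
                     columns=[f"item_{i + 1}" for i in range(17)])

# Scree inspection: eigenvalues of the unrotated solution
fa = FactorAnalyzer(rotation=None)
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()
print("Eigenvalues:", np.round(eigenvalues, 2))  # look for the bend in the scree plot

# Chosen solution: maximum likelihood estimation with direct oblimin rotation
fa3 = FactorAnalyzer(n_factors=3, method="ml", rotation="oblimin")
fa3.fit(items)
loadings = pd.DataFrame(fa3.loadings_, index=items.columns, columns=["F1", "F2", "F3"])
print(loadings.round(2))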

Results

Reasons for completing the OFAs

Most variables violated the assumption of a normal distribution, and all variables had a median of 4. Therefore, Table 3 also presents the means and standard deviations for all variables. The fact that the median for all variables is 4 indicates that all reasons play a role for students. The first two reasons (“checking whether I studied sufficiently” and “repeating and rehearsing the course material”) proved the most important reasons for students to complete the OFA, followed by “discovering gaps in my knowledge and skills”. The least important for students were “understanding what elements of the course material are important” and “having a positive/negative experience with the previous formative/summative assessment”.

Table 3.  Means and standard deviations for reasons for completing OFAs and their added value (N = 104)

A factor analysis was run on all items concerning reasons for completing OFAs. The scree plot showed a clear bend at the fourth factor; we therefore chose a solution with three factors, which explained 51% of the total variance, with eigenvalues of 4.56, 3.96 and 2.98, respectively. For interpretation of the factors, we first determined for each item the highest factor loading (boldface values in Table 4). Factor loadings smaller than 0.40 are presented in grey and cross loadings are presented in black (see Note 1).

Table 4.  Factor loadings of the exploratory factor analysis on items concerning reasons for completing OFAs and the added value of OFAs, with ML estimation and direct oblimin rotation
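
The interpretation step described above (marking each item's highest loading and flagging cross loadings at the 0.40 cut-off) can be expressed as a small helper. This is a sketch under the assumption that loadings is an items-by-factors pandas DataFrame, such as the one produced in the previous sketch.

import pandas as pd

def summarise_loadings(loadings: pd.DataFrame, cutoff: float = 0.40) -> pd.DataFrame:
    abs_loadings = loadings.abs()
    return pd.DataFrame({
        # Factor on which each item loads highest (boldface in Table 4)
        "primary_factor": abs_loadings.idxmax(axis=1),
        "primary_loading": abs_loadings.max(axis=1).round(2),
        # An item cross-loads if it reaches the cut-off on more than one factor
        "cross_loading": (abs_loadings >= cutoff).sum(axis=1) > 1,
    })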

The items with the highest factor loadings on the first factor were the three items concerning understanding what elements of the course material are important and two items concerning gaining insight into the form and content of the summative assessment. Based on these high-loading items, we interpreted this factor as using the OFAs to collect feed up information: the OFAs are used to see what the format of the assessment will be and what course material will possibly be addressed. In other words, students gain information about what the goals are and what is expected of them.

Second, the items with the highest factor loadings on factor 2 were the three items concerning repeating and rehearsing the course material, the three items about discovering gaps in knowledge and skills and, surprisingly, one item about gaining insight into the form and content of the summative assessment. Based on the first six high-loading items, we interpreted this factor as using the OFAs to collect feed forward information: the OFAs are also used to learn the course material (move forward) and to find out what elements of the course material need to be studied more thoroughly. That one of the feed up items loaded highest on this factor, with a second feed up item cross-loading on it, is not that strange, as feed up and feed forward are not independent elements: feed forward concerns how to move closer to the goal.

Third, the items with the highest factor loadings on factor 3 were the three items concerning checking whether a student studied sufficiently and the two items concerning previous experiences with the formative or summative assessment. Based on these high-loading items, this factor is interpreted as students collecting feed back information. First, previous experiences already indicate how a student is doing and are thus used in deciding whether using the OFA is needed; second, when the student has completed the OFA, the outcome again shows the student where (s)he stands in relation to the goal. In total, 8 of the 17 items cross-loaded on a second factor, indicating that these items were not strong in distinguishing the different factors.

Reasons for not completing the OFAs

Table 5 provides the medians of the different reasons for not completing the OFAs that were found in the qualitative measurement. The table shows that, in general, most reasons have reasonably low medians, indicating that not all reasons play a role for students. Given the small N for the items in the second column of Table 5, only the items from the first column were used in the factor analyses. This yielded a solution with only one factor, indicating that no clear underlying factor structure could be found for these items.

Table 5.  Medians, means and standard deviations for reasons for not completing OFAs (N = 29; N = 9)

Discussion

This article explored students’ reasons for using OFAs when preparing for a summative assessment in a physiology course for biomedical sciences. Based on qualitative data gathered through open questions, a category system was developed that described the different motives students gave for making use of the OFAs or not. In total, six different reasons for making use of the OFAs were found, and seven reasons for not using them. Factor analyses on these seven reasons did not result in a clear factor structure for not completing the OFAs. For the six reasons for completing the OFAs, three factors were found. Based on the work of Sadler (1989) and Hattie and Timperley (2007), the factors were interpreted as describing the underlying motives of collecting three types of information: (a) feed up, (b) feed back and (c) feed forward. Sadler (1989) argued that in order for students to use feedback for self-regulation in their learning process, they should seek information about (a) what the goals, standards and criteria are, (b) how their current performance relates to these criteria and (c) what they can do to close the gap between their current performance and the standards. Hattie and Timperley (2007) named these three elements feed up, feed back and feed forward. Our findings thus indicate that the feedback function is an important mechanism explaining the positive effects of OFAs on summative exam scores, rather than merely increasing time on task (Cook et al. 2010) or preventing procrastination (Peat & Franklin 2002).

When we compare the three functions of OFAs to those reported in other studies investigating OFAs, several consistencies appear. These studies explored the student experience of OFAs, but did not principally aim to gain insight into their specific functions. For instance, in line with the notion of feed up, Sly (1999) argued that an important function of formative assessments is that they give students an idea about what is expected of them in the summative assessment in terms of both content and form. Also, Henley (2003), based on anecdotal evidence from informal student feedback, described two ways students could use the OFA. In the beginning, they used it after they had studied the course material, to check whether they understood it properly (i.e. feed back). Later in the course they used the OFA to guide their learning, i.e. before studying the course material (i.e. feed forward). Unfortunately, the design of our study does not allow us to draw such sequential conclusions as to how students’ use of OFAs changed during the course. Still, in line with Henley's (2003) finding, Walker et al. (2008) also found that students used e-assessments to identify areas of strength and weakness in order to study the course material further (i.e. feed forward). Thus, the three factors we found can also explain the preliminary findings of other studies, suggesting that our findings are not specific to our sample only.

Concerning the present study's findings about reasons for non-completion of the OFAs, it is important to bear in mind that some students indicated that they had already used the formative questions in another part of the electronic learning environment. In other words, these students did use the formative questions, but not in the form of an OFA. Therefore, these findings should not be interpreted as indicating that the students were uninterested in practising questions at all. That said, our findings are now discussed in relation to findings on non-attendance in medical education, as this might indicate the extent to which our findings are specific to the use of OFAs or can possibly be generalised to students not being willing to participate in learning activities in general. For instance, Mattick et al. (2007) found that students reported both student-related and teaching-related factors for not attending lectures. In the present study, “finding the OFA insufficiently representative of the summative assessment” could be considered a teaching-related factor, and the other reasons (with the exception of technical problems) can be considered student-related, as these mainly describe students’ learning preferences. This is also in line with the findings of Billings-Gagliardi and Mazor (2007), who found that student decisions to attend lectures were, among other things, based on personal learning preferences and learning needs at a particular time. Finally, “experiencing technical problems” is neither a teaching-related nor a student-related factor and is therefore interpreted as a mode-related factor. We thus conclude that students’ reasons for not using OFAs can be student-related and/or teaching-related, but can also be mode-related, which might be specific to OFAs.

Implications

In general, the findings of this study indicate that students used the OFAs for acquiring information about what is expected of them on the summative assessment, in terms of both form and content (feed up), to what extent they have already mastered the course material (feed back), and which topics they still need to study further (feed forward). In other words, they confirm the notion of Gibbs and Simpson (2004) that not only summative but also formative assessment strongly drives learning. For the use of OFAs in practice, this indicates the importance of constructive alignment of formative and summative assessments, as students use information from the formative assessment to self-regulate their learning activities in preparing for the summative assessment: “we have first to be clear about what we want students to learn, and then teach and assess accordingly in an aligned system of instruction” (Biggs 1999, p. 8). It is thus important that formative assessments clearly reflect the learning objectives, and therefore the content, level and types of questions of the summative assessment, in order to give students the opportunity to properly use the acquired information in preparing for the summative assessment.

With respect to the reasons for not completing the OFAs, we think some might be overcome and some might not. As addressed above, the formative assessment not being representative of the summative assessment is a reason that might be overcome through strong constructive alignment. Likewise, technical problems when using the OFAs can be targeted and avoided. However, the student-related reasons indicate that OFAs might not suit the preferred learning activities of all students. Therefore, it is important to design rich (online) learning environments so that students can study course material and prepare for assessments in a way that matches their preferences.

Limitations

The design of this study has some drawbacks that should be kept in mind when drawing conclusions from these findings, and they suggest directions for future studies. First, the present study was conducted during one course only, and study strategies can differ considerably between courses (e.g. Kadri et al. 2011). Therefore, these reasons should be viewed as course-specific. Concerning generalisability, future research is needed to validate the three functions that were found, in order to see whether these motives also play a role in other courses with OFAs specifically, or in other non-compulsory study activities in general.

Second, in the present study only a small number of students had not completed the OFAs, so we had only minimal data about them. In future studies, it would be interesting to collect more data from such students, so that general motives or functions underlying the separate items could also be investigated for the reasons for not completing the OFAs. Also, in the present study, students could use the questions from the OFAs in combination with viewing the micro lectures, which makes the group of students not completing the OFAs a rather heterogeneous group. For research purposes it would have been better if the OFAs had been the only way to access the test questions. Finally, future studies might investigate whether different groups of students can be described based on their scores on the different motives, using for instance cluster analyses, as sketched below. This would provide insight into how the three functions are related within students.
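
Such a cluster analysis might look like the following sketch; the motive scores are simulated and this analysis was not performed in the present study.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical: one feed up, feed forward and feed back score per student
scores = rng.normal(size=(134, 3))

# Standardise the scores, then look for candidate "student profiles"
kmeans = KMeans(n_clusters=3, n_init=10, random_state=1)
labels = kmeans.fit_predict(StandardScaler().fit_transform(scores))
print(np.bincount(labels))  # cluster sizes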

Conclusion

The results of this article revealed different student reasons for completing or not completing OFAs. More specifically, although no clear factor structure was found for the reasons for not completing, it is concluded that these reasons can be student-related, teaching-related and/or mode-related. As regards the reasons for completing, we found three underlying types of information that students acquire by using the OFAs: (a) feed up (what is expected of me in the summative assessment in terms of content and form?), (b) feed back (to what extent have I already mastered the course material?) and (c) feed forward (do I need to study more and, if so, what do I need to study?). These findings can be related to the work of both Sadler (1989) and Hattie and Timperley (2007) and confirm preliminary indications about the use of OFAs (Sly 1999; Henley 2003; Walker et al. 2008) and findings about motives for lecture attendance (Billings-Gagliardi & Mazor 2007; Mattick et al. 2007). The results of the present study yield important theoretical insights into why students do or do not use OFAs and indicate that, for practice, attention should be paid to constructive alignment of instruction, formative assessments and summative assessments (Biggs 1999; Gibbs & Simpson 2004).

Acknowledgements

The authors would like to thank Joop Hox and Urbano Lorenzo Seva for their valuable help and advice with respect to the data analyses.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Notes

1. As the data violated the assumption of a normal distribution, we also ran an exploratory factor analysis for categorical data, using the software package FACTOR. These analyses showed an identical factor solution, i.e. all variables loaded highest on the same factors as in the original analyses. Therefore, we report the results of the original factor analyses.

References

  • Angus SD, Watson J. Does regular online testing enhance student learning in the numerical sciences? Robust evidence from a large data set. Br J Educ Technol 2009; 40: 255–272
  • Biggs J. What the student does: Teaching for enhanced learning. Higher Educ Res Dev 1999; 18: 57–75
  • Billings-Gagliardi S, Mazor KM. Student decisions about lecture attendance: Do electronic course materials matter? Acad Med 2007; 82: S73–S76
  • Bouwmeester RAM, de Kleijn RAM, Freriksen AWM, van Emst MG, Veeneklaas RJ, van Hoeij MJW, Spinder M, Ritzen MJ, ten Cate OTJ, van Rijen HVM. Online formative tests linked to microlectures improve academic achievement. Med Teacher, epub ahead of print. DOI: 10.3109/0142159X.2013.818633
  • Buchanan T. The efficacy of a World-Wide Web mediated formative assessment. J Comp Assist Learn 2000; 16: 193–200
  • Candy PC. Physician teach thyself: The place of self-directed learning in continuing medical education. J Contin Educ Health Professions 1995; 15: 80–90
  • Carrillo-de-la-Pena MT, Bailles E, Caseras X, Martinez I, Ortet G, Perez J. Formative assessment and academic achievement in pre-graduate students of health sciences. Adv Health Sci Educ 2009; 14: 61–67
  • Collins R, Hammond M. Self-directed learning to educate medical educators. Part 2: Why do we use self-directed learning? Med Teacher 1987; 9: 425–432
  • Cook DA, Levinson AJ, Garside S. Time and learning efficiency in Internet-based learning: A systematic review and meta-analysis. Adv Health Sci Educ 2010; 15: 755–770
  • Costello AB, Osborne J. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract Assess Res Eval 2005; 5: 95–105
  • Creswell JW, Miller DL. Determining validity in qualitative inquiry. Theor Pract 2000; 39: 124–131
  • Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: A systematic review of the effect of continuing medical education strategies. J Am Med Assoc 1995; 274: 700–705
  • Dobson JL. The use of formative online quizzes to enhance class preparation and scores on summative exams. Adv Physiol Educ 2008; 32: 297–302
  • Gibbs G, Simpson C. Conditions under which assessment supports students’ learning. Learn Teach High Educ 2004; 1: 3–31
  • Gikandi JW, Morrow D, Davis NE. Online formative assessment in higher education: A review of the literature. Comput Educ 2011; 57: 2333–2351
  • Hattie J, Timperley H. The power of feedback. Rev Educ Res 2007; 77: 81–112
  • Henley DC. Use of Web-based formative assessment to support student learning in a metabolism/nutrition unit. Eur J Dent Educ 2003; 7: 116–122
  • Kadri HMF, Al-Moamary MS, Elzubair M, Magzoub ME, AlMutairi A, Roberts C, van der Vleuten C. Exploring factors affecting undergraduate medical students’ study strategies in the clinical years: A qualitative study. Adv Health Sci Educ 2011; 16: 553–567
  • Kibble J. Use of unsupervised online quizzes as formative assessment in a medical physiology course: Effects of incentives on student participation and performance. Adv Physiol Educ 2007; 31: 253–260
  • Kibble JD, Johnson TR, Khalil MK, Nelson LD, Riggs GH, Borrero JL, Payer AF. Insights gained from the analysis of performance and participation in online formative assessment. Teach Learn Med 2011; 23: 125–129
  • Mattick K, Crocker G, Bligh J. Medical student attendance at non-compulsory lectures. Adv Health Sci Educ 2007; 12: 201–210
  • Murad MH, Coto-Yglesias F, Varkey P, Prokop L, Murad AL. The effectiveness of self-directed learning in health professions education: A systematic review. Med Educ 2010; 44: 1057–1068
  • Olsen BL, McDonald JL. Influence of online formative assessment upon student learning in biomedical science courses. J Dent Educ 2004; 68: 656–659
  • Peat M, Franklin S. Supporting student learning: The use of computer-based formative assessment modules. Br J Educ Technol 2002; 33: 515–523
  • Rushton A. Formative assessment: A key to deep learning? Med Teacher 2005; 27: 509–513
  • Sadler DR. Formative assessment and the design of instructional systems. Instruct Sci 1989; 18: 119–144
  • Sadler DR. Formative assessment: Revisiting the territory. Assess Educ: Principles, Pol Pract 1998; 5: 77–84
  • Sargeant J, Mann K, Sinclair D, van der Vleuten C, Metsemakers J. Understanding the influence of emotions and reflection upon multi-source feedback acceptance and use. Adv Health Sci Educ 2008; 13: 275–288
  • Sinclair HK, Cleland JA. Undergraduate medical students: Who seeks formative feedback? Med Educ 2007; 41: 580–582
  • Sly L. Practice tests as formative assessment improve student performance on computer-managed learning assessments. Assess Eval Higher Educ 1999; 24: 339–343
  • Velan GM, Kumar RK, Dziegielewski M, Wakefield D. Web-based self-assessments in pathology with Questionmark Perception. Pathology 2002; 34: 282–284
  • Velan GM, Jones P, McNeil HP, Kumar RK. Integrated online formative assessments in the biomedical sciences for medical students: Benefits for learning. BMC Med Educ 2008; 8: 1–11
  • Walker DJ, Topping K, Rodrigues S. Student reflections on formative e-assessment: Expectations and perceptions. Learn Media Technol 2008; 33: 221–234
  • Wang TH. Developing web-based assessment strategies for facilitating junior high school students to perform self-regulated learning in an e-Learning environment. Comput Educ 2011; 57: 1801–1812

Glossary

Self-directed learning: A process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes.

Knowles M. Self-directed learning: A guide for learners and teachers. New York, NY: Association Press; 1975. p 18.

Self-regulation: Self-generated thoughts, feelings, and actions that are planned and cyclically adapted to the attainment of personal goals.

Zimmerman BJ. 2000. Attaining self-regulation: A social-cognitive perspective. In: Boekaerts M, Pintrich P, Zeidner M, editors. Handbook of self-regulation. Orlando, FL: Academic Press. pp 13–39.
