The Impact of Assignments and Quizzes on Exam Grades: A Difference-in-Difference Approach

Abstract

Using data on students at a Canadian business school, this article studies the effect of homework assignments and in-class quizzes on exam performance. Based on a difference-in-difference approach, assignments had a statistically discernible positive impact on exam grades for the overall sample. When broken down by gender, assignments had a positive impact on exam grades for males but no statistically discernible impact for females. Quizzes had no statistically discernible impact overall or for either gender. When broken down by student residency status, both assignments and quizzes positively impacted exam grades for international students, but there was no statistically discernible impact of assignments or quizzes for domestic students.

1 Introduction

The objective of this study is to examine the effect of homework assignments and in-class quizzes on students’ exam performance in an undergraduate business program. Homework assignments and quizzes are common assessment techniques used by instructors in undergraduate programs. Both involve costs, such as instructors’ time spent preparing and marking these assessment instruments and students’ time spent completing the assignments or answering quiz questions, that could have been spent on other performance-enhancing approaches. Spending time on homework or quizzes is therefore justified only if the benefits of these assessment methods outweigh the costs involved. One such benefit is enhanced student learning, and exam grades provide a way to measure student learning.

A substantial number of studies, primarily using data from the United States, have examined the effects of homework assignments and quizzes on exam performance. Here, we will review relatively recent studies on this issue. Using data from the US National Educational Longitudinal Study of 1988 (NELS:88), Eren and Henderson (Citation2008) assessed the role of homework in academic achievement. The study found that relative to more standard spending-related measures such as class size, extra homework had a larger and more discernible (significant) impact on test scores. However, additional homework was found to be most effective for high and low achievers. Using the data from 31 graduate students, Rehfeldt et al. (Citation2010) explored the effects of points versus no points on homework assignments submitted and quiz performance. The study found that the students were more likely to submit homework assignments during points weeks. However, it found no discernible impact on quiz scores. Latif and Miles (Citation2011) used data from a small Canadian university to examine the impact of graded assignments on the exam performance in economics courses. The study found that graded homework had a discernibly positive effect on academic performance. In addition, graded assignments were found to have their strongest effects among male students and among students with foreign (non-Canadian) backgrounds. Trost and Salehi-Isfahani (Citation2012) used data from undergraduate students in introductory economics courses to examine the effect of homework on exam performance. This experimental study found that completion of the assigned homework was positively (if not always discernibly) correlated with higher scores on the midterms. However, this result did not continue to hold for the final exam—indicating “decay” in the homework effect over the course of the semester. Grodner and Rupp (Citation2013) conducted a field experiment that involved 423 Principles of Microeconomics students. 
The study found considerable evidence that doing homework was beneficial to student learning. This result was particularly true for the students who initially performed poorly in the course. Using the data from 71 students in an undergraduate educational psychology course at a large southeastern state university, Galyon et al. (Citation2015) found that setting randomized reward contingencies, specifically for accuracy of homework, produced both discernibly higher accuracy and longer length of homework answers than a reward contingency based on completion of homework. The study also found that the accuracy contingency was associated with modest but discernible gains in the adjusted exam scores. Karadimitriou (Citation2016) used a sample of undergraduate students from a Greek university to explore the impact of a collaborative graded home assignment on performance on the final examination for the course. The results showed that the students performed very well on the part of the test that was relevant to their home assignment. Archer and Olson (Citation2018) used the data from a sample of online students in an entry-level economics course to examine the relationship between homework support and exam scores. The results show that the practice provided by a web-based homework management system was correlated with increased exam scores. Using data from undergraduates enrolled in four sections of a Psychology of Learning course at West Chester University of Pennsylvania, Azorlosa and Renner (Citation2006) examined whether announced quizzes improved exam performance. The study found that quizzes had some desirable effects, such as increased attendance and self-reporting of increased preparation for exams. The quiz questions were in the multiple-choice format, whereas the exams consisted of essays, and the study found no evidence of a positive impact of quizzes on exam performance. 
Using data from undergraduates enrolled in four sections of a Psychology of Learning course at the same university, Azorlosa (Citation2011) examined the effect of announced quizzes on exam performance when the quizzes and the exams had the same format, namely multiple-choice questions. The study concluded that introducing quizzes into the classroom produced several desirable effects, including increased attendance, more evenly spaced studying (less cramming), and improved exam performance. Padilla-Walker (Citation2006) used the data from undergraduate students enrolled in an advanced-level psychology course at a Midwestern state university to determine the impact of daily extra-credit quizzes on exam performance. The study found that students who did well on the extra credit quizzes also demonstrated improved exam performance. Johnson and Kiviniemi (Citation2009) studied the effectiveness of weekly quizzes based on required reading for the course and found that completion of the quizzes was related to improved exam performance. Galizzi (Citation2010) used data from undergraduate students taking economics courses at a US public university to examine the impact of online quizzes on academic performance. This study was based on one section of a Labor Economics course and two sections of a Principles of Microeconomics course. The Labor Economics section had 17 students, and the first Principles of Microeconomics section had 41 students, while the second section of Principles of Microeconomics had 33 students. The study found that participation in online quiz assignments made no discernible difference in the students’ ability to score higher grades on written exams.

In sum, the literature points to a consensus that homework assignments improve academic performance. However, evidence regarding the impact of quizzes on exam grades is mixed.

This study makes the following contributions to the literature:

  1. The majority of the studies reviewed did not control for the endogeneity arising from the presence of unobserved factors that are correlated with both exam grades and homework assignments or quizzes. Unobserved factors may include motivation, IQ, etc. For example, highly motivated students are likely to do homework regularly, and they are also likely to work hard to achieve better grades. Estimations without taking such endogeneity into account are likely to produce biased results. This study’s contribution to the literature is the use of the difference-in-difference (DID) method to control for unobserved individual-specific time-invariant factors.

  2. To the best of the authors’ knowledge, only one study so far has examined the impact of assignments on the exam performance of Canadian students (Latif and Miles Citation2011), and no study has examined the effect of quizzes on exam grades. Thus, this study is expected to make an important addition to the Canadian literature on this topic.

2 Methodology

2.1 Conceptual Framework

Quizzes benefit student learning in a number of ways. They help students to retain material for a longer period of time; consequently, students do better on exams (Johnson and Kiviniemi Citation2009). Also, quizzes provide students with an incentive to attend classes regularly (Clump, Bauer, and Whiteleather Citation2003). Furthermore, graded quizzes force students to study to prepare for the quizzes. Working on homework assignments is hypothesized to lead to better understanding of the course material and increased retention of factual knowledge (Cooper, Robinson, and Patall Citation2006). However, the students might not benefit from working on assignments if they copy the assignment answers from other students or otherwise do not do the assignments themselves.

2.2 Data

The study is based on data collected from three sections of an Introductory Statistics course (Econ 2320) in the School of Business and Economics.Footnote1,Footnote2 One of the authors taught these sections in the fall of 2019. The instructor used the same textbook and the same course curriculum for all three sections.Footnote3 However, the evaluation systems in the three sections differed from one another. In the first section, there were no assignments or quizzes, and the evaluation was based on two midterms and a final exam. In the second section, the evaluation was based on homework assignments, two midterms, and a final exam. The third section’s evaluation consisted of quizzes, two midterms, and a final exam. The sample sizes for the first, second, and third sections were 30, 27, and 34, respectively.

2.3 Empirical Method

The study uses the DID method to examine the impact of assignments/quizzes on students’ midterm grades. DID is a quasi-experimental design that can be used when two periods of data are available for treatment and control groups. To measure the treatment effect, the DID estimator looks at the difference between the average outcomes in the control and treatment groups, before and after the treatment (Abadie Citation2005). To clarify how this method works, assume that we have data for two periods, 0 and 1, and that one section, Section A, is administered the treatment (assignments/quizzes) between periods 0 and 1, while the second section, Section B, does not receive the treatment. Let the change in the average outcome for Section A be $\bar{O}_{A1} - \bar{O}_{A0}$, and let the change in the average outcome for Section B be $\bar{O}_{B1} - \bar{O}_{B0}$.

With the assumption that $\bar{O}_{B1} - \bar{O}_{B0}$ provides a good estimate of the change in outcomes that Section A would have experienced had it not received the treatment, the treatment effect can be defined as follows:

$$\beta = (\bar{O}_{A1} - \bar{O}_{A0}) - (\bar{O}_{B1} - \bar{O}_{B0}).$$
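
In code, this estimator is just a difference of group-mean differences. The following is a minimal sketch using hypothetical (simulated) grade data; the section sizes, means, and standard deviations are illustrative assumptions, not the study’s data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical midterm grades (out of 100). Section A receives the
# treatment (assignments/quizzes) between periods 0 and 1; Section B does not.
grades = {
    ("A", 0): rng.normal(70, 8, 30),   # Section A, midterm 1
    ("A", 1): rng.normal(76, 8, 30),   # Section A, midterm 2
    ("B", 0): rng.normal(71, 8, 28),   # Section B, midterm 1
    ("B", 1): rng.normal(72, 8, 28),   # Section B, midterm 2
}

def did_estimate(grades):
    """beta = (mean_A1 - mean_A0) - (mean_B1 - mean_B0)."""
    change_a = grades[("A", 1)].mean() - grades[("A", 0)].mean()
    change_b = grades[("B", 1)].mean() - grades[("B", 0)].mean()
    return change_a - change_b

print(f"Estimated treatment effect: {did_estimate(grades):.2f} grade points")
```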

This study estimates the treatment effect by running the following regression:

$$O_{it} = \beta_0 + \beta_1 T + \beta_2 A + \beta_3 (A \times T) + \beta_4 X_{it} + \varepsilon_{it}, \qquad (1)$$

where $O_{it}$ is the educational outcome represented by the midterm grades for student $i$ in period $t$, $A$ is a dummy variable indicating the section to which individual $i$ belongs ($A = 1$ if $i$ is in the section that received the treatment, and $A = 0$ otherwise), and $T$ is a dummy variable for the time period ($T = 0$ for period 0, and $T = 1$ for period 1). The estimation also controls for the following covariates ($X$): gender, student residency status (i.e., domestic vs. international), and GPA.

The DID estimator is the coefficient $\beta_3$, which captures the interaction between the treatment dummy variable ($A$) and the period dummy variable ($T$).
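
The equivalence between the interaction coefficient β3 and the difference of group-mean differences can be checked numerically. The sketch below uses simulated data (the section sizes, effect size, and noise level are illustrative assumptions) and fits the saturated model without covariates, in which the OLS coefficient on A × T reproduces the mean-difference DID exactly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stacked data: one row per student-exam observation,
# ordered as [A period 0, A period 1, B period 0, B period 1].
n_a, n_b = 30, 28                        # treated / control section sizes (illustrative)
A = np.r_[np.ones(2 * n_a), np.zeros(2 * n_b)]                        # section dummy
T = np.r_[np.zeros(n_a), np.ones(n_a), np.zeros(n_b), np.ones(n_b)]  # period dummy
# Data-generating process with a true treatment effect of 5 grade points.
y = 70 + 1.0 * T + 0.5 * A + 5.0 * (A * T) + rng.normal(0, 8, 2 * (n_a + n_b))

# OLS for O = b0 + b1*T + b2*A + b3*(A*T); b3 is the DID estimator.
X = np.column_stack([np.ones_like(y), T, A, A * T])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# In this saturated model (no covariates), b3 coincides exactly with the
# difference of group-mean differences.
did_means = ((y[(A == 1) & (T == 1)].mean() - y[(A == 1) & (T == 0)].mean())
             - (y[(A == 0) & (T == 1)].mean() - y[(A == 0) & (T == 0)].mean()))
print(f"b3 = {beta[3]:.3f}, mean-difference DID = {did_means:.3f}")
```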

Between the start of the course and midterm 1, none of the three sections had any graded assignments or quizzes. Between midterms 1 and 2, there were no graded assignments or quizzes in Section 1, but homework assignments were given in Section 2. During that time, in every class period (except for the first one after midterm 1), the students were assigned homework based on the material covered during the preceding class period, which they had to hand in for grading. There were a total of 8 graded homework assignments, each consisting of short-answer questions and designed to take about 30 min to complete. Between midterms 1 and 2, quizzes were introduced in Section 3. During that time, the students spent 10–15 min during each class period (except for the first one after midterm 1) taking quizzes based on the material covered during the preceding period. There were a total of 8 quizzes, each consisting of multiple-choice questions. The instructor used the question bank’s difficulty rating to ensure that the level of difficulty of each of the two midterm exams was the same across all three sections of the course.

3 Results of the Study

The descriptive statistics are shown in Table 1, and descriptive statistics on GPA by gender and residency status are shown in Table 2. The results in Table 2 suggest that, irrespective of section, males and international students have lower GPAs than females and domestic students, respectively. Table 1 shows that the students in Section 1 had the highest GPA and the students in Section 2 the lowest. The same is true of the highest and lowest average grades on midterm 1. The descriptive statistics also show that the students in Section 2 had the highest average midterm 2 grade, and the students in Section 1 had the lowest. However, these differences are not statistically discernible (significant) at the 5% level of discernibility (significance). Table 1 also shows that after the introduction of homework assignments and quizzes in Sections 2 and 3, respectively, the students in Section 1 did worse on midterm 2 than the students in Sections 2 and 3. The difference between Sections 1 and 2 was statistically discernible, but the difference between Sections 1 and 3 was not. The question is whether the students in Sections 2 and 3 performed better than the students in Section 1 on midterm 2 because of the introduction of assignments and quizzes. To analyze this issue, the study utilized the DID method.

Table 1 Descriptive statistics.

Table 2 Descriptive statistics: GPA by gender and residency.

Table 3 shows the results of the DID estimation, with the introduction of assignments being the treatment. The DID estimator suggests that the introduction of assignments had a statistically discernible positive impact on exam grades. The results also show that GPA had a statistically discernible positive impact on exam grades, while being an international student had a statistically discernible negative impact. Gender, on the other hand, was found not to have a statistically discernible impact on exam grades.

Table 3 DID estimation with assignments as treatment.

The results of the DID estimation with the introduction of quizzes as treatment are shown in Table 4. The sample for that analysis included students from Sections 1 and 3. The DID estimator suggests that the introduction of quizzes had no statistically discernible effect on exam grades. In this model, only GPA had a statistically discernible positive impact on exam grades.

Table 4 DID estimation with quizzes as treatment.

3.1 Subgroup Analyses

The study conducted subgroup analyses based on gender and residency status. The study adopted subgroup analysis instead of using an interaction term, since there are gender differences due to unobserved factors such as academic motivation (Vecchione, Alessandri, and Marsicano Citation2014). Furthermore, there are differences between domestic and international students with regard to cultural and educational background. Table 5 shows the results, by gender, of the DID estimation with the introduction of assignments being the treatment. They suggest that the introduction of assignments had a statistically discernible positive impact on exam grades in only the male sample. GPA had a statistically discernible positive impact on exam grades in both the male and female samples. Table 6 shows the results of the DID estimations for the male and female samples with the introduction of quizzes as treatment. They suggest that the introduction of quizzes did not have a statistically discernible impact on exam grades in either sample. However, GPA positively impacted exam grades in both samples.

Table 5 DID estimation with assignments as treatment, by gender.

Table 6 DID estimation with quizzes as treatment, by gender.

Table 7 shows the results, by student residency status (domestic vs. international), of the DID estimations with the introduction of assignments as treatment. They suggest that the introduction of assignments had a statistically discernible positive effect on exam grades in the international sample but not in the domestic sample. They also suggest that GPA positively impacted exam grades in both samples. The results of the DID estimations for the domestic and international student samples with the introduction of quizzes as treatment are shown in Table 8. They suggest that the introduction of quizzes had a statistically discernible positive impact on exam grades in the international student sample, while it had no discernible impact in the domestic student sample. As in the other results, GPA had a statistically discernible positive effect on exam grades in both the domestic and international student samples.

Table 7 DID estimation with assignments as treatment, by student residency.

Table 8 DID estimation with quizzes as treatment, by student residency.

3.2 Robustness Check

To test the robustness of the findings, the study examined the impact of assignments and quizzes on the final examination results. The results of the DID estimations of the impact of assignments on final examination results are shown in Table 9. They suggest that assignments had a statistically discernible positive impact on the final examination score. Table 10 shows the results of the DID estimation of the impact of quizzes on the final examination. These results also suggest that quizzes had a statistically discernible positive impact on the final examination score. Unlike the midterms, the final examination contained questions that were similar to the assignment questions as well as multiple-choice questions that were similar to the quiz questions. This might explain why the students to whom quizzes were administered also demonstrated improved performance on the final examination.

Table 9 DID estimation with assignments as treatment.

Table 10 DID estimation with quizzes as treatment.

4 Conclusion

This article used primary data from students taking an Introductory Statistics course at a small Canadian business school to examine the impact of assignments and quizzes on their exam grades. To analyze the causal impact, the study utilized the DID method. In the overall sample, assignments had a statistically discernible positive impact on exam grades, while quizzes had no statistically discernible impact. The study also conducted subgroup analyses based on gender and student residency status. The subgroup analyses suggest that assignments had a statistically discernible impact on exam grades for males but no statistically discernible impact for females. On the other hand, quizzes had no statistically discernible impact on exam grades for males or females. The subgroup analyses based on student residency status show that both assignments and quizzes positively impacted exam grades for international students, while neither assignments nor quizzes had any statistically discernible impact on exam grades for domestic students. In all of the subgroups, GPA had a statistically discernible positive impact on exam grades.

The result of this study that homework assignments have a statistically discernible positive impact on exam grades is consistent with the findings of the other studies reviewed in this article (Eren and Henderson Citation2008; Trost and Salehi-Isfahani Citation2012; Grodner and Rupp Citation2013). The result that quizzes do not have a statistically discernible effect on academic performance is consistent with the results of Azorlosa and Renner (Citation2006) and Galizzi (Citation2010).

The results of this study have important implications for teaching practices. Business students often find statistics harder than other courses. The result that assignments positively impact exam grades in a statistics course implies that instructors can use this method to improve their students’ performance. In particular, assignments helped the male and international students to improve their performance; as measured by GPA, these students had weaker academic backgrounds.

In general, international students struggle with lower-level courses because of the language barrier and their prior experience with a different education system. This study implies that assignments can help students with weaker academic backgrounds to perform well in a course. The results further suggest that quizzes help international students.

A possible reason for the improved performance in the section where assignments were introduced is that the midterm exams included short-answer questions that were similar to the assignment questions. The midterm exams, however, did not include any multiple-choice questions, whereas the quizzes consisted entirely of multiple-choice questions. It is possible that this dissimilarity between the format of the quizzes and that of midterm 2 is the reason why the students in the section with quizzes did not do discernibly better than the students in the section that had neither assignments nor quizzes. These results confirm the finding of Azorlosa (Citation2011) that to improve their students’ performance, instructors need to use the same format (e.g., short-answer or multiple-choice questions) on their midterms as on their assignments and/or quizzes.

A limitation of this article is that it did not adopt any procedure to control the family-wise error rate (the probability of at least one Type I error).

Future studies may address the relationship between in-class quizzes and class attendance and examine the impact of online homework assignments on academic performance. In recent times, online assignments have become easier to implement (many textbooks offer online practice questions), and it will be interesting to see the effectiveness of such online assignments in improving academic performance.

Notes

1 The course covered the following topics: descriptive statistics, the concept of probability, probability distributions, sampling distribution, confidence interval estimations, and hypothesis testing. The learning outcomes involve the ability to apply these techniques in business and economics. The quizzes and homework were mapped to course topics and were designed to test whether students have the knowledge of the course concepts and whether they can apply statistical techniques.

2 An upper-level Statistics course, focusing on ANOVA, regression analyses, and forecasting, is also required of students who are majoring in business or economics.

3 The instructor used the following textbook: David R. Anderson, Dennis J. Sweeney, Thomas A. Williams, Jeffrey D. Camm, and James J. Cochran (2016). Statistics for Business & Economics. Cengage Learning. Quiz materials accompany the textbook.

References

  • Abadie, A. (2005), “Semiparametric Difference-in-Differences Estimators,” The Review of Economic Studies, 72, 1–19. DOI: 10.1111/0034-6527.00321.
  • Archer, K. K., and Olson, M. (2018), “Practice. Practice. Practice. Do Homework Management Systems Work?,” International Journal for the Scholarship of Teaching and Learning, 12, 12–17. DOI: 10.20429/ijsotl.2018.120212.
  • Azorlosa, J. L. (2011), “The Effect of Announced Quizzes on Exam Performance: II,” Journal of Instructional Psychology, 38, 3–8.
  • Azorlosa, J. L., and Renner, C. H. (2006), “The Effect of Announced Quizzes on Exam Performance,” Journal of Instructional Psychology, 33, 278–283.
  • Clump, M. A., Bauer, H., and Whiteleather, A. (2003), “To Attend or Not to Attend: Is That a Good Question?,” Journal of Instructional Psychology, 30, 220–224.
  • Cooper, H., Robinson, J. C., and Patall, E. A. (2006), “Does Homework Improve Academic Achievement? A Synthesis of Research, 1987–2003,” Review of Educational Research, 76, 1–62. DOI: 10.3102/00346543076001001.
  • Eren, O., and Henderson, D. J. (2008), “The Impact of Homework on Student Achievement,” Econometrics Journal, 11, 326–348. DOI: 10.1111/j.1368-423X.2008.00244.x.
  • Galizzi, M. (2010), “An Assessment of the Impact of Online Quizzes and Textbook Resources on Students’ Learning,” International Review of Economics Education, 9, 31–43. DOI: 10.1016/S1477-3880(15)30062-1.
  • Galyon, C. E., Voils, K. L., Blondin, C. A., and Williams, R. L. (2015), “The Effect of Randomized Homework Contingencies on College Students’ Daily Homework and Unit Exam Performance,” Innovative Higher Education, 40, 63–77. DOI: 10.1007/s10755-014-9296-1.
  • Grodner, A., and Rupp, N. G. (2013), “The Role of Homework in Student Learning Outcomes: Evidence From a Field Experiment,” The Journal of Economic Education, 44, 93–109. DOI: 10.1080/00220485.2013.770334.
  • Johnson, B. C., and Kiviniemi, M. T. (2009), “The Effect of Online Chapter Quizzes on Exam Performance in an Undergraduate Social Psychology Course,” Teaching of Psychology, 36, 33–37. DOI: 10.1080/00986280802528972.
  • Karadimitriou, K. (2016), “The Impact of Collaborative Graded Home Assignments on the Performance of University Students,” International Online Journal of Educational Sciences, 8, 62–70. DOI: 10.15345/iojes.2016.02.006.
  • Latif, E., and Miles, S. (2011), “The Impact of Assignments on Academic Performance,” Journal of Economics and Economic Education Research, 12, 1–12.
  • Padilla-Walker, L. M. (2006), “The Impact of Daily Extra Credit Quizzes on Exam Performance,” Teaching of Psychology, 33, 236–239. DOI: 10.1207/s15328023top3304_4.
  • Rehfeldt, R. A., Walker, B., Garcia, Y., Lovett, S., and Filipiak, S. (2010), “A Point Contingency for Homework Submission in the Graduate School Classroom,” Journal of Applied Behavior Analysis, 43, 499–502. DOI: 10.1901/jaba.2010.43-499.
  • Trost, S., and Salehi-Isfahani, D. (2012), “The Effect of Homework on Exam Performance: Experimental Results From Principles of Economics,” Southern Economic Journal, 79, 224–242. DOI: 10.4284/0038-4038-79.1.224.
  • Vecchione, M., Alessandri, G., and Marsicano, G. (2014), “Academic Motivation Predicts Educational Attainment: Does Gender Make a Difference?,” Learning and Individual Differences, 32, 124–131. DOI: 10.1016/j.lindif.2014.01.003.