RESEARCH IN ECONOMIC EDUCATION

Evaluating the flipped classroom: A randomized controlled trial

Pages 115-129 | Published online: 14 Mar 2018

ABSTRACT

Despite recent interest in flipped classrooms, rigorous research evaluating their effectiveness is sparse. In this study, the authors implement a randomized controlled trial to evaluate the effect of a flipped classroom technique relative to a traditional lecture in an introductory undergraduate econometrics course. Random assignment enables the analysis to eliminate other potential explanations of performance differences between the flipped and traditional classrooms, while assignment of experimental condition by section and lesson enables improved statistical precision. The authors find that the flipped classroom increases scores on medium-term, high-stakes assessments by 0.16 standard deviation, with similar long-term effects for high-performing students. Estimated impacts are robust to alternative specifications accounting for possible spillover effects arising from the experimental design.

Acknowledgments

The authors thank Victoria Bhavsar, Ashley Miller, Amy Munson, Gregor Novak, Sarah Robinson, Lauren Scharff, participants at the 2017 Conference on Teaching and Research in Economic Education and the 2016 International Society for the Scholarship of Teaching and Learning Conference, two anonymous reviewers at the U.S. Air Force Academy, and two anonymous referees for their comments and suggestions on study design, implementation, and analysis. The authors also thank the Department of Economics and Geosciences and the students in Econometrics I for their support in implementing the research. The views expressed in this article are those of the authors and not necessarily those of the U.S. Air Force Academy, the U.S. Air Force, the Department of Defense, or the U.S. Government.

Notes

1. See Bishop and Verleger (Citation2013) for an overview of research on flipped classrooms.

2. Schochet (Citation2008) provides a thorough treatment of statistical precision in clustered randomized experiments.

3. Randomized trials with assignment at the student-by-lesson level are possible for certain types of interventions, such as technological interventions, and provide similar benefits when feasible.

4. So-called “crossover” designs are more common in clinical trials (Wellek and Blettner Citation2012). However, the analytic methods used by Prunuske et al. (Citation2016) appear to treat students' learning gains as independent despite the clustered nature of the design. Single-case experimental design, another technique for estimating the impact of an intervention with a modest sample size, requires a similar assumption that impacts of an intervention do not carry over after the intervention is stopped by the researcher (Dallery, Cassidy, and Raiff Citation2013).

5. We used a computer program with a random number generator to determine the classroom assignments. As suggested by Moulton (Citation2004), the assignment program looped until it identified an outcome that met the criteria for the number of flipped lessons in each section and number of sections with flipped lessons. This method ensures the randomness but not independence of random draws across lessons or sections.
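
A minimal sketch of this kind of constrained assignment loop, in Python; the section and lesson counts and the balance criteria below are hypothetical placeholders, not the study's actual design parameters:

    # Illustrative sketch of the constrained randomization described in note 5.
    # All counts and criteria here are hypothetical, not the study's parameters.
    import random

    N_SECTIONS = 4            # hypothetical number of course sections
    N_EXP_LESSONS = 6         # hypothetical number of experimental lessons
    FLIPPED_PER_SECTION = 3   # hypothetical target: flipped lessons per section

    def draw(rng):
        # Independent coin flip for each (section, lesson) cell.
        return {(s, l): rng.random() < 0.5
                for s in range(N_SECTIONS) for l in range(N_EXP_LESSONS)}

    def meets_criteria(a):
        # Mirrors the note's criteria: number of flipped lessons in each
        # section and number of sections with flipped lessons.
        counts = [sum(a[(s, l)] for l in range(N_EXP_LESSONS)) for s in range(N_SECTIONS)]
        return (all(c == FLIPPED_PER_SECTION for c in counts)
                and sum(c > 0 for c in counts) == N_SECTIONS)

    rng = random.Random(2016)                 # seeded for reproducibility
    assignment = draw(rng)
    while not meets_criteria(assignment):     # redraw until the criteria are met
        assignment = draw(rng)

As the note observes, the accepted draw is random, but the resulting cell-level indicators are no longer independent across lessons or sections.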

6. This study cannot distinguish between the effects of the various components of the flipped classroom condition, such as more individualized instructor feedback or the availability of the videos.

7. We were interested in flipping more lessons but were limited by a lack of videos appropriate for the course content and the time-consuming nature of creating new multimedia content.

8. Because an exam question may draw on material from multiple lessons, we each made independent determinations of the lesson material primarily associated with each question. The pattern of results is not sensitive to which author's judgments are used.

9. Lesson type remains uncorrelated with the model's error term under our assumption because the lesson fixed effects absorb any systematic differences in performance in the nonexperimental lessons. Intuitively, the additional data improve identification of the individual fixed effects but not the treatment effect.
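
One way to see the argument in note 9 is against an estimating equation of the general form below; the notation is ours for illustration, and the paper's own specification may differ:

    y_{ijl} = \beta \, \mathrm{Flipped}_{jl} + \alpha_i + \gamma_l + \varepsilon_{ijl},

where y_{ijl} is the score of student i in section j on material from lesson l, \alpha_i and \gamma_l are student and lesson fixed effects, and \mathrm{Flipped}_{jl} indicates that section j covered lesson l in the flipped format. For nonexperimental lessons \mathrm{Flipped}_{jl} = 0 in every section, so any systematic difference in performance on those lessons is absorbed by \gamma_l rather than loading on \beta.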

10. In particular, our analysis of qualitative assessment data does not control for student effects, as this would limit the sample to sections of the course that were assigned opposite styles in the two classes with these surveys.

11. Because each student participated in equal numbers of flipped and lecture classes by design, however, student fixed effects are mechanically uncorrelated with lesson type, except where data are missing.
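
In the illustrative notation above, the mechanical claim in note 11 amounts to the observation that, with complete data, the within-student mean of the treatment indicator is the same constant for every student,

    \frac{1}{|E|} \sum_{l \in E} \mathrm{Flipped}_{j(i),l} = \tfrac{1}{2} \quad \text{for each student } i,

where E is the set of experimental lessons and j(i) is student i's section, so the student fixed effects carry no variation that covaries with lesson type; missing assessments break this exact balance, which is the exception the note records.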

12. Estimated impacts are nearly identical when limiting the sample to experimental lessons.

13. An exception is for the short-term effects, where the standard error of the impact estimate is slightly larger when clustering by section (0.057) or individual and section (0.059).
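
As an illustration of the clustering choice discussed in note 13, the sketch below shows how a by-section clustered standard error could be computed in Python with statsmodels; the data are synthetic and the variable names, counts, and assignment pattern are hypothetical, not taken from the study:

    # Synthetic example only: fixed-effects OLS with standard errors clustered by section.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    rows = []
    for section in range(4):                 # hypothetical: 4 sections
        for student in range(20):            # hypothetical: 20 students per section
            sid = 20 * section + student
            for lesson in range(8):          # hypothetical: 8 lessons
                flipped = int((lesson + section) % 2 == 0)   # arbitrary assignment pattern
                score = 0.16 * flipped + rng.normal()        # effect size borrowed from the abstract
                rows.append((sid, section, lesson, flipped, score))
    df = pd.DataFrame(rows, columns=["student", "section", "lesson", "flipped", "score"])

    # Student and lesson fixed effects via dummies; cluster-robust covariance by section.
    fit = smf.ols("score ~ flipped + C(student) + C(lesson)", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["section"]})
    print(fit.params["flipped"], fit.bse["flipped"])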

14. Specifically, we estimate a model that interacts flipped status with dummies for each experimental lesson, and we fail to reject that the impact of the flipped classroom is the same across all lessons (p > 0.10).
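
Written out in our illustrative notation, the specification in note 14 replaces the single treatment indicator with lesson-specific ones,

    y_{ijl} = \sum_{m \in E} \beta_m \, \mathrm{Flipped}_{jl} \, \mathbf{1}[l = m] + \alpha_i + \gamma_l + \varepsilon_{ijl},

where E is the set of experimental lessons; the reported test is of the joint null H_0: \beta_m = \beta_{m'} for all m, m' in E, which the data fail to reject at the 10 percent level.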

15. Despite the low stakes of the short-term assessments, average scores were well above levels consistent with random guessing, suggesting that lack of student effort does not explain the results. In results available from us, we also found that the estimated impact does not vary with the time elapsed from the lesson's coverage to the assessment, suggesting that study patterns rather than time elapsed explain the difference between short- and medium-term assessments.

16. Out of necessity, this specification excludes individual and lesson fixed effects, and the standard error is clustered by section.

17. We did not have the ability to track student access to videos, so we cannot rule out that students accessed videos assigned to other sections. However, we provided students with links only to the videos assigned to them and did not advertise which lessons had videos assigned to another section.
