Abstract
Introduction to American Government is a foundational general education course meant to promote understanding of democracy and students’ ability to participate in it. However, there is substantial variation in how the course is structured: it can enroll anywhere from a dozen students to hundreds; it can be delivered online, face-to-face, or in hybrid format; and it can feature active, interactive, or passive pedagogy. Does course structure (i.e., class size, modality, active/interactive pedagogy) affect students’ learning about democracy? We surveyed students enrolled in American government at a large university at both the beginning and end of the course. We leverage variation in course structure at the university to assess its impact on growth in students’ political efficacy and confidence-in-knowledge over the semester. We find that courses featuring more active/interactive learning exhibit greater student growth in both efficacy and knowledge confidence, robust to model specification. We also find that face-to-face and online courses produce greater gains than hybrid courses, though this result depends on model specification. We find no support for a direct effect of class size on student learning. The results illuminate how best to structure courses to achieve civic education goals.
Acknowledgment
The authors would like to thank Emma Flournoy for her research assistance.
Data availability statement
The data that support the findings of this study are openly available at https://doi.org/10.15139/S3/BVCIOH.
Notes
1 See Means et al. (2010) for a full discussion of active, interactive, and instructor-directed (or expository) pedagogy.
2 As we discuss in the conclusion section, it is possible that the effect of a given course structure is mediated by other course structures. For example, the effect of modality could be mediated by the degree of active/interactive pedagogy employed by the instructor, especially as some modalities facilitate active/interactive learning more easily than others. But given the lack of attention in the extant literature to the direct effect of each course structure on learning while controlling for the other course structures, we pose expectations that suppose direct effects and test for these effects. Further, in the survey research design that we employ, we did not obtain sufficient combinations of course structures to assess conditional effects. While it was our hope to do so, this has become a long-term goal as we continue to gather data that add to the variation in course structure combinations.
3 Approval for this study (IRB-20-559) was obtained from the Oklahoma State University Institutional Review Board (IRB).
4 See Part A of Online Supplemental Material for information about survey question wording.
5 The external efficacy scale is less consistent than the other scales, with a Cronbach’s alpha of 0.58 for the pre-test and 0.57 for the post-test. Dropping the “elections represent the will of the people” question improves alpha to 0.63. But the alternative two-item scale also does not vary from pre-test to post-test. Students simply did not exhibit much change in external efficacy over the semester.
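For reference, the reliability coefficient cited in this note is the standard Cronbach’s alpha for a $k$-item scale (a general formula, not specific to this study’s data):

```latex
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where $\sigma^{2}_{Y_i}$ is the variance of responses to item $i$ and $\sigma^{2}_{X}$ is the variance of the total scale score. Dropping a weakly correlated item changes both $k$ and the item-variance sum, which can raise alpha, as with the improvement from 0.58 to 0.63 reported above.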
6 A difference of means test comparing external efficacy for the pre-test and post-test is not statistically significant, while the differences for confidence-in-knowledge and internal efficacy are.
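The difference of means test referenced in this note can be sketched in standard form; assuming a paired pre/post design (an assumption, as the note does not specify the exact test), the test statistic is:

```latex
t \;=\; \frac{\bar{d}}{s_d / \sqrt{n}}, \qquad \bar{d} = \frac{1}{n}\sum_{i=1}^{n}\left(x_{i,\text{post}} - x_{i,\text{pre}}\right)
```

where $\bar{d}$ is the mean within-student change, $s_d$ is the standard deviation of those changes, and $n$ is the number of students completing both waves.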
7 For histogram/boxplot of external efficacy, see Figure F1 in Online Supplemental Material.
8 In results not reported here, we also estimated the models with a continuous and trichotomous measure of class size and obtained the same substantive results.
9 We also tried running our models with essays included as active/interactive pedagogy. The substantive results were unchanged.
10 In results not reported here, we also estimated the models with a continuous measure of active/interactive pedagogy and obtained the same substantive results.
11 We also provide the results of baseline models (i.e., without student-level controls) in Part B of the Online Supplemental Material.
12 In alternative specifications reported in Figure C1 of Online Supplemental Material, we use a multilevel mixed effects model to control for individual instructor and section effects. The substantive results strongly support the key takeaways.
13 These models are reported in Table D1 of the Online Supplemental Material.
14 See Table D1 of Online Supplemental Material.
15 These course structure combinations are (1) online courses with small enrollments and no active/interactive pedagogy, (2) online courses with large enrollments and no active/interactive pedagogy, (3) online courses with extra-large enrollments and moderate active/interactive pedagogy, (4) hybrid courses with medium enrollments and high active/interactive pedagogy, (5) F2F courses with medium enrollments and no active/interactive pedagogy, and (6) F2F courses with medium enrollments and low active/interactive pedagogy. See Table 1 for more information.
Additional information
Notes on contributors
Joshua M. Jansa
Joshua M. Jansa is an Associate Professor of Political Science at Oklahoma State University, where he teaches Introduction to American Government among other courses. His research focuses on state politics and policy, political and economic inequality, and civic education.
Eve M. Ringsmuth
Eve M. Ringsmuth is an Associate Professor of Political Science at Oklahoma State University, where she teaches Introduction to American Government among other courses. Her research focuses on judicial behavior, American political institutions, and civic education.