
Explanation and Facilitation Strategies Reduce Student Resistance to Active Learning

Abstract

Active learning increases student learning, engagement, and interest in STEM and, subsequently, the number and diversity of graduates. Yet, its adoption has been slow, partially due to instructors’ concerns about student resistance. Consequently, researchers proposed explanation and facilitation instructional strategies designed to reduce this resistance. Using surveys from 2-year and 4-year institutions, including minority-serving institutions, we investigate the relationship between students’ affective and behavioral responses to active learning, instructors’ use of strategies, and active learning type. Analyses revealed low levels of student resistance and significant relationships between both explanation and facilitation strategy use and positive student responses.

Introduction

Active learning

Active learning, defined by Prince (Citation2004) as “any instructional method [used in the classroom] that engages students in the learning process” (223), is a learner-centered teaching practice that “engages students in the hard, messy work of learning … and … motivates students by giving them some control over learning processes” (Weimer Citation2012, 1). It has been the focus of educational research for decades, both in K-12 settings (e.g., Boyer Citation2002, Harmin Citation1994) and postsecondary education (Bonwell and Eison Citation1991, Faust and Paulson Citation1998) and it is widely considered a key principle for good instructional practices in undergraduate education (Chickering and Gamson Citation1987).

There are many forms of active learning (e.g., in-class problem solving, think-pair-share, and role play) that can be classified in various ways. Here, we use the Chi and Wylie (Citation2014) interactive-constructive-active-passive (ICAP) model to classify instructional activities by their level of student engagement. According to the ICAP model, passive instruction occurs when students are oriented toward receiving information from the instructor without directly doing anything else, such as when listening to the instructor lecture during class or watching the instructor demonstrate how to solve problems. Active instruction occurs when students individually engage in some form of “overt motoric action or physical manipulation” with the course content, such as taking verbatim notes while the instructor solves a problem or underlining text students deem to be important while reading. Constructive instruction involves individuals generating new ideas beyond what was provided in the learning materials, as occurs when drawing concept maps, asking or answering questions in class, or solving problems individually during class. And finally, interactive instruction occurs when students dialogue together with classmates, as when solving problems in a group or discussing concepts with peers during class.

An overwhelming body of literature documents the benefits of instructors adopting active learning techniques in their classrooms, including improved student learning and performance in science, technology, engineering, and math (STEM) disciplines (Freeman et al. Citation2014, Haak et al. Citation2011, Lucke, Dunn, and Christie Citation2017, Prince and Felder Citation2006). Further, the boosts to student learning have been shown to increase as instruction moves from passive to active to constructive to interactive (Chi and Wylie Citation2014). Additionally, active learning has been demonstrated to increase student engagement and, subsequently, interest in STEM (Seymour and Hewitt Citation1997, Koch et al. Citation2017, Prince Citation2004, Smith et al. Citation2005, Lucke, Dunn, and Christie Citation2017), and it has been shown to be particularly effective for educating a diverse student body and increasing student performance in STEM (Theobald et al. Citation2020).

However, despite the overwhelming evidence of the benefits of active learning, the translation of this research to classrooms has been slow (Jamieson and Lohmann Citation2012, Dancy, Henderson, and Turpen Citation2016, Hora, Ferrare, and Oleson Citation2012, National Research Council Citation2012, Christie and de Graaff Citation2017), especially in STEM, where the majority of classrooms are still taught using lecture-based methods (Stains et al. Citation2018). The literature has identified several barriers to instructors’ use of active learning, with fear of negative student responses being one of the most frequently cited barriers (Prince et al. Citation2013, Felder and Brent Citation1996, Finelli et al. Citation2014, Henderson and Dancy Citation2009). These concerns subsequently prevent the overwhelming benefits of active learning techniques from reaching as many students as possible (Stains et al. Citation2018, Borrego, Froyd, and Hall Citation2010).

Students’ responses to active learning

Students can respond to active learning in many ways, such as showing open resistance, rushing through an activity, or being deeply engaged. Students’ responses can also vary by type of active learning. For instance, some research has shown that working with other students in a cooperative learning environment (i.e., engaging in interactive instruction) may diminish the value of active learning instruction (Machemer and Crawford Citation2007) and generate student resistance (Lake Citation2001, Bacon, Stewart, and Silver Citation1999, Donohue and Richards Citation2009). To better understand students’ responses to active learning, we developed a conceptual framework that integrates constructs from classroom engagement (Fredricks, Blumenfeld, and Paris Citation2004), productive engagement (Chasteen Citation2014), and resistance (Weimer Citation2002). We also include the concept of evaluation in our framework because of the importance of end-of-term student evaluations among STEM instructors.

In our framework (DeMonbrun et al. Citation2017), we define two types of affective response (value and positivity) and three types of behavioral response (participation, distraction, and evaluation) to active learning (see Table 1). Value is a type of cognitive engagement which reflects the degree to which students see the activity as worthwhile; positivity is an affective-emotional measure of engagement which indicates how positive or negative students feel about an activity; participation and distraction are measures of behavioral engagement that indicate the extent to which students do or do not participate and the extent to which students distract themselves or their peers during the learning process, respectively; and evaluation is a measure of the way students rate the instructor or course at the end of the term.

Table 1. Student responses to active learning.

As we’ve applied our framework (summarized in Finelli and Borrego Citation2020), we have found that, in fact, students don’t often respond negatively to active learning, despite instructors’ fears about student resistance. In a systematic review of the literature about student response to active learning (Shekhar et al. Citation2020), we found that just 57 of 412 studies on active learning in undergraduate STEM classrooms report any type of negative response to active learning from students. Further, through classroom observations (Nguyen et al. Citation2017, Shekhar et al. Citation2015), we found that students rarely demonstrate the open resistance that instructors fear most; rather, students more often exhibit passive resistance by being distracted or talking to their neighbor about something unrelated. And similarly, we find that students’ responses to active learning are mostly positive across surveys of nearly 1,500 undergraduate students (Finelli et al. Citation2018, Andrews et al. Citation2020). Consistently across these studies, students report that they saw value in an activity when an instructor asked them to engage in active learning, felt positively about it, participated, and planned to evaluate the course/instructor highly; congruently, they do not report that they were distracted.

Strategies to reduce student resistance

Still, concerns about student resistance remain a key barrier to the adoption of active learning in postsecondary classrooms, particularly beyond the 4-year, primarily white institutions at which most prior studies have been conducted. Through literature synthesis (Nguyen et al. Citation2021), interviews (Tharayil et al. Citation2018), and a prior study of 18 engineering courses (Finelli et al. Citation2018), we have identified several strategies instructors can use to increase engagement and reduce student resistance during active learning. These strategies can be grouped into two categories: explanation, or the ways an instructor introduces the activity and describes its purpose, and facilitation, how an instructor works to promote engagement and keep the activity running smoothly. We list specific examples of each type of strategy in Table 2.

Table 2. Strategies instructors use to reduce student resistance to active learning.

In our prior work specific to the implementation of these strategies in engineering classrooms, we have shown that instructors’ use of both types of strategies is correlated with higher participation and evaluation, as well as less distraction (Finelli et al. Citation2018). Here, we build upon prior work (e.g., Finelli et al. Citation2018, Andrews et al. Citation2020) by using a larger, broader, and more diverse sample; employing a stronger methodology that accounts for variability attributable to instructors/classrooms; and controlling for the type of active learning used. Importantly, this study examines the question of explanation and facilitation strategy efficacy in a larger variety of institutional contexts (2-year, 4-year, and minority-serving institutions), across multiple STEM disciplines, and with more diverse student populations.

Research questions

Using data collected from instructor and student surveys about the same class period of STEM courses, we seek to understand:

  1. What are students’ affective and behavioral responses (value, positivity, participation, distraction, evaluation) to active learning?

  2. What is the relationship between students’ perception of their instructor’s use of strategies (explanation, facilitation) and these responses?

  3. What is the relationship between student responses and the type of active learning (interactive, constructive, none) used?

In doing so, we expand upon prior understandings of student resistance and strategies to reduce it in STEM classrooms, using quantitative empirical methods and validated instruments.

Methods

Participant recruitment

We recruited instructors for this project via email, as part of a broader study on active learning in undergraduate STEM classrooms. Rather than use our typical professional networks that would bias the sample toward 4-year, primarily white institutions, we used a geographic sampling method to recruit participants from a wider variety of institutions that would be more representative of STEM undergraduate student populations. To identify participants (n = 48), we first compiled a list of all 2-year and 4-year institutions of higher education within 150 miles of our research teams in either Austin, TX, or Eugene, OR, that offered at least one STEM degree program. Here, we define STEM as including chemistry, computer and information sciences, engineering, geosciences, life sciences, materials research, mathematical sciences, and physics/astronomy (United States Government Accountability Office Citation2014).

Then, we accessed each institution’s public website to identify a contact within each STEM department, who was asked to distribute a recruitment letter to fellow instructors and to consider applying themselves. In most cases, an administrator (e.g., Department Chair) was contacted; otherwise, all listed STEM instructors were contacted directly. Our recruitment materials, sent via email, advertised an opportunity to participate in classroom-based data collection; instructors were eligible to participate in the study if they were teaching a first- or second-year STEM course and planned to use active learning in their classrooms. We obtained approvals for all data collection protocols from the local institutional review boards.

Data collection

To better understand STEM instructors’ and students’ attitudes and behaviors regarding active learning, we administered online surveys to instructors and their students. First, we asked instructors to identify a target class session in which they planned to use active learning, and then we administered surveys immediately after that class session. We also asked instructors to describe their activities during the target class session. The instructor survey measured instructors’ attitudes toward and use of active learning, strategies used to reduce student resistance, and perceptions of student behavior. The corresponding student survey asked students to report their instructors’ teaching practices, as well as students’ own attitudes and behaviors during the same target class session. A subset of those data is examined here. Data collection took place in in-person courses at 2-year and 4-year institutions during Fall 2019 and Spring 2020, before the onset of the COVID-19 pandemic; the student and instructor surveys for each course refer to the same class period, near the middle of the respective semester. Students and instructors each had up to a week to respond to the survey.

Our analytical sample comprises n = 48 instructors and their n = 1,020 students. Instructors taught at 21 institutions across the United States, most of which are public, with a nearly equal split of 2-year and 4-year institutions. Additionally, 10 of the institutions in the sample are Minority-Serving Institutions (MSIs); two of these institutions are Historically Black Colleges and Universities (HBCUs) and eight are Hispanic-Serving Institutions (HSIs). Instructors represented a variety of STEM departments and taught courses with a breadth of class sizes, with reported enrollment ranging from 10 to 200 students. An average of 26 students per instructor responded to our survey.

Table 3 provides an overview of instructor characteristics (e.g., gender identity, STEM discipline) and their institutions. Instructors were prompted to self-describe their racial identity, to indicate whether they identified as “Hispanic, Latinx or Chicanx,” and to select their gender identity from six multiple-choice options, with additional options to self-describe or select prefer not to answer. Instructors reflect the national-level racial and ethnic diversity of STEM instructors at 4-year institutions (National Science Foundation, and National Center for Science and Engineering Statistics Citation2019), though our sample shows an overrepresentation of instructors who identify as female. Of the STEM disciplines, Biology classrooms are the most represented in this sample, with 13 of 21 instructors.

Table 3. Instructor & institutional characteristics, n = 48.

Additionally, Table 4 provides an overview of the students in our sample. Like the instructors, students were prompted to self-describe their racial identity, to indicate whether they identified as “Hispanic, Latinx or Chicanx,” and to select their gender identity from six multiple-choice options, with additional options to self-describe or select prefer not to answer. Students of racial and ethnic backgrounds which are underrepresented in STEM (defined by NSF as Black or African American, Hispanic or Latinx, and American Indian or Alaskan Native) comprise 27% of the sample, which is an overrepresentation compared to national-level statistics from 4-year institutions (National Science Foundation, and National Center for Science and Engineering Statistics Citation2019). Additionally, 51% of students in this sample identify as male and 43% identify as female; here, we see an underrepresentation of students who identify as female (National Science Foundation, and National Center for Science and Engineering Statistics Citation2019).

Table 4. Student demographics, n = 1,020.

Instructor measures

Our instructor survey, administered via Qualtrics immediately after the target class period, collected data on eight validated metrics in three main categories: type of active learning, use of strategies to reduce student resistance, and student responses, as detailed in Tables 1 and 2 (Husman et al. Citation2020). First, instructors were prompted to think of and describe a specific activity that they incorporated into class that day. Two authors/researchers then coded these responses by type of active learning, according to the ICAP framework of cognitive engagement (Chi and Wylie Citation2014). In this framework, student engagement is characterized as interactive, constructive, active, or passive. The key distinction between the first two categories is that interactive activities involve student collaboration (e.g., discussion, consensus building, brainstorming) whereas constructive activities consist of students individually generating new information beyond what is presented (e.g., minute papers, student polls, etc.). In our dataset, 43 of 48 instructors used active learning and provided enough detail about the activity for coding; of these instances, 32 (75%) were interactive and 11 (25%) were constructive activities. Examples of each from our instructor surveys are detailed in Table 5.

Table 5. Interactive and constructive activity examples.

Additionally, instructors were asked to rate their use of explanation and facilitation strategies (see Table 2) on ten-point Likert-type scales. Explanation strategies, comprising four items, emphasize how an instructor can frame the purpose and the goal of the activity. Facilitation strategies, comprising seven items, focus on how to better engage students in the activity. We averaged instructor responses across all items within factors to create a mean score for each measure.

Student measures

Student survey measures were adapted from the surveys developed in our prior work and correspond directly with the factors included on the instructor survey (DeMonbrun et al. Citation2017). The online survey, which took approximately ten minutes to complete, was administered via Qualtrics immediately after the target class period. Students were not obligated to participate and were asked to indicate their consent to do so at the beginning of the survey. Students were not provided with a strict definition of active learning before completing the survey; instead, survey question stems prompted students to “think about any activities you engaged in as a part of this class period.”

Students were first prompted to report instructors’ use of explanation and facilitation strategies (see Table 2) during a specific class period. Then, students were asked to report their own affective and behavioral responses to any active learning activities during the specific class period, including their value, positivity, participation, distraction, and evaluation of the instructor (see Table 1). All items were measured using a seven-point Likert-type scale. Mirroring data processing for instructor-level items, we averaged students’ responses across all items within factors to create a mean score for each measure.
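To illustrate this scoring step, the sketch below averages one factor’s Likert items into a per-respondent mean score. It is a minimal sketch, not the project’s actual code: the item counts follow the factor structure described above (four explanation items, seven facilitation items), but the column names are hypothetical.

```python
import pandas as pd

# Hypothetical item columns; the survey's actual item labels are not shown here.
EXPLANATION_ITEMS = [f"explain_{i}" for i in range(1, 5)]    # four items
FACILITATION_ITEMS = [f"facilitate_{i}" for i in range(1, 8)]  # seven items

def factor_score(responses: pd.DataFrame, items: list[str]) -> pd.Series:
    """Mean of one factor's items for each respondent (row)."""
    return responses[items].mean(axis=1)

# Toy example: two respondents rating the explanation items on a 7-point scale.
toy = pd.DataFrame([[6, 7, 5, 6], [3, 4, 4, 2]], columns=EXPLANATION_ITEMS)
print(factor_score(toy, EXPLANATION_ITEMS))  # 6.00 and 3.25
```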

Statistical analyses

First, we calculated descriptive statistics of instructors’ and students’ responses on their respective surveys. Then, due to the hierarchical nature of our data (i.e., students nested within classrooms), we used multilevel modeling to predict students’ affective and behavioral responses to active learning. Variance partition coefficients (VPCs) represent the portion of variability in the dependent variable that is attributable to Level 2 (i.e., instructors), and estimates above 0.05 indicate a multilevel model is appropriate (Mehmetoglu and Jakobsen Citation2016); the VPCs we calculated from the null models ranged from 0.08 to 0.25, indicating that this method was appropriate. We used mixed-effects models, which give accurate estimates of the fixed effects (i.e., strategies used) in the presence of correlated errors (i.e., random effects) attributed to a data hierarchy (i.e., students within classrooms) (Seltman Citation2012). This methodology, which accounts for variation in the dependent variable both between classrooms and within them, is more rigorous and more appropriate for the data than methodologies used in our prior work (e.g., Andrews et al. Citation2020).
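As a concrete illustration of the VPC check, here is a minimal sketch using Python’s statsmodels on synthetic stand-in data; it is not the authors’ code, and the column names (value, instructor_id) are placeholders. The VPC is the random-intercept variance divided by total variance in an intercept-only (null) model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data (the study's dataset is not public): 48 instructors
# with 20 student respondents each, mirroring the paper's clustering structure.
rng = np.random.default_rng(0)
n_instructors, n_per_class = 48, 20
df = pd.DataFrame({"instructor_id": np.repeat(np.arange(n_instructors), n_per_class)})
instructor_effect = rng.normal(0, 0.5, n_instructors)  # Level 2 variation
df["value"] = 5.0 + instructor_effect[df["instructor_id"]] + rng.normal(0, 1.0, len(df))

# Null (intercept-only) model with a random intercept per instructor.
null_fit = smf.mixedlm("value ~ 1", data=df, groups=df["instructor_id"]).fit()

tau2 = null_fit.cov_re.iloc[0, 0]  # between-instructor (Level 2) variance
sigma2 = null_fit.scale            # residual (Level 1) variance
vpc = tau2 / (tau2 + sigma2)       # share of variance attributable to Level 2
print(f"VPC = {vpc:.2f}")          # values above 0.05 motivate multilevel modeling
```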

Each student response (value, positivity, participation, distraction, evaluation) was modeled individually, resulting in five multilevel mixed-effects regression models, each including fixed effects for (1) students’ perceptions of instructors’ use of explanation strategies, (2) students’ perceptions of instructors’ use of facilitation strategies, and (3) the type of active learning. Additionally, the models account for the random effects of students being clustered by instructor (n = 48). Including these random effects allows the model to control for Level 2 aspects (i.e., instructors) that affect all of the Level 1 subjects (i.e., students) similarly within groups (i.e., classrooms). In preliminary analyses, we also controlled for instructors’ reports of their own use of explanation and facilitation strategies; however, these estimates were never statistically significant and were removed from our models.
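The paper does not print the model equation, but the specification just described corresponds to a standard two-level random-intercept model. For student $i$ in classroom $j$, with the categorical activity type collapsed into a single term for readability (formally it enters as dummy-coded contrasts):

$$
y_{ij} = \beta_0 + \beta_1\,\mathrm{Explanation}_{ij} + \beta_2\,\mathrm{Facilitation}_{ij} + \beta_3\,\mathrm{ActivityType}_{j} + u_j + \varepsilon_{ij},
\qquad u_j \sim N(0, \tau^2),\quad \varepsilon_{ij} \sim N(0, \sigma^2),
$$

where $u_j$ is the instructor-level random intercept and $\varepsilon_{ij}$ is the student-level residual.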

Results

We present means, standard deviations, and correlations of our measures in Table 6 and results of five multilevel regression models predicting student responses to active learning (value, positivity, participation, distraction, evaluation) in Table 7. Our analyses reveal strong, positive, statistically significant relationships between student affective and behavioral responses to active learning and students’ perceptions of their instructors’ use of both explanation and facilitation strategies to reduce student resistance. We find little effect of the type of active learning used.

Table 6. Means, standard deviations, and correlations for student responses.

Table 7. Multilevel regression models predicting student responses, n = 1,020.

Table 6 shows the Pearson bivariate correlation matrix; this test measures the strength and direction of linear relationships between variables, where a correlation of 1.00 indicates a perfectly linear, positive association. All correlation coefficients were less than 0.70 (with one exception: value and positivity, 0.76), indicating the constructs do not overlap with one another at a problematic level (Meyers, Gamst, and Guarino Citation2006); in general, correlations below 0.30 indicate weak associations, those between 0.30 and 0.50 indicate moderate associations, and those greater than 0.50 indicate strong associations. In addition, we report the means, standard deviations, and alphas for each factor.
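For readers reproducing these diagnostics, the correlation matrix and per-factor alphas can be computed as in the sketch below. This is a generic sketch with toy data, not the authors’ code; the real data and item labels are not public.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: internal consistency of one factor's Likert items."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Toy stand-in for three items of one factor (7-point scale).
toy = pd.DataFrame({"q1": [6, 5, 7, 4], "q2": [6, 4, 7, 5], "q3": [5, 5, 6, 4]})
print(f"alpha = {cronbach_alpha(toy):.2f}")  # 0.90 for this toy data

# A Table 6-style matrix is then df[factor_cols].corr(method="pearson")
# computed over the five factor mean scores.
```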

Generally, students perceived that instructors frequently engaged in explanation and facilitation strategies, with means of 5.91 and 5.55, respectively. Additionally, consistent with our prior work (e.g., Andrews et al. Citation2020), students report largely positive affective and behavioral responses to active learning: students valued, liked, and participated in the activities; were seldom distracted; and planned to evaluate their instructors highly at the end of the course. Next, we present the results of five multilevel regression models predicting student responses to active learning (value, positivity, participation, distraction, evaluation) in Table 7.

Our models show strong, statistically significant relationships between students’ affective and behavioral responses to active learning and students’ perceptions of their instructors’ use of both explanation and facilitation strategies to reduce student resistance. For instructors’ use of explanation strategies (Table 7, row 1), we see strong, significant, positive relationships with students’ value, positivity, participation, and planned evaluation of the instructor, with estimates between 0.31 and 0.50 on variables with a scale of 1-7; correspondingly, explanation strategy use has a significant, negative association with student distraction. Facilitation strategies also have significant, positive associations with student responses, but these associations are weaker than those for explanation strategies. Facilitation strategies are positively associated with students’ value, positivity, participation, and planned evaluation, with estimates ranging from 0.13 to 0.33. Interestingly, facilitation strategies do not have a statistically significant relationship with students’ level of distraction. Similarly, we do not see any statistically significant associations between the type of active learning used during the target class session (interactive, constructive, or none) and student responses.

While we find a clear pattern of statistically significant, positive effects of instructors using both explanation and facilitation strategies on student responses, interestingly, the amount of variance in the outcome accounted for by the model varies across the student responses. For instance, R-squared estimates of the model predicting students’ value indicate that 35% of the Level 1 variance (i.e., variance in the dependent variable due to student-level data) and 77% of the Level 2 variance (i.e., variance in the dependent variable due to instructor-level data) in the outcome are accounted for by the model. However, R-squared estimates for distraction account for only 4% and 22% of Level 1 and Level 2 variance, respectively. This implies that while use of explanation and facilitation strategies is significantly associated with positive student responses, there are other factors not captured by this model that contribute to student distraction.
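The paper does not state which R-squared estimator it uses; a common choice for level-specific variance explained in random-intercept models is the proportional reduction in variance relative to the null model (Raudenbush and Bryk’s pseudo-R-squared). Under that assumption, the Level 1 and Level 2 estimates quoted above would be:

$$
R^2_{\mathrm{L1}} = 1 - \frac{\hat\sigma^2_{\mathrm{full}}}{\hat\sigma^2_{\mathrm{null}}},
\qquad
R^2_{\mathrm{L2}} = 1 - \frac{\hat\tau^2_{\mathrm{full}}}{\hat\tau^2_{\mathrm{null}}},
$$

where $\hat\sigma^2$ is the Level 1 residual variance and $\hat\tau^2$ is the Level 2 random-intercept variance, each estimated from the full and null models.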

Discussion

This study extends earlier work examining students’ response to active learning in undergraduate STEM classes (Andrews et al. Citation2020; Finelli et al. Citation2018) to demonstrate the consistent efficacy of explanation and facilitation strategies in reducing student resistance across more STEM disciplines, 2-year institutions, minority-serving institutions, and a more racially and ethnically diverse sample of students. The analytical methods also extend prior work by employing multilevel modeling to take into account variation in student response due to instructor and classroom characteristics, exploring a broader range of student response measures, and conducting a preliminary investigation into how student response may vary by active learning type. Our analyses reveal strong, positive, statistically significant relationships between student affective and behavioral responses to active learning and students’ perceptions of their instructors’ use of both explanation and facilitation strategies to reduce student resistance. We found little effect of the type of active learning used.

Positive student responses to active learning

Consistent with previous work (Andrews et al. Citation2020), yet contrary to instructors’ concerns, we found little evidence of student resistance to active learning in our study population (Table 6). Students in these active learning environments generally liked, valued, and participated in the activities; seldom distracted others; and often planned to evaluate their courses and instructors highly at the end of the course. These high levels of value, engagement, and positivity are consistent with prior student-focused literature on active learning in general (Lumpkin, Achen, and Dodd Citation2015), and in STEM specifically (Miller and Metz Citation2014, Smith and Cardaciotto Citation2011). Further, by demonstrating this result across 2-year and minority-serving institutions, our results lend support to arguments for active learning strategies to reduce achievement gaps (Theobald et al. Citation2020).

Previous research (Stains et al. Citation2018, Finelli et al. Citation2018, Hora, Ferrare, and Oleson Citation2012, Lucke, Dunn, and Christie Citation2017, Smith and Cardaciotto Citation2011, Koch et al. Citation2017) drew from predominantly white or master’s- and doctoral-granting institutions and in many cases examined fewer student responses; the current analysis draws much more heavily from 2-year and minority-serving institutions. The consistency of our findings related to student resistance levels throughout a larger and more diverse group of students and institutions should provide reassurance to instructors who view student resistance as a barrier to adopting active learning strategies. While individual students may demonstrate resistance to active learning – and we recognize that this in itself may deter some instructors from adopting new methods – on average, students’ response to classroom activities is positive.

Impact of type of active learning

This analysis of how interactive and constructive activities relate to students’ response to active learning showed little variation of student response by active learning type. Activities in which students worked individually (i.e., constructive) rather than in small groups (i.e., interactive) showed more positive student response on average, but these differences were not statistically significant. Given the documented educational benefits and strong arguments in support of group activities’ potential to boost student learning outcomes (Chi and Wylie Citation2014), we do not believe that concerns related to greater distraction of students during group activities or greater resistance to group work should deter instructors from adopting collaborative forms of active learning. Instead, given that differences by activity type were not statistically significant, we encourage instructors to include both interactive and constructive activities given their benefits (Chi and Wylie Citation2014) and the lack of adverse associations demonstrated here. We also advocate for additional research exploring different student responses by active learning type.

Strategies reducing student resistance

Perhaps our most relevant finding for both individual instructors and faculty development personnel relates to instructor strategies to reduce student resistance. Our work finds significant positive correlations between both explanation and facilitation strategies and almost all measures of students’ affective and behavioral response to active learning. In short, instructors who take time to explain why they are using active learning, provide clear instructions for the activities, and engage with students during activities to encourage participation see more positive student responses related to participation, value, positivity, and evaluation. It is important to note that this finding was based on a diverse group of institutions, including a significant number of 2-year colleges and minority-serving institutions, and is consistent with our previous research, supporting the generalizability of these results across a broad range of U.S. postsecondary contexts.

This research provides empirical evidence to support and quantify the strength of relationships suggested but not frequently verified in previous literature. Specifically, we demonstrate the degree to which explanation and facilitation strategies account for variation in students’ response to active learning and quantify the strength of the correlation between these strategies and student response variables. The data suggest that explanation strategies are generally even more effective than facilitation strategies for reducing student resistance; as instructors consider incorporating active learning into their classrooms, they should focus first on effectively presenting and explaining the activity in question, and second on facilitating throughout the activity.

Interestingly, our models predict some attitudes and behaviors better than others. Specifically, little variance in students’ distraction or participation is explained by our models, and instructors’ use of facilitation strategies did not have a statistically significant relationship to distraction. While we still find positive associations between strategy use and these behaviors, this finding implies that instructional choices are not significantly related to some student behaviors. In other words, this might mean that some students will simply be distracted during class (though not often, according to mean responses in Table 6), and that distraction occurs neither because instructors are using active learning nor because they are facilitating or explaining it poorly. Again, we encourage instructors to implement active learning despite their fears of negative student responses.

In all, our analyses have consistently shown relatively low levels of student resistance to active learning and positive correlations of student response with instructors’ use of explanation and facilitation strategies. While we believe that additional research should continue to replicate these findings, there is clearly a need for more nuanced research to examine how student, instructor, and classroom characteristics influence both student response to active learning and the degree to which instructor strategies influence this response. We also consider our analysis of how active learning type influences student resistance to be preliminary and believe it deserves additional study. Finally, the COVID-19 pandemic interrupted a randomized controlled trial examining student resistance and strategies to reduce that resistance; we hope to complete that trial in the near future to reduce potential bias in our current results.

Acknowledgements

This work was supported by the National Science Foundation under Grant numbers DUE-1821092, DUE-1821036, DUE-1821488, and DUE-1821277. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Disclosure statement

The authors declare that they have no competing interests.

Data availability statement

The datasets generated and analyzed during the current study are not publicly available but are available from the corresponding author on reasonable request.

References

  • Andrews, Madison E., Matthew Graham, Michael Prince, Maura Borrego, Cynthia Finelli, and Jenefer Husman. 2020. “Student Resistance to Active Learning: Do Instructors (Mostly) Get It Wrong?” Australasian Journal of Engineering Education 25 (2):142–54. doi: 10.1080/22054952.2020.1861771.
  • Bacon, Donald R., Kim A. Stewart, and William S. Silver. 1999. “Lessons from the Best and Worst Student Team Experiences: How a Teacher Can Make the Difference.” Journal of Management Education 23 (5):467–88. doi: 10.1177/105256299902300503.
  • Bonwell, C. C., and J. A. Eison. 1991. Active Learning: Creating Excitement in the Classroom. Washington, D.C.: The George Washington University.
  • Borrego, Maura, Jeffrey E. Froyd, and T. Simin Hall. 2010. “Diffusion of Engineering Education Innovations: A Survey of Awareness and Adoption Rates in US Engineering Departments.” Journal of Engineering Education 99 (3):185–207. doi: 10.1002/j.2168-9830.2010.tb01056.x.
  • Boyer, Kimberly R. 2002. “Using Active Learning Strategies to Motivate Students.” Mathematics Teaching in the Middle School 8 (1):48–51. doi: 10.5951/MTMS.8.1.0048.
  • Chasteen, S. 2014. “Measuring and Improving Students’ Engagement” [Blog post]. Accessed January 9. https://blog.sciencegeekgirl.com/2014/11/02/measuring-and-improving-students-engagement/.
  • Chi, Michelene T. H., and Ruth Wylie. 2014. “The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes.” Educational Psychologist 49 (4):219–43. doi: 10.1080/00461520.2014.965823.
  • Chickering, Arthur W., and Zelda F. Gamson. 1987. “Seven Principles for Good Practice in Undergraduate Education.” AAHE Bulletin 3 (7):1–6.
  • Christie, Michael, and Erik de Graaff. 2017. “The Philosophical and Pedagogical Underpinnings of Active Learning in Engineering Education.” European Journal of Engineering Education 42 (1):5–16. doi: 10.1080/03043797.2016.1254160.
  • Dancy, Melissa, Charles Henderson, and Chandra Turpen. 2016. “How Faculty Learn about and Implement Research-Based Instructional Strategies: The Case of Peer Instruction.” Physical Review Physics Education Research 12 (1):1–7. doi: 10.1103/PhysRevPhysEducRes.12.010110.
  • DeMonbrun, Matt, Cynthia J. Finelli, Michael Prince, Maura Borrego, Prateek Shekhar, Charles Henderson, and Cindy Waters. 2017. “Creating an Instrument to Measure Student Response to Instructional Practices.” Journal of Engineering Education 106 (2):273–98. doi: 10.1002/jee.20162.
  • Donohue, Susan K., and Larry G. Richards. 2009. “Factors Affecting Student Attitudes toward Active Learning Activities in a Graduate Engineering Statistics Course.” In 2009 39th IEEE Frontiers in Education Conference.
  • Faust, Jennifer L., and Donald R. Paulson. 1998. “Active Learning in the College Classroom.” Journal on Excellence in College Teaching 9 (2):3–24.
  • Felder, Richard M., and Rebecca Brent. 1996. “Navigating the Bumpy Road to Student-Centered Instruction.” College Teaching 44 (2):43–7. doi: 10.1080/87567555.1996.9933425.
  • Finelli, Cynthia J., and Maura Borrego. 2020. “Evidence-Based Strategies to Reduce Student Resistance to Active Learning.” In Active Learning in College Science: The Case for Evidence-Based Practice, edited by Joel J. Mintzes and Emily M. Walter. Cham: Springer.
  • Finelli, Cynthia J., Kevin Nguyen, Matthew DeMonbrun, Maura Borrego, Michael Prince, Jenefer Husman, Charles Henderson, Prateek Shekhar, and Cynthia K. Waters. 2018. “Reducing Student Resistance to Active Learning: Strategies for Instructors.” Journal of College Science Teaching 47 (5):80–91.
  • Finelli, Cynthia J., Matthew DeMonbrun, Prateek Shekhar, Maura Borrego, Charles Henderson, Michael Prince, and Cindy K. Waters. 2014. “A Classroom Observation Instrument to Assess Student Response to Active Learning.” In 2014 IEEE Frontiers in Education Conference (FIE) Proceedings. doi: 10.1109/FIE.2014.7044084.
  • Fredricks, Jennifer A., Phyllis C. Blumenfeld, and Alison H. Paris. 2004. “School Engagement: Potential of the Concept, State of the Evidence.” Review of Educational Research 74 (1):59–109. doi: 10.3102/00346543074001059.
  • Freeman, Scott, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth. 2014. “Active Learning Increases Student Performance in Science, Engineering, and Mathematics.” Proceedings of the National Academy of Sciences of the United States of America 111 (23):8410–5. doi: 10.1073/pnas.1319030111.
  • Haak, David C., Janneke HilleRisLambers, Emile Pitre, and Scott Freeman. 2011. “Increased Structure and Active Learning Reduce the Achievement Gap in Introductory Biology.” Science (New York, N.Y.) 332 (6034):1213–6. doi: 10.1126/science.1204820.
  • Harmin, Merrill. 1994. Inspiring Active Learning: A Handbook for Teachers. Alexandria, VA: ERIC.
  • Henderson, Charles, and Melissa H. Dancy. 2009. “Impact of Physics Education Research on the Teaching of Introductory Quantitative Physics in the United States.” Physical Review Special Topics-Physics Education Research 5 (2):020107.
  • Hora, Matthew T., J. Ferrare, and Amanda Oleson. 2012. “Findings from Classroom Observations of 58 Math and Science Faculty.” Madison, WI: Wisconsin Center for Education Research, University of Wisconsin-Madison.
  • Husman, Jenefer, Matthew Graham, Maura Borrego, Cynthia Finelli, Michael Prince, and Bobbie Bermudez. 2020. “Reducing Student Resistance to Active Learning: Development and Validation of a Measure.” In CANCELLED: American Educational Research Association Annual Meeting. San Francisco, CA: American Educational Research Association.
  • Jamieson, Leah H., and Jack R. Lohmann. 2012. Innovation with Impact: Creating a Culture for Scholarly and Systematic Innovation in Engineering Education. Washington, DC: American Society for Engineering Education.
  • Koch, Franziska D., Andrea Dirsch-Weigand, Malte Awolin, Rebecca J. Pinkelman, and Manfred J. Hampe. 2017. “Motivating First-Year University Students by Interdisciplinary Study Projects.” European Journal of Engineering Education 42 (1):17–31. doi: 10.1080/03043797.2016.1193126.
  • Lake, David A. 2001. “Student Performance and Perceptions of a Lecture-Based Course Compared with the Same Course Utilizing Group Discussion.” Physical Therapy 81 (3):896–902. doi: 10.1093/ptj/81.3.896.
  • Lucke, Terry, Peter K. Dunn, and Michael Christie. 2017. “Activating Learning in Engineering Education Using ICT and the Concept of ‘Flipping the Classroom’.” European Journal of Engineering Education 42 (1):45–57. doi: 10.1080/03043797.2016.1201460.
  • Lumpkin, Angela, Rebecca M. Achen, and Regan K. Dodd. 2015. “Student Perceptions of Active Learning.” College Student Journal 49 (1):121–33.
  • Machemer, Patricia L., and Pat Crawford. 2007. “Student Perceptions of Active Learning in a Large Cross-Disciplinary Classroom.” Active Learning in Higher Education 8 (1):9–30. doi: 10.1177/1469787407074008.
  • Mehmetoglu, Mehmet, and Tor Georg Jakobsen. 2016. Applied Statistics Using Stata: A Guide for the Social Sciences. London: Sage.
  • Meyers, L. S., G. Gamst, and A. J. Guarino. 2006. Applied Multivariate Research: Design and Interpretation. Thousand Oaks, CA: Sage Publications.
  • Miller, Cynthia J., and Michael J. Metz. 2014. “A Comparison of Professional-Level Faculty and Student Perceptions of Active Learning: its Current Use, Effectiveness, and Barriers.” Advances in Physiology Education 38 (3):246–52. doi: 10.1152/advan.00014.2014.
  • National Research Council. 2012. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, D.C.: National Academies Press.
  • National Science Foundation, and National Center for Science and Engineering Statistics. 2019. “Women, Minorities, and Persons with Disabilities in Science and Engineering: 2019.” In Special Report NSF 19-304. Alexandria, VA.
  • Nguyen, Kevin A., Maura Borrego, Cynthia J. Finelli, Matt DeMonbrun, Caroline Crockett, Sneha Tharayil, Prateek Shekhar, Cynthia Waters, and Robyn Rosenberg. 2021. “Instructor Strategies to Aid Implementation of Active Learning: A Systematic Literature Review.” International Journal of STEM Education 8 (1):1–18. doi: 10.1186/s40594-021-00270-7.
  • Nguyen, Kevin, Jenefer Husman, Maura Borrego, Prateek Shekhar, Michael Prince, Matt Demonbrun, and C. Waters. 2017. “Students’ Expectations, Types of Instruction, and Instructor Strategies Predicting Student Response to Active Learning.” International Journal of Engineering Education 33 (1):2–18.
  • Prince, Michael. 2004. “Does Active Learning Work? A Review of the Research.” Journal of Engineering Education 93 (3):223–31. doi: 10.1002/j.2168-9830.2004.tb00809.x.
  • Prince, Michael, Maura Borrego, Charles Henderson, Stephanie Cutler, and Jeff Froyd. 2013. “Use of Research-Based Instructional Strategies in Core Chemical Engineering Courses.” Chemical Engineering Education 47 (1):27–37.
  • Prince, Michael, and Richard M. Felder. 2006. “Inductive Teaching and Learning Methods: Definitions, Comparisons, and Research Bases.” Journal of Engineering Education 95 (2):123–38. doi: 10.1002/j.2168-9830.2006.tb00884.x.
  • Seltman, Howard J. 2012. Experimental Design and Analysis. Pittsburgh: Carnegie Mellon University.
  • Seymour, Elaine, and Nancy M. Hewitt. 1997. Talking about Leaving: Why Undergraduates Leave the Sciences. Boulder, CO: Westview Press.
  • Shekhar, Prateek, Matt DeMonbrun, Maura Borrego, Cynthia Finelli, Michael Prince, Charles Henderson, and Cynthia Waters. 2015. “Development of an Observation Protocol to Study Undergraduate Engineering Student Resistance to Active Learning.” International Journal of Engineering Education 31 (2):597–609.
  • Shekhar, P., M. Borrego, M. DeMonbrun, C. Finelli, C. Crockett, and K. Nguyen. 2020. “Negative Student Response to Active Learning in STEM Classrooms: A Systematic Review of Underlying Reasons.” Journal of College Science Teaching 49 (6):45–54.
  • Smith, Veronica, and LeeAnn Cardaciotto. 2011. “Is Active Learning like Broccoli? Student Perceptions of Active Learning in Large Lecture Classes.” Journal of the Scholarship of Teaching and Learning 11 (1):53–61.
  • Smith, Karl, Sheri D. Sheppard, David W. Johnson, and Roger T. Johnson. 2005. “Pedagogies of Engagement: Classroom‐Based Practices.” Journal of Engineering Education 94 (1):87–101. doi: 10.1002/j.2168-9830.2005.tb00831.x.
  • Stains, M., J. Harshman, M. K. Barker, S. V. Chasteen, R. Cole, S. E. DeChenne-Peters, M. K. Eagan, J. M. Esson, J. K. Knight, F. A. Laski, et al. 2018. “Anatomy of STEM Teaching in North American Universities.” Science (New York, N.Y.) 359 (6383):1468–70. doi: 10.1126/science.aap8892.
  • Tharayil, Sneha, Maura Borrego, Michael Prince, Kevin A. Nguyen, Prateek Shekhar, Cynthia J. Finelli, and Cynthia Waters. 2018. “Strategies to Mitigate Student Resistance to Active Learning.” International Journal of STEM Education 5 (1):7. doi: 10.1186/s40594-018-0102-y.
  • Theobald, Elli J., Mariah J. Hill, Elisa Tran, Sweta Agrawal, E. Nicole Arroyo, Shawn Behling, Nyasha Chambwe, Dianne Laboy Cintrón, Jacob D. Cooper, Gideon Dunster, et al. 2020. “Active Learning Narrows Achievement Gaps for Underrepresented Students in Undergraduate Science, Technology, Engineering, and Math.” Proceedings of the National Academy of Sciences of the United States of America 117 (12):6476–83. doi: 10.1073/pnas.1916903117.
  • United States Government Accountability Office. 2014. Science, Technology, Engineering, and Mathematics Education: Assessing the Relationship between Education and the Workforce (GAO-14-374).
  • Weimer, Maryellen. 2002. Learner-Centered Teaching: Five Key Changes to Practice. San Francisco, CA: John Wiley & Sons.
  • Weimer, Maryellen. 2012. Five Characteristics of Learner-Centered Teaching. Madison, WI: Faculty Focus.