
Flipping Statistics Courses in Graduate Education: Integration of Cognitive Psychology and Technology

Abstract

This article examines the integration of cognitive psychology research and technology within existing frameworks of statistics course design and implementation for a sequence of flipped graduate-level courses. Particular focus is placed on the use of the principles of spacing and retrieval practice within the flipped classroom format as strategic approaches to curriculum design and instructional delivery within and across courses. The reporting of student perceptions regarding their engagement in learning, statistical thinking and practice, and the course components that contributed to their learning serves to shed light on ways educators can bridge theory to practice in statistics education at the graduate level.

1 Introduction

A key emphasis in statistics education in higher education is identifying effective pedagogical approaches to improve students’ learning of statistics (Wild et al. Citation2004; Zieffler et al. Citation2008; Mills and Raju Citation2011; Tishkovskaya and Lancaster Citation2012). The impetus to improve statistics education stems from the importance of statistically literate citizens (Gal Citation2002), the relevance and utility of statistics across disciplines and professions (Lindsay, Kettenring, and Siegmund Citation2004), and students’ general lack of interest (Lee Citation2007; Keeley, Zayac, and Correia Citation2008) and negative attitudes (Mills Citation2004) toward statistics. The demand for qualified job applicants with the statistical knowledge and skills required in the workplace further reinforces continual reflection on ways to enhance statistics education (Lindsay, Kettenring, and Siegmund Citation2004; Brown and Kass Citation2009). In response, there are continued calls for changes in statistics curriculum and pedagogy to better equip students to understand and use statistics (e.g., Gal Citation2002; Ridgway Citation2016). For example, the Guidelines for Assessment and Instruction in Statistics Education College Report (GAISE 2016) provides a range of recommendations to improve students’ statistical thinking and literacy in the teaching of statistics in higher education. Correspondingly, over the past 20 years, research in cognitive psychology has garnered increased attention regarding techniques that can readily be transferred to statistics education to improve student learning (Lovett and Greenhouse Citation2000; Greenhouse and Seltman Citation2018), including the use of technology as a pedagogical strategy (Tishkovskaya and Lancaster Citation2012).

Research on statistics teaching and learning has predominantly focused on introductory statistics courses within undergraduate education (e.g., Smith Citation1998; Roseth, Garfield, and Ben-Zvi Citation2008; Zieffler et al. Citation2008; Carey and Dunn Citation2018; Cheng, Ferris, and Perolio Citation2018; Nielsen, Bean, and Larsen Citation2018; Songsore and White Citation2018). A less developed area of concentration is the teaching and learning of statistics in graduate education. Attention to this area is necessary as the number of adults 25 years and older with graduate degrees has increased from 7.8% in 1995 to 12% in 2015 (Baum and Steele Citation2017). Okahana and Zhou (Citation2018) reported an average annual increase of 4% in graduate applications from 2007 to 2017, with enrollment increases between Fall 2016 and Fall 2017 across diverse racial groups, particularly among Latinos (5.3%), Asian/Pacific Islanders (3.6%), and African Americans (3%). Furthermore, between Fall 2016 and Fall 2017, first-time graduate enrollment saw the largest one-year gains in the disciplines of mathematics and computer science (3.8%), followed by business (3.7%).

As applications to and participation rates in graduate programs increase, there is a need to examine approaches with the potential to enhance statistics education to meet students’ diverse learning needs. This is particularly critical since graduate curricula typically require students to complete a sequence of core statistics courses for degree attainment. Typically, these “service” statistics courses enroll students from diverse disciplines and backgrounds. Due to their integration in graduate curricula, statistics courses have a critically important role in graduate education in terms of their responsibility to develop students’ statistical thinking and ability to pursue independent research. For example, success in graduate school depends on students’ abilities to, among other things, critically evaluate empirical studies, generate testable research questions, and select and execute methods that will produce empirical findings to inform practice, theory, and research. While GAISE (2016) suggests that its recommendations “also apply to statistics beyond the introductory level” (p. 7), there is a noticeable literature gap on ways to enhance statistics education at the graduate level. Of interest to this article is the integration of cognitive psychology research and technology to enhance statistics teaching and learning in graduate education, specifically within a flipped classroom framework.

This article has four aims related to graduate-level statistics teaching and learning. First, I review existing frameworks of statistics course design, based on principles of learning (e.g., Lovett and Greenhouse Citation2000) and the use of memory-enhancement strategies (e.g., spacing) to improve student learning and retention of information. Second, I examine flipped classrooms as a strategic pedagogical approach to statistics education. Third, I explore the integration of the principles of cognitive psychology and flipped courses for the design and delivery of a sequence of graduate-level statistics courses. Last, I summarize student perceptions of the courses and their statistics learning, including their evaluation of course components perceived as useful to their learning. The article seeks to meet the call of Zieffler et al. (Citation2008) for faculty to contribute ideas to promote guidelines and suggestions for improving statistics teaching and learning.

The context of this article is a sequence of three flipped graduate-level statistics courses. These three-unit courses, which include applied (introductory) statistics, intermediate statistics, and multiple regression, are offered at a large, public metropolitan university in the east south-central region of the United States. It should become evident that the ideas of this article can apply to any specific course or sequence of courses. The first course, applied statistics, is a prerequisite, introductory-level graduate course required of all incoming education graduate students. The intermediate and multiple regression courses represent a set of core doctoral curriculum courses students are required to complete for their doctoral degree. Notably, instead of the multiple regression course, students can elect to enroll in a multivariate statistics course to meet course requirements. The applied statistics course introduces students to the principles and application of data visualization (e.g., graphs), descriptive statistics, and inferential statistics (e.g., t-tests, chi-square), and concludes with the one-way analysis of variance (ANOVA). Subsequently, the intermediate statistics course introduces students to ANOVA-based procedures for hypothesis testing (e.g., repeated measures, mixed design). Last, the multiple regression course familiarizes students with regression models appropriate to continuous and categorical outcomes, including testing mediation and moderation effects.

The remainder of the article is divided into sections that address the integration of cognitive psychology research and technology into the design and delivery of a series of flipped graduate-level statistics courses. Section 2 examines the principles of spacing and retrieval practice, including spaced retrieval practice, as strategies to improve students’ learning and retention of information. Section 3 identifies and describes the core features of flipped classrooms as a pedagogical approach, including the structure of the aforementioned statistics courses. Section 4 summarizes student perceptions of the flipped statistics classrooms and the resources perceived as useful to their statistics learning. The article concludes with recommendations for instructors on the integration of cognitive science and technology into statistics teaching and research.

2 Cognitive Psychology Principles and Student Learning

Over the past 20 years, there has been an increased emphasis on bridging the gap between cognitive psychology and education to improve students’ learning. Within statistics education, specifically, Lovett and Greenhouse (Citation2000) and, more recently, Greenhouse and Seltman (Citation2018) offered frameworks to guide the design and implementation of courses and programs to increase students’ attainment of the requisite skills to engage in statistical thinking and real-world problem-solving. These perspectives provide a general structure for statistics educators to implement and evaluate the effectiveness of diverse pedagogical approaches to improve students’ attainment of course learning objectives.

Lovett and Greenhouse (Citation2000) identified five empirically based principles of learning for educators to consider in the teaching and learning of statistics. These include: (1) optimal learning occurs when students practice and perform autonomously, (2) knowledge is contextually bound to where it was learned, (3) real-time feedback supports learning efficiency, (4) learning is based on the integration of existing and new knowledge, and, lastly, (5) mental overload decreases learning efficiency. These principles offer statistics educators a set of core pillars to guide course design and implementation.

More recently, Greenhouse and Seltman (Citation2018) articulated the application of the principles of learning to improve students’ development of the foundational (or, what they refer to as, component) skills necessary to engage in the complex tasks ubiquitous to applied statistics. Component skills represent the domain-specific skills students need to master and fluently orchestrate to accomplish complex tasks. For instance, hypothesis testing using multiple regression may represent a complex task for a first-year doctoral student that requires the effective execution of such skills as: (a) demarcating the independent and dependent variables and their levels of measurement, (b) data management and organization, (c) exploratory data analysis, (d) model specification and assessment, and (e) oral and written communication. Greenhouse and Seltman emphasize that content mastery thus requires that students be provided the opportunity to acquire the key skills to address a complex problem, engage in their repetitious use, and recognize when to use them. Drawing upon the literature on how experts approach solving problems, the authors propose nine component skills necessary for students to engage in the four stages of problem-solving used by experts: represent the problem, determine the solution, execute the strategy, and evaluate the results. For example, the two component skills corresponding to Stage 1 (i.e., represent the problem) are the ability to understand the problem and context and, second, to understand the variables and data structure. According to the authors, component skill development rests upon providing students the opportunity to engage repeatedly in deliberate practice and, second, to transfer skill use to address novel problems in a new context. A critical component of students’ skill acquisition is their exposure to instructional practices designed to promote long-term retention of information.

In recent years, cognitive researchers have made intentional efforts to pursue research that will yield substantively meaningful results with implications for improving student learning within classroom settings (e.g., Lyle and Crawford Citation2011; Dunlosky et al. Citation2013; Schwieren, Barenberg, and Dutke Citation2017). A review of this literature illuminates a common denominator across the empirical findings of over 100 years of learning and memory research: Repeated engagement with learned information has a positive effect on learners’ subsequent task performance and retention of information (e.g., Dempster Citation1989; Cepeda et al. Citation2006; Karpicke and Grimaldi Citation2012). The premise is that spacing a learner’s exposure to information over separate durations increases long-term retention. Comparatively, information is massed when it is presented within the confines of a designated timeframe (e.g., 1 day; Carpenter Citation2014). This is a typical scenario in education when students are exposed to, and assessed on (e.g., via a quiz), a particular unit or module of content (e.g., descriptive statistics) within an established allotment of time before progressing to the next lesson. Thus, students are more likely to learn and remember information if provided multiple opportunities to practice retrieving previously learned information. Carpenter (Citation2014) provides an illustrative example: “while learning about dependent-samples t-tests in a statistics course, students could receive a number of practice problems dealing with dependent-samples t-tests, in addition to some problems that cover the earlier-learned concepts of independent-samples t-tests and one-sample t-tests” (p. 138). As research in cognitive psychology emphasizes, statistics educators should intentionally select and use practices that will promote students’ engagement with, and retrieval of, previously learned information.
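To make Carpenter’s example concrete, the following minimal sketch shows the three t-test variants such a mixed practice set would interleave. Python and the toy scores are illustrative assumptions only (the courses described later use SPSS).

```python
# Illustrative mixed practice set: the three t-test variants named in
# Carpenter's example, run on made-up toy scores.
from scipy import stats

pre = [72, 75, 68, 80, 77, 74]          # same students, time 1
post = [78, 79, 70, 85, 80, 79]         # same students, time 2
comparison = [65, 70, 72, 68, 71, 69]   # an unrelated group

# One-sample t-test: does the group mean differ from a reference value?
print(stats.ttest_1samp(pre, popmean=70))

# Independent-samples t-test: do two unrelated groups differ?
print(stats.ttest_ind(pre, comparison))

# Dependent-samples (paired) t-test: did the same students change over time?
print(stats.ttest_rel(pre, post))
```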

Empirically supported techniques to improve students’ long-term retrieval of information have critically important implications for statistics education. Various techniques exist that can be initiated by students as they engage in their learning (e.g., highlighting/underlining meaningful text) or by instructors through their pedagogical practices. While a comprehensive review of available techniques is beyond the scope of this article (see Dunlosky et al. Citation2013, for a review), three approaches that have received particular attention among cognitive researchers and that can be readily integrated into existing classroom practices are presented. Collectively, they seek to improve students’ long-term retention of information through repeated exposure at separate intervals before an assessment of content mastery (e.g., final exam); however, they differ according to how the learner reengages with the information (e.g., testing, restudy) between the initial learning phase and the final assessment.

The first is the testing effect (or, retrieval practice; Roediger and Karpicke Citation2006; Karpicke and Roediger Citation2008), based on the salient finding that individuals are better able to remember information if it was tested (retrieved) during the initial stages of learning. According to Karpicke (Citation2012), retrieval-based learning hinges on two key ideas: retrieval is a central learning process and, second, retrieval does not constitute a neutral assessment of a learner’s acquired information. Instead, retrieval of information influences learning. There is a substantial body of empirical evidence on the positive effects of retrieval practice on long-term retention (e.g., Roediger and Karpicke Citation2006; Richland, Kornell, and Kao Citation2009; Karpicke and Roediger Citation2010; Karpicke and Zaromb Citation2010; Blunt and Karpicke Citation2014). For example, Lyle and Crawford (Citation2011) examined the effects of retrieval practice on the exam performance of undergraduate psychology students enrolled in an introductory statistics course section (Spring) who were administered post-lecture quizzes, compared to students enrolled in the course the previous semester (Fall) who were not. Results indicated that students exposed to the post-lecture quizzes obtained statistically higher exam scores than the control section and, correspondingly, perceived the post-lecture quizzes positively.

The second is the spacing effect, also referred to as distributed practice (e.g., Cepeda et al. Citation2008), in which the learner is exposed to new information, followed by a set time interval (e.g., 5 min, 1 week), and then engages in a repeated study session prior to another gap in time before an assessment of knowledge. The spacing gap refers to the time interval between the learner’s initial exposure to the information and the follow-up restudy session, whereas the test delay is the time interval between restudy and the final assessment. The idea is that individuals will begin forgetting learned information over time and spacing will benefit long-term retention of learned information due to re-exposure. A lag effect occurs when the spacing gap is altered, which may have a differential effect on learning (Cepeda et al. Citation2006). Research on the spacing effect extends over 100 years and has yielded consistently positive effects on long-term retention for various learning tasks (e.g., coordinated motor skills, mathematics tasks) across populations (e.g., elementary school-aged children, college students; see reviews by Cepeda et al. (Citation2006) and Carpenter et al. (Citation2012)). While the research is inconclusive regarding the optimal duration of the spacing gap between study sessions (Balota, Duchek, and Logan Citation2007), Carpenter et al.’s (Citation2012) synthesis of the literature suggests that, “any form of spacing—whether it is fixed or expanding—appears to promote learning” (p. 375). Interleaving and blocking represent alternative instances of spacing in which students are introduced to new information that is systematically interwoven with previously learned information (Carpenter Citation2014).
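To illustrate how the spacing gap and test delay interact, consider the following toy simulation. The exponential forgetting model, the strength-gain rule, and all parameter values are illustrative assumptions, not estimates drawn from the studies cited above.

```python
import math

def retention(strength, elapsed):
    """Exponential forgetting: probability of recall after `elapsed` days."""
    return math.exp(-elapsed / strength)

def study_schedule(review_days, test_day, gain=2.0):
    """Each review boosts memory strength; return retention at the test."""
    strength, last = 1.0, 0.0
    for day in review_days:
        # A review is assumed more potent after partial forgetting occurs,
        # which is what gives spaced reviews their advantage here.
        strength *= 1 + gain * (1 - retention(strength, day - last))
        last = day
    return retention(strength, test_day - last)

# Massed: three reviews on day 0; spaced: reviews on days 0, 3, and 7.
print("massed:", study_schedule([0, 0, 0], test_day=30))
print("spaced:", study_schedule([0, 3, 7], test_day=30))
```

Under these assumptions, the spaced schedule yields markedly higher retention at the day-30 test, mirroring the spacing effect described above.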

Spaced retrieval practice represents the final memory-enhancement technique considered and takes into account the collective contribution of both retrieval practice and spacing to improve the long-term acquisition of information. It is based on the premise that students will retain information for longer periods if they engage in repeated retrieval practice than if they simply restudy (Hopkins et al. Citation2016, Citation2018). Evidence on the effects of spaced retrieval practice on undergraduate students’ academic performance has begun to emerge, shedding light on its effectiveness as a memory-enhancement technique in classroom settings (e.g., Butler et al. Citation2014). Hopkins et al. (Citation2016) reported that undergraduate engineering students assigned to a spaced retrieval practice condition (i.e., completing questions aligned to target objectives across quizzes administered throughout the semester) scored statistically higher than massed-condition students on a cumulative exam in a precalculus engineering course. The study also examined whether this promoted students’ long-term retention of precalculus knowledge by comparing students’ performance across three assessments administered the following (Spring) semester in an engineering calculus course. Although no statistically significant score differences were found across spaced and massed conditions on the first and third unit exams, students in the spaced condition scored statistically higher on the cumulative exam. In a subsequent study, Hopkins et al. (Citation2018) reported that undergraduate engineering students exposed to spaced retrieval practice in a Fall semester precalculus engineering course scored statistically higher on a precalculus knowledge assessment administered in a subsequent Spring semester engineering course compared to students in the massed condition (i.e., no re-exposure to information). Butler et al. (Citation2014) reported that a homework-based intervention combining spaced retrieval with other cognitive science principles (e.g., spacing, feedback) produced a medium effect on exam performance among undergraduate engineering students in an upper-level course. The accumulating research suggests that the combination of retrieval practice and spacing has positive effects on students’ academic performance.

The association between reengaging with information and long-term retrieval corresponds with Greenhouse and Seltman’s (Citation2018) assertion that automaticity of statistical thinking and practice requires deliberate practice and opportunities to transfer acquired skills to new contexts. While we certainly may hope that students engage in study behaviors that facilitate retention of learned material, unfortunately, this is commonly not the case (Karpicke Citation2009; Karpicke, Butler, and Roediger Citation2009). Consequently, as educators, we must consider strategies that fit within our classroom practices to improve student learning and engagement with statistics. Indeed, cognitive researchers have offered practical recommendations to facilitate students’ retention of information (e.g., Pashler et al. Citation2007; Carpenter et al. Citation2012; Dunlosky et al. Citation2013). However, these recommendations typically target pre-Kindergarten through grade 12 or undergraduate education and, thus, consideration of their relevance and practicality to graduate education is warranted.

3 Flipped Classrooms as Pedagogical Approach

Flipped classrooms have increasingly gained attention as a pedagogical approach to teaching statistics in higher education (e.g., Schwartz Citation2014; Winquist and Carlson Citation2014; Peterson Citation2016; Nielsen, Bean, and Larsen Citation2018). Within this structure, students engage with course material via computer-mediated instruction before class (e.g., video lectures). Subsequently, within the face-to-face classroom, instructors serve as facilitators to promote students’ higher-level understanding of material through collaborative, problem-solving activities, and to address misconceptions (Hamdan et al. Citation2013). As a student-centered approach to teaching, the flipped classroom embodies the principles of cognitive theories of learning (e.g., constructivism) in that students are repeatedly exposed to learned information and actively involved in their knowledge construction and acquisition. More specifically, within the constructivist approach to learning, students first develop their conceptual understanding of material online outside of class and then apply their knowledge and skills within the classroom (Lawson Citation2002).

While their appearance can vary, Hamdan et al. (Citation2013) identify four pillars of flipped classrooms. First, they represent a flexible environment in which a course’s structure can change to meet students’ learning needs. Second, a change in the learning culture occurs, with students taking more responsibility for their learning before class so that, subsequently, face-to-face time focuses on supporting their higher-level understanding of learned material through active learning. Third, they require educators to use intentional content aligned to lesson and course learning objectives to support students’ acquisition of knowledge and skills. Last, their success rests upon professional educators who can identify, implement, and evaluate pedagogical strategies that support students’ attainment of learning outcomes.

Research on flipped classrooms has yielded a range of findings regarding their effects on student learning. Within statistics education, empirical results have produced generally favorable findings regarding the effectiveness of flipped classrooms to improve student performance (e.g., Winquist and Carlson Citation2014; Peterson Citation2016; Nielsen, Bean, and Larsen Citation2018), including higher course evaluation ratings for flipped courses compared to traditional courses. While encouraging, the broader cross-disciplinary research has largely generated inconclusive findings regarding the effectiveness of flipped pedagogical approaches to produce measurable learning gains (see Presti Citation2016, for a review in nursing education).

The efficacy of flipped courses to produce desired outcomes hinges on the quality of their design and implementation. While examples of the development of flipped statistics courses have emerged (e.g., Schwartz Citation2014), there is a literature gap regarding their structure and delivery within the context of graduate education, specifically with regard to statistics courses. This is particularly the case regarding the use of cognitive psychology research as a blueprint for course design and implementation.

Over the past three years, a sequence of three graduate-level “service” statistics courses (3 credit hours each) delivered in a college of education was flipped by the author for students pursuing doctoral degrees. Originally, each course met once a week for 2.5 hr over a traditional 16-week academic semester. After flipping a particular course, however, each week students engaged with the online instruction for one hour prior to the face-to-face session and, subsequently, had 1.5 contact hours in the classroom. Collectively, the courses comprise the core curriculum that doctoral students must complete at a satisfactory level (earning a grade of C or higher) for degree requirements, and they typically enroll 15–20 part- and full-time graduate students from diverse disciplines (e.g., nursing, education). Notably, the applied statistics course typically enrolls both masters and doctoral degree-seeking students, whereas the upper-level courses enroll only doctoral students. Among faculty, the expectation is that the courses will familiarize students with statistical concepts and develop the data analytic skills they need to engage in applied research.

The first course, applied statistics, aims to promote students’ understanding of the concepts and principles of basic statistics. The subsequent course, intermediate statistics, serves to develop students’ knowledge and use of ANOVA procedures for hypothesis testing. Finally, applied multiple regression seeks to provide students a practical and conceptual introduction to the use of regression-based techniques to pursue social science research. The desire to maximize class time for collaborative, active learning, together with the time constraints of a traditional course format, provided the impetus for flipping the courses.

Within this course format, students are responsible for completing content-specific instructional modules (e.g., descriptive statistics, factorial ANOVA) via computer-mediated instruction before class and, subsequently, participating in collaborative, active-learning activities in the face-to-face classroom. Table 1 identifies and offers a brief description of each core course component.

Table 1 Core instructional components of flipped classrooms.

Instructional modules represent the core instructional delivery method of each course and generally consist of 2–4 video lectures (15 min each) for each unit topic. The videos are PowerPoint-based narrated lectures created using screen-capturing software (e.g., Camtasia, Screencast-O-Matic). In a three-part video series, for example, Part 1 provides an overview of the statistical procedure, its purpose, and, as relevant, model assumptions (e.g., equal variances). Subsequently, Part 2 walks students through the mathematical formula underlying the analysis (e.g., variance components of ANOVA). Last, Part 3 provides an applied example of the particular analysis based on a small dataset to demonstrate how to conduct the analysis and interpret results. Each module also includes an Online and Reading folder containing links to relevant websites or downloadable articles (e.g., empirical, instructional) and resources (e.g., “how-to” documents). Each instructional module includes a set of measurable learning objectives designed to develop the narrow foundational, or component, skills needed to engage in critical statistical thinking and practice.

Each instructional module includes an objectively scored quiz (5 or 10 items) to assess students’ content mastery. Specifically, the quizzes seek to assess factual and conceptual understanding of information from the current module and include questions aligned with previously learned information to promote long-term information retrieval (i.e., retrieval practice). For example, a one-way ANOVA quiz will include questions related to an independent groups t-test and practices shared across analyses for hypothesis testing (e.g., effect size reporting). Prior to Spring 2019, students had one chance to complete the quizzes. However, based on course feedback and my desire to have the quizzes reinforce learning, students now have two quiz attempts, with the higher grade counting toward their final course grade. After their first attempt, students obtain immediate feedback on their score and submitted answers to determine whether they want to take the quiz a second time. After the quiz due date (start of class), students can review their submitted and the correct answers to obtain additional feedback on their performance.
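As a rough sketch of how such a quiz blends new and previously learned material, the following samples items from hypothetical item banks; the bank contents, sizes, and sampling function are invented for illustration, since the actual quizzes are authored in the course management system.

```python
import random

# Hypothetical item banks keyed by instructional module.
item_banks = {
    "independent_t": [f"it_q{i}" for i in range(1, 5)],
    "effect_size": [f"es_q{i}" for i in range(1, 4)],
    "one_way_anova": [f"aov_q{i}" for i in range(1, 9)],
}

def build_quiz(current, prior_modules, n_current=7, n_prior=3, seed=None):
    """Mix items on the current module with items that retrieve prior ones."""
    rng = random.Random(seed)
    quiz = rng.sample(item_banks[current], n_current)
    # Retrieval practice: a few items reach back to earlier modules.
    prior_pool = [q for m in prior_modules for q in item_banks[m]]
    quiz += rng.sample(prior_pool, n_prior)
    rng.shuffle(quiz)
    return quiz

# A 10-item one-way ANOVA quiz with three items on earlier t-test content.
print(build_quiz("one_way_anova", ["independent_t", "effect_size"], seed=1))
```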

Corresponding to each instructional module is a 15-min video tutorial demonstrating the use of a statistical software package (i.e., SPSS) to conduct the analysis and interpret the results. Students are provided the dataset (used within the instructional videos) and are expected to use it as a resource in class when completing the assignment.

Assignments accompany each instructional module and require students to apply the learned analysis to an instructor-provided dataset. Assignments incorporate the principles of cognitive psychology by including questions that require students to reengage with previously learned information to apply the new material to a novel problem. For example, an assignment on the two-way ANOVA requires students to identify a hypothesis based on their own interests that aligns with the analysis, including the independent and dependent variables, levels of measurement, and null and alternative hypotheses. Transfer of skills occurs as students generate a new hypothesis to frame the analysis with the provided data. Within and across courses, students must engage in exploratory data analysis (e.g., inspecting boxplots), data management and manipulation (e.g., recoding), and analyses appropriate for testing model assumptions. Assignments are group-based (2–3 students), made available one week prior to class, and should be completed, or mostly completed, before students attend class. Subsequently, within the face-to-face session, the assignments provide the basis for in-class collaborative activities and discussions. As such, the assignments seek to provide students the opportunity to present their steps of data analysis, interpret results, and offer feedback to reinforce learning and identify misconceptions.
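To sketch the analytic workflow such an assignment targets, the following runs a two-way (factorial) ANOVA in Python on fabricated scores; the courses themselves use SPSS, so this is only a hypothetical parallel of the same steps (inspect cell means, fit the factorial model, test main effects and the interaction).

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Fabricated data: instructional method by enrollment status on a test score.
df = pd.DataFrame({
    "method": ["lecture"] * 6 + ["flipped"] * 6,
    "status": (["part_time"] * 3 + ["full_time"] * 3) * 2,
    "score": [70, 72, 68, 75, 77, 74, 78, 80, 76, 85, 88, 84],
})

# Exploratory step: inspect cell means before modeling.
print(df.groupby(["method", "status"])["score"].mean())

# Fit the factorial model with both main effects and their interaction.
model = ols("score ~ C(method) * C(status)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```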

Each course includes a final paper structured as an empirical quantitative study. Specifically, for the applied and intermediate statistics courses, students are provided a real dataset accompanied by a set of broad research questions to guide their development of testable hypotheses and selection of appropriate statistical analyses. For the multiple regression course, students must obtain their own data aligned to their own interests. In most instances, students obtain data from their faculty advisor or through publicly available resources (e.g., National Center for Education Statistics). As a guide to facilitate their academic writing, students receive an article template with key topic headings (e.g., Methodology, Results). The final paper thus provides students the opportunity to transfer their learning to a novel problem within a new context, promoting their application of statistics and developing their quantitative research skills.

Participation is the foundational element across courses and dictates instructional delivery to meet students’ learning needs. For each instructional module, and before each face-to-face session, students engage with the material in multiple and diverse ways: video lectures, quizzes, and assignments. Subsequently, in the face-to-face component, students engage in collaborative, active learning to share their understanding of information, provide feedback, and address misconceptions. The nature of the group-based assignments encourages active student collaboration within and outside of class in multiple formats. In-class, whole-group discussions offer students rich opportunities to reinforce their own knowledge while simultaneously providing immediate feedback on existing misconceptions or uncertainties related to the steps of data management, data analysis, and reporting.

4 Merging Cognitive Psychology Research and Technology in Course Design

This section illustrates the integration of the principles of spacing and retrieval practice, including their combination via spaced retrieval, into course design and delivery within and across courses. Notably, there is no clear demarcation of the point at which students can be said to be using one technique over the other. As a case in point, spacing may occur as students repeatedly engage with each instructional module before, during, and after face-to-face sessions to reinforce their learning. It could also occur when students reengage with course material from the introductory statistics course to remember how to create dummy codes in the upper-level multiple regression course. Notably, as reported in the literature, there is no optimal amount of spacing, and so the extent to which students engage and reengage with material will ultimately depend on course learning objectives and instructor expectations. As illustrated in this section, instructors can use a host of strategies to encourage students to repeatedly engage with course material and retrieve previously learned information. Thus, this section details the construction of courses in consideration of the previously mentioned memory-enhancement techniques.

Across courses, students begin their learning of statistics through their engagement with the first instructional module. Except for the introductory (applied) statistics course, the first instructional module for each upper-level course intentionally overlaps with the final module of the preceding course to initiate students’ retrieval of learned information. For example, the introductory course concludes with the one-way ANOVA, which is also the first module of the intermediate statistics course. Likewise, in the multiple regression course, the general linear model is the first instructional module and focuses on the use of the one-way ANOVA and multiple regression to test group mean score differences using examples from the published literature (e.g., Nelson and Zaichkowsky Citation1979). For the upper-level courses, the use of overlapping modules explicitly offers students the opportunity to retrieve previously learned information and reflect on their knowledge so that they have a foundation on which to organize the encoding and acquisition of new material. Because the instructional modules are delivered online and completed before class, students are able to come into the first class session with a general understanding of their questions and misconceptions, irrespective of previously completed statistics courses. The autonomous nature of the instructional modules enables their use across courses so that students can restudy previously learned information.
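For instance, the dummy-coding task mentioned above could be restudied with a short sketch like the one below; the variable names and data are hypothetical, and the courses demonstrate the equivalent steps in SPSS.

```python
import pandas as pd

# Hypothetical categorical predictor to be dummy coded for regression.
df = pd.DataFrame({"degree": ["masters", "doctoral", "masters", "specialist"]})

# k - 1 indicator columns; the dropped level serves as the reference group.
dummies = pd.get_dummies(df["degree"], prefix="degree", drop_first=True)
print(pd.concat([df, dummies], axis=1))
```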

Presentation of topics via instructional modules encourages students to engage with each content domain no fewer than five times, including when they: (1) review video lectures and complete readings, (2) complete the quiz, (3) complete the assignment on their own or with partner(s) prior to class, (4) engage in collaborative active learning in face-to-face sessions, and, lastly, (5) finalize homework to submit for grading. Beyond the integration of spacing, both retrieval practice and spaced retrieval are inherent properties of curriculum design and delivery through the module quizzes and corresponding assignments. For example, the within-groups ANOVA quiz will have items assessing students’ understanding of this analysis (e.g., sphericity) and a subset of questions on related (e.g., dependent groups t-test) or different (e.g., one-way ANOVA) analyses. Each quiz offers students two attempts, with immediate feedback offered after each administration to provide information on their learning to guide their studying behavior and address misconceptions. Specifically, feedback after the first attempt includes their responses and total score (to decide whether they want to retake the quiz to obtain a higher score), whereas second-attempt feedback includes submitted and correct answers.
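A minimal sketch of these two-attempt feedback rules follows, with hypothetical responses and an invented answer key; any learning management system would implement this logic internally.

```python
def feedback(attempt_number, responses, score, answer_key):
    """Attempt 1: responses and score only; attempt 2: correct answers too."""
    report = {"responses": responses, "score": score}
    if attempt_number >= 2:  # correct answers revealed after second attempt
        report["correct_answers"] = answer_key
    return report

key = {"q1": "B", "q2": "D", "q3": "A"}
first = feedback(1, {"q1": "B", "q2": "C", "q3": "A"}, 2, key)
second = feedback(2, {"q1": "B", "q2": "D", "q3": "A"}, 3, key)

# The higher of the two attempts counts toward the final course grade.
final_grade = max(first["score"], second["score"])
print(first, second, final_grade, sep="\n")
```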

Assignments provide another approach to sustain students’ use of spaced retrieval to improve their course learning. Assignments are typically started after the module quiz and serve to bridge students’ conceptual understanding and application of information within a collaborative, problem-based learning environment. Students have the option to complete the assignments in small, collaborative teams (2–3 students) or independently. For any given assignment, questions require students to apply their learning to their own research interests and transfer it to solving novel problems. For example, the first question across assignments asks students to specify a research question based on their own interests that can be empirically addressed with the procedure presented in the instructional module. Therefore, the two-way (factorial) ANOVA assignment requires each student to specify a question with two categorical independent variables and a continuous outcome. Based on an instructor-provided dataset, subsequent questions focus on students retrieving prior and newly acquired knowledge to engage in applied data analysis to solve a set of problems. In this way, assignments seek to embrace Greenhouse and Seltman’s (Citation2018) tenet that students need to engage in deliberate practice and have opportunities to transfer knowledge to novel settings to develop their statistical knowledge and practice. Therefore, before students enter the classroom, they have had multiple opportunities to engage with the material, obtain immediate feedback on their performance, and apply learned information to their own interests and a new problem.

The motivation for flipping these courses was to create interactive learning environments that would encourage students to be active participants in their own learning, as well as in their peers’ learning, in ways that extend beyond the traditional classroom. Due to repeated exposure to instructional modules prior to face-to-face sessions, students enter the class cognizant of their questions, areas in need of clarification, and misconceptions. This allows class time to focus on the application of learned material based on the assignment and accompanying dataset. After students share their conceptual understanding of the material, they break out into their collaborative teams to review their work, identify questions, and offer feedback to one another. From the instructor’s perspective, this time affords the opportunity to do a quick check-in with each team to identify overarching concerns or questions prior to the whole-class discussion. Therefore, working through statistical formulas is de-emphasized and, instead, the focus is on conceptual understanding and application of information. This includes, among other activities, students sharing their hypotheses of interest and exploring the appropriateness of the focal analysis for addressing the question. Students then identify and discuss their approaches to data management, analysis, and reporting of results for the provided dataset. This includes, for example, identifying and describing the hypothesis used to frame the analysis, decisions based on exploratory data analysis (e.g., identification of outliers), model assumptions, and steps for hypothesis testing. Notably, while the described approach has merit for courses that enroll non-statistics or mathematics majors, courses that enroll statistics/biostatistics students may directly benefit from using face-to-face sessions to scrutinize formulas. Thus, a defining attribute of flipped courses is that they offer instructors a flexible structure in which to design courses to meet students’ learning outcomes.

Although not a requisite, meeting in a computer lab facilitates the collaborative learning environment since students can demonstrate their steps of data analysis and information used for guiding their decisions related to hypothesis testing. Over time, this structure has resulted in students taking the lead in discussions, posing questions to one another, and providing feedback to address misconceptions.

Greenhouse and colleagues specify key attributes of students’ learning environments that improve their learning and practice of statistics. Indeed, their frameworks offer statistics educators a general structure in which to approach course design and delivery, providing rich opportunities for instructors to implement instructional strategies aligned to the learning needs of their students. Among these, memory-enhancement techniques offer statistics educators a useful set of simple, easy-to-use approaches to promoting students’ engagement in their learning. The efficacy of statistics courses to promote students’ attainment of course learning objectives depends, nonetheless, on the extent to which students perceive these courses as effective in meeting their learning needs.

5 Student Perceptions

From Spring 2016 to Fall 2018, student feedback was gathered to guide course design and implementation, including: (a) course elements that supported their learning, (b) course engagement, and (c) course effectiveness. The majority of student comments were gathered using instructor-developed questions. However, a subset of items to assess course engagement was drawn from Lage, Platt, and Treglia (Citation2000), whereas several open-ended questions were selected from Davenport (Citation2018) to gather data on students’ perceptions of the flipped course structure (e.g., perceived support). For brevity, and due to the consistency of results across courses, I report aggregate results for the collective set of courses. Across courses, students reported spending an average of 12.14 hr per week (SD = 4.28, range: 5–20) on the course outside of the classroom.

Table 2 reports descriptive statistics and the percentage of students rating the perceived usefulness of core course components. Course elements receiving the highest rating (i.e., Extremely Useful) aligned with improving students’ statistical practice, including: assignments (69.3%), SPSS software (51.6%), in-class discussions (50%), and SPSS video tutorials (44.6%). These were followed by course components designed to improve students’ conceptual understanding, including video lectures (35.9%) and online resources (21.3%). Students were more varied in their ratings of the course elements of textbooks and, not surprisingly, quizzes. Despite being a prominent course component, quizzes were rated slightly below Moderately Useful, perhaps due to their objective-scoring nature and contribution to students’ course grades. The varied ratings for textbooks were expected since the courses rely exclusively on articles and instructor-generated resources rather than a particular textbook. Overall, 78.6% of students rated the supplemental articles, which focused on topics related to statistical practice (e.g., effect size reporting), as being Moderately Useful (48.2%), Very Useful (25%), or Extremely Useful (5.4%).

Table 2 Student ratings of course elements.

Table 3 reports descriptive statistics of students’ perceptions of classroom engagement. Items were selected from Lage, Platt, and Treglia (Citation2000) and, to date, administered in only one academic semester (Fall 2018) for two sections of the intermediate statistics course. Ratings were provided on a Likert scale (i.e., 1 = Strongly Disagree to 5 = Strongly Agree) and indicated that students perceived their course engagement positively. Specifically, students reported a preference for the flipped classroom format over a “traditional” lecture, and Agreed that they learned more statistics with this classroom format. Notably, students indicated that they would prefer to take their other statistics courses in the flipped course format and felt the assignments illustrated basic statistics concepts. Average ratings fell just below Agree regarding whether they enjoyed working on group assignments and learned from working in groups in class. This finding may be due, for example, to varying levels of preparation or contribution among team members. Notably, despite being responsible for their learning prior to class, students did not generally report that there was too much work to do outside of class. Students only generally agreed that they worked more in these courses than in their other classes during the semester.

Table 3 Descriptive statistics of student perceptions of course engagement.

Table 4 summarizes student ratings of the degree to which the courses improved their statistics knowledge, confidence, and practice. With the exception of confidence, student ratings indicated that they felt the courses promoted their knowledge and critical thinking about statistics. In particular, students perceived the courses as most effective in promoting their understanding of the use of statistics in research and practice, followed by knowledge of statistics and critical thinking about statistics, respectively. Based on Fall 2018 data collected within the intermediate statistics courses, students’ perceptions fell just below Agree regarding the courses’ effectiveness in promoting their confidence to use statistics in research and practice.

Table 4 Descriptive statistics of student ratings of course effectiveness.

The course feedback form included four open-ended questions to offer students the opportunity to provide more in-depth feedback on their perceptions of the courses. Of particular interest were responses indicative of students’ use of various memory-enhancement strategies and the extent to which the flipped classroom met their learning needs. Summaries of student responses are reported according to the themes of each question and are accompanied by representative comments.

The first question asked students to identify what they liked about the overall course. Over a third of the 39 responses (36%) alluded to the flipped course structure and pedagogy. In particular, the autonomous nature of learning was noted: “I enjoy the flipped classroom model, as it is much easier for me to understand the concepts when I learn on my own then in class.” Additional aspects of the courses included working through assignments in class (21%) and the practicality (15%) of the courses. Overall, students positively referenced feeling challenged or empowered to learn the material prior to the actual in-person course. Students also referenced the way in which collaborative learning contributed to course engagement. For example, it was noted that, “I like having the opportunity to work through assignments with the professor and classmates, which makes the course more engaging and improves my understanding.” Relatedly, another comment pertained to the courses’ practicality: “I like the practical nature of the course—using datasets and learning how to use SPSS as if we have our own research project.” Additional aspects of the course identified were the video lectures (10%), course pace (10%), and professor (8%). Evidence of the videos inducing spacing is shown in the comment, “I love the video modules. I watch them along with the notes more times than I would like to admit.” Due to the flipped nature of the courses, students noticed that I was able to be more responsive to their questions outside of class by providing timely feedback. Specifically, it was noted, “I appreciate the responses we got from the professor through the email. He always responded to me in a timely manner and appreciated our inquiries whatever it was, very useful indeed.” As with any course, timely feedback is a key factor in offering students feedback on their learning and addressing their misconceptions. Invariably, students’ ratings of the course instructor will depend on a number of factors including, for example, instructor availability, the student–instructor relationship, and instructor expectations.

The question asking students to identify aspects of the courses that went well resulted in three main themes: collaborative active-learning activities, flipped course design, and online videos and lectures. Nearly half of the responses referenced the in-class collaborative active-learning activities. For instance, it was noted that, “The group work was very effective, since we were doing it anyway. It helps to gain a better understanding and have support in the blended model to work with others;” another stated, “All face-to-face meetings were very helpful. Being able to get feedback and instant answers to questions/struggles was key to me learning stats.” As such, responses generally referenced the ability to ask questions and get immediate feedback. Students also identified the flipped course design (16%) and the video and online lectures (16%) as positive aspects of the courses. For example, “The flipped method of teaching was valuable to me because it allowed me to be semi knowledgeable about the topic before class so I could ask more application based questions.” In consideration of the videos, “I enjoyed the videos. I could view them at my pace, pause, and rewind as needed. It helped me grasp difficult concepts.” Last, 6% identified some other aspect of the course, including the availability of the professor, supplemental learning opportunities, and the ability to network.

Students offered a range of suggestions when asked how the courses could be improved. Course design and structure (27%) were most often identified as areas for improvement. Notably, these comments did not specifically reference the flipped course design as an improvement area. The online lectures were a less-preferred approach to learning for some students, whereas others suggested making improvements to the face-to-face component of the courses. Suggestions included conducting more hand calculations as a group and finding ways to improve the link between the online and in-class components of the courses. For example, in terms of the online instruction, “As someone who absorbs more information from hands-on learning and class discussion, I am having a very difficult time comprehending material learned in online lectures alone.” As this comment suggests, students with little to no exposure beyond a traditional course structure may find the nature of the flipped classroom challenging. Quizzes (22%) and assignments (17%) were also identified as areas for improvement. This included students requesting that in-class content be more directly linked to the assignments. Furthermore, other students reported they would have benefited from clearer expectations for the homework, such as guidance related to formatting, presentation, and tables. The final trend identified involved SPSS training, with 12% of respondents suggesting a new style for teaching the software. For example, a student indicated the need for additional videos on using SPSS. Only 5% reported there were no areas for improvement.

For each course, students identified several changes to the flipped format to improve its effectiveness in meeting their learning needs. Based on the responses of 27 students, roughly one-third (31%) identified content delivery, with a range of recommendations for improvement. This included, for example, the general flipped course design (e.g., videos and online lectures) and, in other cases, students feeling pushed outside of their comfort zone within a new learning environment. The following comment highlights the complexity associated with having both online and face-to-face course components: “I totally get why you do the flipped class and maybe it’s because I’m not used to it, but I would often feel so lost until getting the clarifying discussions in class.” In another instance, utilizing the quizzes as more of a learning opportunity in class was noted: “I would like to see more time spent reviewing the quizzes to ensure that all students are on the same page with instructor during classroom lecture.” Another area identified for improvement, noted by 19% of the students, was reducing the amount of work. Students generally commented on a sense of imbalance in workload, with some students carrying the weight for the group. In one instance, a student stated, “The only negative for me was the partner aspect. My partner did not contribute to any assignments except in answering the individualized questions (usually first and last). I’m not sure she even had access to SPSS.” Other areas included textbooks (9%), quizzes (6%), and “other” (13%). For example, in consideration of the textbook, “I am not sure that the textbook is necessary given the videos and the supplemental readings on Blackboard.” Examples of “other” responses included changing the syllabus or grading system. Notably, areas of improvement were associated with aspects of the courses that extended beyond the specific flipped course structure.

Overall, 85% of the students responded “yes” when asked if they felt they had sufficient support from the instructor, classmates, and course materials. When asked “why,” students generally alluded to the online videos and lectures and the availability of the professor. For example, a student noted, “I think I was sufficiently supported. I know for a fact if we didn’t start the assignments in class together I would have completely been useless and this class would have been far more stressful,” and another stated, “Yeah, it seemed mostly cohesive. I think the PowerPoints could be more fleshed out with details said in the video as most of the slides contained info for what SPSS was doing instead of what I would be doing.” In terms of the online videos, it was stated that, “I relied HEAVILY on the videos and resources in Blackboard to get through the assignments and to prepare for class. I had to re-watch and reread things multiple times to feel ready for the quizzes. Because I relied so heavily on the above-mentioned support, I am not sure how confident I am to do some of this on my own without that assistance yet.” One student’s perspective related to the in-class discussions was, “The professor provided ample opportunities to discuss and get feedback on assignments. Classmates were less helpful because of their laypersons’ understanding of the subject matter.” The latter comment can be attributed to the nature of “service” statistics courses, in which students enter the class with varying levels of experience with, and expectations of, statistics.

The final question asked students whether they believed class time was used effectively to help them learn and whether the flipped classroom technique was more effective than a traditional lecture. In total, 73% of students responded “yes,” with students providing some general thoughts on their experiences within the flipped classroom environment. Students noted both positives and challenges associated with learning statistics within a flipped course structure. Overall, learning statistics within a flipped course was viewed positively in terms of how it supported their learning. This is reflected in the following comment regarding the way the course allowed them to reflect on their learning: “I prefer the flipped classroom as it gives me time to process info in class to work on assignments, and I can learn the concepts at my own pace and focus on the homework in class.” The implementation of the flipped classroom was also noted: “I think the flipped classroom is a very effective use of time and it can be more effective than the traditional lecture IF you do the work before you get to class.” Nonetheless, this structure will invariably be beneficial to some students and not others. One beneficial point of view was, “The flipped classroom was ideal for me. It made the information easier to digest in smaller doses. It also encouraged me to be ready with questions at the beginning of class.” Conversely, it may not work for all: “Personally, I struggled with the flipped classroom technique but I understand how it could be helpful for other people. I think I’d rather get a lecture and hear directly from you how you want things to be done before I try to venture out and practice how to do things on my own.” Indeed, the challenge for statistics educators seeking alternative approaches to course design and delivery is meeting the diversity of beliefs and expectations of the students who come into the class.

6 Conclusion

Cognitive psychology and technology have much to contribute to the ways in which statistics educators approach course design and delivery. Articulated frameworks for how students learn and develop expertise offer instructors useful guidelines for course design and delivery, whereas emerging research in cognitive psychology sheds light on relatively simple and efficient strategies for engaging students in their learning and improving their long-term retention of learned information. The integration of memory-enhancement techniques into these broader frameworks has much potential to transform not only how students engage in their statistics education but also how well they retain this information for future use. A particular challenge to this endeavor, nonetheless, is the limited availability of examples of how these strategies can be incorporated within or across existing courses or used in course (re)design.

The past decade has seen tremendous attention to the ways in which the flipped classroom, as a pedagogical approach, can optimize students’ classroom engagement and performance. Although research on the effectiveness of flipped courses in producing desired outcomes has yielded mixed results, much of this research has been based on recently flipped courses. While there are challenges associated with the development of any course, flipping a new or existing course is a particularly complex process that requires instructor dedication (e.g., technological adeptness), time and resources, and students willing to learn in a nontraditional format. As such, the effectiveness of flipped courses will hinge on the degree to which students feel their learning needs are met and the course components help them in their learning.

In the author’s experience in teaching and course design, providing students the opportunity to give feedback and evaluate course components has proven invaluable for ongoing course design and modification. Although this sequence of courses has been offered for just over two years, evaluation of student data serves as an impetus for my reflections as an instructor on course design and delivery. Coinciding with course development and implementation is the need to gather specific data on students’ perceptions of the effectiveness of the cognitive psychology principles (e.g., spaced retrieval) in improving their retention of statistics concepts and practices. To date, data collection has predominantly focused on gathering student data to guide course design and development. However, data from university-administered course evaluations suggest students have responded favorably to these efforts to improve course quality and effectiveness through course flipping. In particular, in my first year of flipping the courses, my university course evaluations increased by over one point (on a five-point scale), with both quantitative and qualitative data reflecting students’ favorable ratings.

Notwithstanding the contributions of cognitive psychology and technology to statistics education, there are clear limitations and challenges to this work. First, the statistics courses described in this article were sequenced, well-defined, and flipped by the author. While this structure affords students consistency across courses in terms of structure (e.g., Blackboard) and instructor familiarity (e.g., expectations), a different set of courses and/or instructors would certainly be expected to influence students’ learning experiences. With less sequenced courses, for example, instructors may want to align the courses’ learning objectives and scaffold assignments to capitalize on students’ prior learning experiences. Correspondingly, statistics educators interested in integrating cognitive psychology principles into existing courses, or in flipping new or existing courses, should approach the process cautiously and systematically. For instance, the first stage of course flipping was designing and creating the instructional modules for the introductory course, which provided a template for the subsequent courses. Thus, this approach to course design and delivery should be expected to take more time than teaching a traditional course due to a host of factors, including, for example, technology availability and support, curriculum development, and aligning assessments to learning objectives. Last, the data presented were gathered from students who were exposed only to the flipped course structure. Future research is needed on the effects of integrating cognitive psychology practices into flipped courses on student learning outcomes, relative to the same practices implemented within a traditional course (i.e., a comparison group).

There is tremendous opportunity to advance statistics education to meet the learning needs of students across diverse disciplines and to promote their statistical thinking and practice. Advances in cognitive science and technology have much to offer statistics educators in designing courses that help students become effective consumers of the statistical information they will encounter in practice and research. However, the effectiveness of these advances in promoting intended outcomes will depend on the course features and students’ learning experiences. As demonstrated in this article, student feedback (typically beyond a university-level course evaluation) is necessary to determine whether our course design intentions and efforts are having the desired effect on students. As the student data reported here illustrate, course design and delivery is an iterative process, not a one-size-fits-all solution. Nonetheless, the more knowledgeable we are about the ways in which students learn and the technological tools available, the better positioned we will be to help students learn.
