Original Article

Evidence‐based teaching: Tools and techniques that promote learning in the psychology classroom

Pages 5-13 | Received 16 Oct 2012, Accepted 02 Dec 2012, Published online: 20 Nov 2020

Abstract

Evidence‐based teaching (EBT) entails the use of empirically validated pedagogical tools and techniques that promote student learning. We offer a rationale for why psychology instructors should embrace EBT in their classrooms. We then review five areas of evidence offering specific tools and techniques that improve learning and retention: the testing effect, spaced learning, metacognition, writing to learn, and interteaching. We then briefly discuss how three self‐regulated choices that students make can promote learning. Finally, we urge psychology teachers and students to use the discipline's experimental findings to enhance student learning.

Higher education (HE) continues to be buffeted by evidence that undergraduate students are not learning what they need to know to do well in university settings and subsequently in the workplace and in life. Arum and Roksa (2011), for example, found that even after three semesters, American college students' critical thinking, complex reasoning, and writing skills showed little improvement. Clearly, students need guidance to acclimate to and succeed in college. But what sort of guidance? More to the point, how can teachers use their knowledge of psychology to help students retain what they have learned?

One possible antidote is to use evidence‐based teaching (EBT) practices in the classroom. The term evidence‐based teaching refers to pedagogical tools and techniques that have been shown through rigorous experimentation to promote learning (Saville, 2010; Schwartz & Gurung, 2012). In this article, we offer a rationale for why psychology teachers around the world should embrace EBT in their classrooms. Indeed, we believe the concerns and recommended responses raised in this article are relevant both within and beyond our own context, the American HE system. We then review selected areas of evidence, that is, specific tools and techniques that enhance learning. Next, we turn the intellectual tables a bit by discussing how students can use evidence‐based learning (EBL) strategies outside of the classroom and how self‐regulation is important for learning. Finally, we encourage psychology teachers to adopt EBT in their classrooms.

The Case for EBT

Why use EBT methods? We believe that the answer is straightforward: These methods have been shown through rigorous empirical testing to produce superior learning outcomes. And the need for improved learning outcomes may be greater now than ever before.

Over the past few decades, there has been increased concern among academicians that the level of learning occurring in secondary and post‐secondary educational settings is not what it should be. For example, 30 years ago, Cameron (1983) lamented, ‘Most predictions about the future of colleges and universities as organisations include conditions of decline’ (p. 359). More recently, Hersh and Merrow (2005) seemed to confirm Cameron's pessimistic prediction by stating, ‘Higher education, long viewed as the crown jewel of American education, is tarnished’ (p. 1). Simply put, by many accounts, students do not seem to be learning what they should during an important developmental period in their lives.

Concerns about HE have moved beyond academic circles as well. For example, parents paying for their children's increasingly expensive educations want to be sure that they are getting a good return on their investments. Similarly, employers want to know that recent graduates have acquired the knowledge and skills necessary to help advance their business interests and economic goals. Others, including economists and politicians, view education as one of the primary paths to economic security, where increased knowledge seems to be a vital factor in achieving global success (e.g., Friedman, 2005; see also Buskist & Groccia, 2011).

So what are the possible reasons for the purported decline in HE? Hersh and Merrow (2005) discussed several potential causes: students who are unprepared for the rigours of HE, the belief that HE is nothing more than a path to obtaining a job, the mismatch between what students want out of HE and what educational institutions are offering, and HE administrations that have become too focused on monetary outcomes. Robinson (2009) has also suggested that the recent trend towards systematising curricula and assessment has killed much of the motivation and creativity that are necessary for students to succeed in today's world. To this list, we would like to add one other potential culprit: ineffective teaching methods.

For many HE instructors, ‘teaching’ means ‘lecturing’. Moreover, as Buskist and Groccia (2011) noted, ‘Most teachers base their instructional practices on tradition, the opinion of experienced practitioners, ideology, faddism, marketing, politics, or personal experience gained through trial and error’ (p. 5). Unfortunately, many of these practices are not consistent with what is known about human learning (Halpern & Hakel, 2003; Saville, 2010). Consequently, many of the methods that appear in classrooms may not be producing the types of learning that are necessary to succeed in a rapidly changing world. Although many of the factors mentioned by Hersh and Merrow (2005) may be beyond the reach of any single teacher, it is important for teachers to control what they can—and one such variable is the teaching methods they adopt in their classrooms. In short, to help produce the types of learning outcomes that will allow students to succeed in today's world, it is vital that teachers adopt methods that improve student learning and enjoyment—in other words, EBT.

Areas of Evidence: A Review

In the sections that follow, we discuss several teaching strategies and techniques that have been demonstrated to be effective. Although we do not provide an exhaustive review of EBT, the techniques we highlight include methods that are relatively easy to implement in most psychology classrooms.

The testing effect

Although most faculty and students view tests as a way to assess learning, fewer realise that taking tests actually enhances memory (e.g., Karpicke, Butler, & Roediger, 2009). A burgeoning body of research provides convincing evidence for the testing effect, or test‐enhanced learning, the finding that students remember more information when they have previously been tested on study materials. Recent research has also highlighted the effectiveness of repeated testing in promoting the transfer of learning to new contexts (e.g., different types of testing or different knowledge domains; Carpenter, 2012).

Testing effect research encompasses a wide range of test formats. The types of tests used for both learning and final assessment have included both recognition and recall formats. Targeted information has included paired associates or word lists, information from lectures (Butler & Roediger, 2007), and expository prose (e.g., Larsen, Butler, & Roediger, 2009). Sometimes students have received feedback during the learning phase; in other cases, they have not (e.g., Pashler, Cepeda, Wixted, & Rohrer, 2005; Vojdanoska, Cranney, & Newell, 2010). Testing situations have involved both open‐book and closed‐book conditions (Agarwal, Karpicke, Kang, Roediger, & McDermott, 2008). Although some research has included the same material for learning and testing, other studies (e.g., Chan, McDermott, & Roediger, 2006) have used related but different material on initial and final tests. Retention intervals have extended from one day to several days or weeks. Results vary depending upon the variables being manipulated; for example, as might be expected, feedback enhances subsequent performance on items that students initially answer incorrectly (Pashler et al., 2005; Vojdanoska et al., 2010). Overall, the preponderance of evidence shows strong support for test‐enhanced learning and remembering, particularly when instructors provide feedback and adequate time for processing it (Butler & Roediger, 2007).

Explanations for the benefits of testing often focus on the processes underlying retrieval practice. Retrieval of previously studied information strengthens existing associative memory links between related cues and the targeted information (Carpenter, 2011). In turn, stronger links facilitate subsequent retrieval of that information. Furthermore, Karpicke and Blunt (2011) have suggested that retrieval practice helps differentiate more effective versus less effective cues. Both explanations offer a clear rationale for the use of repeated testing to enhance retention of and access to knowledge.

The addition of more frequent testing in the classroom seems to be the most obvious way to take advantage of the testing effect. Indeed, Leeming (2002) found that students in courses including a short exam at the beginning of each class scored higher on a final exam than students in courses including only three exams. Of course, instructors' decisions to incorporate additional testing may depend on how much class time they have available or may be constrained by institutional guidelines. In such instances, instructors may develop assessments (either formative or summative) for students to complete online or may direct students to self‐tests with immediate feedback on textbook companion websites. For example, Yandell and Bailey (2011) concluded that online, pre‐class, computer‐graded quizzes that provide immediate feedback not only ‘save’ class time but also encourage students to read regularly and participate in class. Another formative option is to assess an entire class's topical knowledge simultaneously by having students respond with clickers so that right (and wrong) responses are recognised and subsequently discussed. As another alternative, instructors could explain the benefits of using EBL strategies such as repeated testing and then encourage students to test themselves while studying. For example, McDaniel, Howard, and Einstein (2009) reported that students who used the read‐recite‐review technique boosted performance on a free recall test of textbook material more than students who spent an equivalent amount of time rereading or taking notes. In a similar vein, instructors who provide learning outcomes for each chapter or unit might suggest that students respond to each outcome after studying and then check their own answers or have their answers checked by peers.
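To make this classroom application concrete, the following is a minimal sketch (in Python) of the kind of self‐test routine an instructor might point students towards: it poses questions, gives immediate feedback, and re‐queues missed items for another retrieval attempt. The example questions and the re‐queueing rule are illustrative assumptions, not materials taken from any of the studies cited above.

```python
# Illustrative sketch only: a tiny self-quiz loop with immediate feedback.
# The questions and the re-queueing rule are hypothetical examples.
from collections import deque

questions = [  # (prompt, accepted answer) pairs -- replace with real course items
    ("Term for improved retention caused by prior testing?", "testing effect"),
    ("Practice spread over multiple sessions is called ___ practice.", "distributed"),
]

def self_quiz(items):
    queue = deque(items)
    while queue:
        prompt, answer = queue.popleft()
        response = input(prompt + " ").strip().lower()
        if response == answer:
            print("Correct.")                        # immediate feedback
        else:
            print(f"Not quite; the answer is: {answer}")
            queue.append((prompt, answer))           # missed items get another retrieval attempt

if __name__ == "__main__":
    self_quiz(questions)
```

The point of the sketch is simply that each item is answered from memory before feedback appears, which is the retrieval‐then‐feedback cycle the research above describes.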

Clearly, how instructors design their courses will depend on a number of factors. Nevertheless, the available evidence for the testing effect should encourage instructors to consider integrating more frequent testing or other activities that require retrieval into their courses. As Karpicke (2012) argued, ‘Spending time actively attempting to retrieve and reconstruct one's knowledge is a simple yet powerful way to enhance long‐term, meaningful learning’ (p. 162).

Spaced learning

‘With any considerable number of repetitions a suitable distribution of them over a space of time is decidedly more advantageous than the massing of them at a single time’ (Ebbinghaus, 1885, ch. 8, sec. 34, para. 6). Evidence supporting the spacing effect emanates from an expansive body of research with a variety of populations including animals (Sisti, Glass, & Shors, 2007), young children (Vlach & Sandhofer, 2012), older adults (Wahlheim, Dunlosky, & Jacoby, 2011), and HE students. With HE students, the majority of researchers have identified the benefits of distributed practice as it pertains to memory for verbal material. In most cases, the study items involved word lists or word pairs such as those used in foreign language learning (Bahrick & Hall, 2005), but memory for prose material also benefits from distributed practice (Rawson & Kintsch, 2005), particularly when the test is delayed rather than given immediately after learning. Moreover, the value of distributed practice also extends to conceptual learning (Wahlheim et al., 2011). Thus, over a century of research has revealed that studying information or practising skills in multiple short sessions yields greater long‐term retention than consolidating study or practice into a single long session (i.e., ‘cramming’) for a comparable amount of time.

There are multiple explanations for why the spacing effect occurs. One explanation is based on the idea that people are more likely to retrieve information if the information is accessible via a wide variety of retrieval cues. Because the context for encoding information often differs from one study session to another, multiple study sessions provide more retrieval cues (Cepeda, Pashler, Vul, Wixted, & Rohrer, 2006). Another explanation emphasises encoding when studying textual material for a second time. In a massed situation, with the second reading immediately following the first, information from the initial reading may still be active in working memory. The active information conveys a sense of familiarity that leads people only to skim the material. In contrast, with distributed practice, the information is ‘deactivated’ with the passage of time. Essentially, then, people adopt more effective encoding strategies that promote greater retention (Krug, Davis, & Glover, 1990).

Yet a third explanation for the spacing effect focuses on neurobiological processes relating to memory consolidation. Two aspects of consolidation appear to relate to the greater effectiveness of distributed practice. First, consolidation throughout the memory system takes time (Squire & Alvarez, 1995). Second, when people process material for a second time, the new memory trace builds on the strength of the first one in long‐term memory (Cepeda et al., 2006). When practice is distributed, information from the initial session has time to strengthen before relearning.

Even with a relatively large literature on the spacing effect, there are still many unanswered questions about the optimal time between practice sessions and how the time between sessions relates to the delay between final study and test. For example, Bahrick and Hall (2005) suggested that longer intervals between study sessions are beneficial because longer intervals provide learners with a better opportunity to identify retrieval failures, which may encourage learners to adopt more effective encoding strategies (see, e.g., Rawson & Kintsch, 2005). Cepeda et al. (2009) found that the spacing between study sessions associated with the best test performance increased as the interval between the final study session and the test increased. However, a longer gap between study sessions is not always better. For example, in an experiment (Verkoeijen, Rikers, & Özsoy, 2008) involving two study opportunities for textual material spaced immediately after one another, 4 days apart, or 3.5 weeks apart, students who studied 4 days apart performed better on a test 2 days later than students who restudied the text immediately. Students in the 3.5‐week and immediate conditions, however, performed similarly. In this case, the inability to retrieve material 3.5 weeks later appeared to outweigh the potential benefit of differential encoding.

Given the multiple factors influencing the spacing effect, how might instructors use this research to benefit their students? Because some form of distributed practice consistently boosts learning, one application is clear: Instructors should encourage students to use another EBL strategy, specifically, spacing study sessions rather than cramming before examinations. Although students may already understand the advantage of spaced practice (Toppino & Cohen, 2010), understanding its value is quite different from actually using it. Perhaps one way for instructors to boost long‐term retention with spaced practice is to introduce the same key concepts systematically at multiple points in the semester. Another possibility might involve including more frequent quizzes (either formative or summative) to promote more frequent study while simultaneously allowing students to accrue benefits from the testing effect.
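As a rough illustration of what spacing study sessions might look like in practice, the sketch below generates review dates for a topic using a simple expanding‐interval rule. The specific 1/3/7/14/30‐day gaps are assumptions chosen for demonstration, not the empirically optimal spacing, which, as noted above, depends on factors such as the retention interval.

```python
# Illustrative sketch only: generate review dates for a topic with a simple
# expanding-interval rule. The 1/3/7/14/30-day gaps are assumed for demonstration.
from datetime import date, timedelta

def review_schedule(first_study: date, gaps_in_days=(1, 3, 7, 14, 30)):
    """Return the dates on which a topic should be restudied or self-tested."""
    schedule, current = [], first_study
    for gap in gaps_in_days:
        current = current + timedelta(days=gap)
        schedule.append(current)
    return schedule

# Example: a topic first studied on 1 March is revisited five more times,
# each gap longer than the last, instead of being crammed into one sitting.
for when in review_schedule(date(2013, 3, 1)):
    print(when.isoformat())
```

The design choice being illustrated is simply that practice is distributed across sessions rather than massed, with review opportunities spread out over the weeks before the test.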

Metacognition: Thoughts about thinking

One of the most important skills students can acquire is knowing what they already know. In effect, students need to be able to think critically about their thinking, or what psychologists call metacognition. Our metacognitions can affect both what we do and how well we accomplish our actions; in fact, thinking about thinking can heighten, diminish, or change the direction of our original thoughts (e.g., Briñol & DeMarree, 2012). Metacognitive skills develop in childhood and represent a change from other‐directed learning to self‐regulated learning (Dimmitt & McCormick, 2012).

One approach to promoting metacognitive learning (another form of EBL) involves teaching students to develop procedural knowledge regarding particular academic tasks, such as outlining, drafting, and revising a persuasive essay on a specific course topic. More critical, reflective note taking (e.g., rewriting and revising notes after a class period ends), too, can promote learning and retention (Dunn, 2011). Strategies for performing such tasks should be taught and repeated with some regularity so that students learn to use these skills in other classes. A key issue here is that teachers must help students learn to become responsible where intellectual tasks are concerned (Baker, 2008). In effect, students need to develop self‐assessment skills regarding their execution of college‐level work (Dunn, McEntarffer, & Halonen, 2004). Part of the effort to develop metacognitive skills entails ‘self‐talk’, where students literally talk themselves through the steps necessary to solve some problem (e.g., crafting a logical argument, defending a position opposite one's beliefs, and writing a literature review for an empirical paper). Further research on metacognition in educational settings can be found in Dimmitt and McCormick (2012). Metacognitive skills can be fostered through a variety of techniques, and it is worth noting that helping students identify what they do and do not know is, at the very least, an implicit goal of the other strategies we discuss here.

Writing to learn

The development of writing skills is an important goal of undergraduate education. Psychology instructors typically assess student performance on writing assignments to check understanding of course content, as well as to evaluate skills in written expression. Although demonstrating content knowledge and writing skills are traditional uses of writing assignments, less recognised is the role that writing can play in student learning.

Writing involves multiple activities that engage metacognitive processes (Bereiter & Scardamalia, 1987; Berninger, 2012). The ‘writing‐to‐learn’ approach is based on the conception that writing about a topic can help students identify areas of confusion or lack of knowledge, reason through problems, and bring concepts together in new ways. As Forsyth (2003) pointed out, writing assignments can be ‘exercises in learning’ (p. 114) that help students ‘identify and define problems, evaluate evidence, analyse assumptions, recognise emotional reasoning and oversimplification, consider alternative interpretations, and reduce their uncertainty’ (p. 115). Dunn (2011) discussed writing for self‐understanding, highlighting the role that writing can play in helping students actively process and learn course material. Writing‐to‐learn strategies can be implemented in multiple ways. A hallmark of all writing‐to‐learn activities, however, is the use of targeted writing assignments that require students to apply, integrate, or reflect on some content knowledge.

Positive effects of writing‐to‐learn activities have been demonstrated across several disciplines, using different techniques. For example, Balgopal, Wallace, and Dahlberg (2012) have examined the effects of guided writing assignments (reflective essays) on introductory biology students' ecological literacy. Papadopoulos, Demetriadis, Stamelos, and Tsoukalas (2011) used ‘writing prompts’ to enhance learning in an online computer science class. They found that students in a writing group scored higher on post‐tests than students in either of two control groups. Stewart, Myers, and Culley (2010) examined the use of in‐class writing in a psychology of women course. During several class periods, students wrote short, structured essays designed to elicit critical thinking about topics in the course. Students were later tested on material covered in these writing assignments, and their scores were compared with those of students in a control psychology of women class that covered the same material but did not experience the writing assignments. Those in the writing‐focused class scored higher on these subsequent learning assessments.

Other strategies that require students to think about and reflect on content often include a writing component. For example, Connor‐Greene (2005) introduced ‘questions, quotations, and talking points’ (QQTPs) to facilitate meaningful class discussion. With QQTPs, students complete a one‐page paper prior to class in which they identify a question, a quotation, and some talking points based on a reading assignment. They then use these papers as a basis for class discussion. Angelo and Cross (1993) also presented multiple classroom assessment techniques (e.g., minute papers, muddiest point, word journal, invented dialogues) that involve writing. Any of these activities may be useful in increasing student learning.

Evaluating students' writing can take a lot of time, which is enough to make many instructors shy away from multiple writing assignments. But as Dunn (1994) pointed out, writing‐to‐learn activities need not require an inordinate amount of feedback from faculty to be useful (e.g., see Korn & Sikorski, 2010). For example, Nevid, Pastva, and McClelland (2012) found that brief ungraded writing assignments in introductory psychology improved test performance on the same material. Ultimately, frequent writing may be one way to improve student learning.

Interteaching

Interteaching (Boyce & Hineline, 2002) is a multi‐component teaching method that has its roots in B. F. Skinner's behavioural psychology. Although behavioural teaching methods have been around for over 50 years (e.g., Skinner, 1954), interteaching is relatively new. A growing body of research suggests that interteaching is a viable alternative to more traditional teaching methods (see Saville, Lambert, & Robertson, 2011, for a review).

In a typical interteaching session, the instructor first prepares a preparation (prep) guide, which contains questions (typically from 5 to 12 items) designed to guide students through a reading assignment. The prep‐guide items contain both lower and higher level questions that require students to define, analyse, apply, and synthesise course information. The instructor usually distributes the prep guide several days before it is due, and students complete it before class.

Each class period starts with a brief lecture in which the instructor reviews material from the previous class (see later for more information). Students then spend most of the remaining time in pairs discussing their responses to the prep‐guide items they answered before class. During the discussions, the instructor circulates among the groups, answering questions and guiding the discussions. Following the discussions, students complete a record sheet on which they list, among other things, which prep‐guide items were difficult to understand and which items they would like the instructor to review. Based on this information, the instructor prepares a brief lecture that targets the items that most students had trouble understanding. The lecture starts the next class period and precedes students’ discussion of the next prep guide.
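As an illustration of how an instructor might process the record sheets, the sketch below tallies the prep‐guide items that student pairs flagged as difficult and selects the most frequently flagged items for the next clarifying lecture. The input format and the ‘top three items’ cut‐off are assumptions for illustration, not prescribed components of interteaching.

```python
# Illustrative sketch only: tally record-sheet responses to choose which
# prep-guide items to review in the next brief lecture. The data format and
# the "top three" cut-off are assumptions, not part of Boyce and Hineline's method.
from collections import Counter

# Each inner list holds the prep-guide item numbers one pair flagged as difficult.
record_sheets = [
    [2, 5],
    [5, 7],
    [2, 5, 9],
    [7],
]

def items_to_review(sheets, top_n=3):
    counts = Counter(item for sheet in sheets for item in sheet)
    return [item for item, _ in counts.most_common(top_n)]

print(items_to_review(record_sheets))   # e.g., [5, 2, 7] -> focus of the next lecture
```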

In addition to this general structure, interteaching has other components. For example, Boyce and Hineline (2002) suggested that instructors should give at least five exams during the semester, which gives students ample opportunity to ‘show what they know’ and allows them to get used to interteaching. Boyce and Hineline also suggested that the exams should be closely linked to the prep guides. As a result, students know that completing the prep guides and having substantive class discussions will provide practice for the exams. To improve discussion quality, Boyce and Hineline introduced the concept of quality points, where part of a student's course grade (up to 10%) depends on how his or her discussion partners answered some of their exam questions (but see Saville and Zinn, 2009, for a study showing no impact of quality points on exam grades). Finally, Boyce and Hineline suggested that students should receive a small number of participation points each day (totalling 10% of their course grades), which increases the likelihood that students will attend class.
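The sketch below shows one way these grading components could be combined into a course grade. The 10% weights for quality points and for participation follow the description above; the 80% exam weight, the maximum of 10 quality points, and the scoring details are assumptions for illustration rather than Boyce and Hineline's specification.

```python
# Illustrative sketch only: combine exam scores, quality points, and participation
# into a course grade (as a percentage). The 10% weights for quality points and
# participation follow the text above; the 80% exam weight and scoring details
# are assumptions.
def course_grade(exam_scores, quality_points_earned, days_attended,
                 total_class_days, max_quality_points=10.0):
    exam_component = sum(exam_scores) / len(exam_scores) * 0.80            # mean exam %, 80% of grade
    quality_component = (quality_points_earned / max_quality_points) * 10.0  # up to 10%
    participation_component = (days_attended / total_class_days) * 10.0      # up to 10%
    return exam_component + quality_component + participation_component

# A student averaging 85% across five exams, earning 8 of 10 quality points,
# and attending 27 of 30 class meetings:
print(round(course_grade([82, 88, 84, 86, 85], 8, 27, 30), 1))   # -> 85.0
```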

Since Boyce and Hineline's (2002) introduction of interteaching, a growing body of research has examined its efficacy relative to more traditional teaching methods. In an early, lab‐based study of interteaching, Saville, Zinn, and Elliott (2005) assigned students to one of four teaching conditions: interteaching, lecture, reading, or control. Students took part in an initial teaching session and then returned 1 week later to take a quiz. Students who took part in interteaching answered significantly more questions correctly than students in the other three conditions. In a subsequent study, Saville, Zinn, Neef, Van Norman, and Ferreri (2006) compared interteaching with lecture in a graduate‐level special education course (study 1) and in two sections of an undergraduate research methods course (study 2). In both studies, students earned higher exam scores following interteaching sessions; students also reported that they preferred interteaching to lectures. More recently, researchers have found that interteaching seems to improve critical thinking (Saville, Zinn, Lawrence, Barron, & Andre, 2008; Scoboria & Pascual‐Leone, 2009) and that it benefits students with both low and high grade point averages (Saville, Pope, Truelove, & Williams, in press). Finally, although most research on interteaching has been conducted in psychology courses, researchers have begun to examine interteaching in non‐psychology courses (e.g., Cannella‐Malone, Axe, & Parker, 2009; Emurian & Zheng, 2010; Goto & Schneider, 2010; Tsui, 2010; Zeller, 2010). Together, these results suggest that interteaching is an effective, evidence‐based alternative to more traditional teaching methods.

Reminding Students of the Link Between Self‐regulation and Learning

As Quality Principle 1 of the Principles for Quality Undergraduate Education in Psychology (American Psychological Association [APA], 2011; Halpern et al., 2010) attests, students are ultimately responsible for their own learning. Although teachers of psychology should share information with students regarding what types of EBL strategies promote or inhibit learning, students must learn to engage in self‐regulation to advance their own academic interests. We review three choices that students can make (and that teachers may wish to review) that can have a positive effect on learning.

Pauses that refresh: Pastoral breaks

Researchers who study Attention Restoration Theory claim that people demonstrate better concentration after spending time in nature (e.g., in parks or gardens), seeing nature from indoors (e.g., looking out a window at a copse of trees), or viewing images of natural settings (e.g., Hartig, Evans, Jamner, Davis, & Gärling, 2003; Kaplan, 1995; Kaplan & Kaplan, 1989). Experiencing nature directly or indirectly has decided cognitive and emotional benefits. One set of studies, for example, found that exposure to such restorative environments eased people's recovery from mental fatigue caused by performing a sustained attention test, thereby restoring their later capacities to direct attention (Berto, 2005). Other research has demonstrated that interacting with nature improves cognition and emotion among individuals who are depressed (Berman et al., 2012) and improves memory in HE students (Berman, Jonides, & Kaplan, 2008).

The ‘take‐home’ message for students is simple and straightforward: Taking pastoral breaks by going outside or looking at images of nature is physically, intellectually, and emotionally restorative. Students can reduce their stress and enhance their attention by taking frequent study breaks, and teachers should encourage their students to do so.

Sufficient sleep

Lack of sleep is a well‐documented psychological problem (Dement & Vaughn, 2000). Undergraduate life is rife with student stories of ‘pulling all‐nighters’ to study for an exam or to complete some other assignment. At the same time, many students enjoy recreational pursuits because of the relative freedom that universities allow. In short, sleep deprivation is a hallmark of undergraduate students' experiences.

Unfortunately, when students become sleep deprived, physical and mental performance suffers (Walker & Stickgold, 2004). The great challenge for instructors, of course, is convincing students that they cannot ‘get by’ and be successful with less than the recommended amount of sleep per night, as many students believe that their own experiences defy the available evidence.

Becoming open minded regarding basic study habits

Students often claim to know which approaches to studying work well, at least in their individual experiences. The chief problem is that few students ever subject their suppositions to simple experimentation (e.g., trying a new study habit to test its effectiveness). Consider this example: Students often claim that studying while listening to music is not detrimental to their learning (in spite of the fact that research does not support this conclusion; e.g., Gurung, Weidert, & Jeske, 2010). Thus, we advocate that teachers not only share the available evidence but that they encourage students to test their assumptions.

Gurung and McCann (2012) provided a long list of actions students can take to assume responsibility for and to maximise their learning. Many of the items found in this list (e.g., review assignments before class and use study guides) seem commonsensical, but as Voltaire (1764/1950) reminds us, ‘common sense is not so common’ (p. 79). Sharing such basic recommendations and explaining their effectiveness can prompt students to try out and perhaps adopt some evidence‐based, self‐regulated learning suggestions.

Conclusion

As most educators will attest, teaching is a tricky business—not only because of complex factors that occur within the classroom but also because of external factors that, for better or worse, have a significant impact on our educational system. Unfortunately, by many accounts, these factors have not produced the most positive results. But there is hope. Psychology is one of the few disciplines in which teachers and students can use what they know about their own subject matter to produce positive changes. One specific way to do this is for teachers to incorporate EBT in their classrooms. At the same time, teachers can show their students how to use EBL strategies. In both cases, these applications are likely to have a positive effect on psychological literacy, a desired outcome in undergraduate psychology education (e.g., APA, 2011). In this article, we have discussed a handful of empirically supported techniques—EBT and EBL strategies—that teachers of psychology can incorporate into their classrooms and that students can use on their own. Of course, the introduction of new and unfamiliar teaching methods may be met with some scepticism. ‘Certainly’, as Saville (2010) noted, ‘implementing any new teaching method can be tricky and sometimes frustrating, especially in classroom settings where students have grown accustomed to lectures’ (p. 51). But with consistent use and a little tweaking to fit each specific situation, the application of EBT and learning strategies is likely to have a positive and extended impact on student learning and enjoyment.

References

  • Agarwal, P. K., Karpicke, J. D., Kang, S. H. K., Roediger, H. L., III, & McDermott, K. B. (2008). Examining the testing effect with open‐ and closed‐book tests. Applied Cognitive Psychology, 22, 861–876.
  • American Psychological Association (2011). Principles for quality undergraduate education in psychology. American Psychologist, 66, 850–856. doi:https://doi.org/10.1037/a0025181
  • Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey‐Bass.
  • Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago, IL: University of Chicago Press.
  • Bahrick, H. P., & Hall, L. K. (2005). The importance of retrieval failures to long‐term retention: A metacognitive explanation of the spacing effect. Journal of Memory and Language, 52, 566–577. doi:https://doi.org/10.1016/j.jml.2005.01.012
  • Baker, L. (2008). Metacognition in comprehension instruction: What we've learned since NRP. In C. C. Block & S. R. Parris (Eds.), Comprehension instruction: Research‐based best practices (2nd ed., pp. 65–79). New York: Guilford Press.
  • Balgopal, M. M., Wallace, A. M., & Dahlberg, S. (2012). Writing to learn ecology: A study of three populations of college students. Environmental Education Research, 18, 67–90.
  • Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Hillsdale, NJ: Lawrence Erlbaum.
  • Berman, M. G., Jonides, J., & Kaplan, S. (2008). The cognitive benefits of interacting with nature. Psychological Science, 19(12), 1207–1212. doi:https://doi.org/10.1111/j.1467‐9280.2008.02225.x
  • Berman, M. G., Kross, E., Krpan, K. M., Askren, M. K., Burson, A., Deldin, P. J., … Jonides, J. (2012). Interacting with nature improves cognition and affect for individuals with depression. Journal of Affective Disorders, 140, 300–305. doi:https://doi.org/10.1016/j.jad.2012.03.012
  • Berninger, V. W. (Ed.) (2012). Past, present, and future contribution of cognitive writing research to cognitive psychology. New York: Psychology Press.
  • Berto, R. (2005). Exposure to restorative environments helps restore attentional capacity. Journal of Environmental Psychology, 25(3), 249–259. doi:https://doi.org/10.1016/j.jenvp.2005.07.001
  • Boyce, T. E., & Hineline, P. N. (2002). Interteaching: A strategy for enhancing the user‐friendliness of behavioral arrangements in the college classroom. Behavior Analyst, 25, 215–226.
  • Briñol, P., & DeMarree, K. G. (2012). Social metacognition. New York: Psychology Press.
  • Buskist, W., & Groccia, J. (Eds.) (2011). Evidence‐based teaching: New directions in teaching and learning. San Francisco, CA: Jossey‐Bass.
  • Butler, A. C., & Roediger, H. L., III (2007). Testing improves long‐term retention in a simulated classroom setting. European Journal of Cognitive Psychology, 19, 514–527.
  • Cameron, K. (1983). Strategic responses to conditions of decline: Higher education and the private sector. Journal of Higher Education, 54, 359–380.
  • Cannella‐Malone, H. I., Axe, J. B., & Parker, E. D. (2009). Interteach preparation: A comparison of the effects of answering versus generating study guide questions on quiz scores. Journal of the Scholarship of Teaching and Learning, 9, 22–35.
  • Carpenter, S. K. (2011). Semantic information activated during retrieval contributes to later retention: Support for the mediator effectiveness hypothesis of the testing effect. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 1547–1552. doi:https://doi.org/10.1037/a0024140
  • Carpenter, S. K. (2012). Testing enhances the transfer of learning. Current Directions in Psychological Science, 21, 279–283. doi: https://doi.org/10.1177/0963721412452728
  • Cepeda, N. J., Coburn, N., Rohrer, D., Wixted, J. T., Mozer, M. C., & Pashler, H. (2009). Optimizing distributed practice: Theoretical analysis and practical implications. Experimental Psychology, 56, 236–246. doi:https://doi.org/10.1027/1618‐3169.56.4.236
  • Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132, 354–380. doi:https://doi.org/10.1037/0033‐2909.132.3.354
  • Chan, J. C. K., McDermott, K. B., & Roediger, H. L., III (2006). Retrieval induced facilitation: Initially nontested material can benefit from prior testing of related material. Journal of Experimental Psychology: General, 135, 553–571.
  • Connor‐Greene, P. A. (2005). Fostering meaningful classroom discussions: Student‐generated questions, quotations, and talking points. Teaching of Psychology, 32, 173–175.
  • Dement, W. E., & Vaughn, C. (2000). The promise of sleep. New York: Dell.
  • Dimmitt, C., & McCormick, C. B. (2012). Metacognition in education. In K. R. Harris, S. Graham, & T. Urdan (Eds.), APA educational psychology handbook: Vol 1. Theories, constructs, and critical issues (pp. 157–187). Washington, DC: American Psychological Association. doi:https://doi.org/10.1037/13273‐007
  • Dunn, D. S. (1994). Lessons learned from an interdisciplinary writing course: Implications for student writing in psychology. Teaching of Psychology, 21, 223–227.
  • Dunn, D. S. (2011). A short guide to writing about psychology (3rd ed.). New York: Pearson Longman.
  • Dunn, D. S., McEntarffer, R., & Halonen, J. S. (2004). Empowering psychology students through self‐assessment. In D. S. Dunn, C. M. Mehrotra, & J. S. Halonen (Eds.), Measuring up: Assessment challenges and practices for psychology (pp. 171–186). Washington, DC: American Psychological Association.
  • Ebbinghaus, H. (1885). Memory: A contribution to experimental psychology. (H. A. Ruger & C. H. Bussenius, Trans.) In C. A. Green (Ed.), Classics in the Teaching of Psychology. Retrieved from http://psychclassics.yorku.ca/Ebbinghaus/index.htm
  • Emurian, H. H., & Zheng, P. (2010). Programmed instruction and interteaching applications to teaching Java(TM): A systematic replication. Computers in Human Behavior, 26, 1166–1175.
  • Forsyth, D. R. (2003). The professor's guide to teaching: Psychological principles and practices. Washington DC: American Psychological Association.
  • Friedman, T. L. (2005). The world is flat: A brief history of the twenty‐first century. New York: Farrar, Straus and Giroux.
  • Goto, K., & Schneider, J. (2010). Learning through teaching: Challenges and opportunities in facilitating student learning in food science and nutrition by using the interteaching approach. Journal of Food Science Education, 9, 31–35.
  • Gurung, R. A. R., Weidert, J., & Jeske, A. S. (2010). A closer look at how students study (and if it matters). Journal of the Scholarship of Teaching and Learning, 10, 28–33.
  • Gurung, R. A. R., & McCann, L. I. (2012). How should students study? In B. M. Schwartz & R. A. R. Gurung (Eds.), Evidence‐based teaching for higher education (pp. 99–116). Washington, DC: American Psychological Association. doi:https://doi.org/10.1037/13745‐006
  • Halpern, D. F., Anton, B., Beins, B. C., Bernstein, D. J., Blair‐Broeker, C. T., Brewer, C. L., … Rocheleau, C. A. (2010). Principles for quality undergraduate education in psychology. In D. F. Halpern (Ed.), Undergraduate education in psychology: A blueprint for the future of the discipline (pp. 161–173). Washington, DC: American Psychological Association. doi:https://doi.org/10.1037/12063‐010
  • Halpern, D. F., & Hakel, M. D. (2003). Applying the science of learning to the university and beyond. Change, 35(4), 36–42.
  • Hartig, T., Evans, G. W., Jamner, L. D., Davis, D. S., & Gärling, T. (2003). Tracking restoration in natural and urban field settings. Journal of Environmental Psychology, 23(2), 109–123. doi:https://doi.org/10.1016/S0272‐4944(02)00109‐3
  • Hersh, R. H., & Merrow, J. (Eds.) (2005). Declining by degrees: Higher education at risk. New York: Palgrave Macmillan.
  • Kaplan, R., & Kaplan, S. (1989). The experience of nature: A psychological perspective. New York: Cambridge University Press.
  • Kaplan, S. (1995). The restorative benefits of nature: Toward an integrative framework. Journal of Environmental Psychology, 15, 169–182.
  • Karpicke, J. D. (2012). Retrieval‐based learning: Active retrieval promotes meaningful learning. Current Directions in Psychological Science, 21, 157–163. doi:https://doi.org/10.1177/0963721412443552
  • Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331, 772–775. doi:https://doi.org/10.1126/science.1199327
  • Karpicke, J. D., Butler, A. C., & Roediger, III, H. L. (2009). Metacognitive strategies in student learning: Do students practise retrieval when they study on their own? Memory, 17, 471–479. doi:https://doi.org/10.1080/09658210802647009
  • Korn, J. H., & Sikorski, J. (2010). A guide for beginning teachers of psychology. Retrieved from the Society for the Teaching of Psychology Web site: http://www.teachpsych.org/ebooks/guide2010/index.php
  • Krug, D., Davis, T., & Glover, J. A. (1990). Massed versus distributed repeated reading: A case of forgetting helping recall? Journal of Educational Psychology, 82, 366–371. doi:https://doi.org/10.1037/0022‐0663.82.2.366
  • Larsen, D. P., Butler, A. C., & Roediger, H. L. III (2009). Repeated testing improves long‐term retention relative to repeated study: A randomised controlled trial. Medical Education, 43, 1174–1181. doi:https://doi.org/10.1111/j.1365‐2923.2009.03518
  • Leeming, F. C. (2002). The exam‐a‐day procedure improves performance in psychology classes. Teaching of Psychology, 29, 210–212.
  • McDaniel, M. A., Howard, D. C., & Einstein, G. O. (2009). The read–recite–review study strategy: Effective and portable. Psychological Science, 20, 516–522.
  • Nevid, J. S., Pastva, A., & McClelland, N. (2012). Writing‐to‐learn assignments in introductory psychology: Is there a learning benefit? Teaching of Psychology, 39, 272–275. doi: https://doi.org/10.1177/009862831245662
  • Papadopoulos, P. M., Demetriadis, S. N., Stamelos, I. G., & Tsoukalas, I. A. (2011). The value of writing‐to‐learn when using question prompts to support web‐based learning in ill‐structured domains. Educational Technology Research and Development, 59, 71–90.
  • Pashler, H., Cepeda, N. J., Wixted, J. T., & Rohrer, D. (2005). When does feedback facilitate learning of words? Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 3–8. doi:https://doi.org/10.1037/0278‐7393.31.1.3
  • Rawson, K. A., & Kintsch, W. (2005). Rereading effects depend on time of test. Journal of Educational Psychology, 97, 70–80. doi:https://doi.org/10.1037/0022‐0663.97.1.70
  • Robinson, K. (2009). The element: How finding your passion changes everything. New York, NY: The Penguin Group.
  • Saville, B. K. (2010). Using evidence‐based teaching methods to improve education. In S. A. Meyers & J. R. Stowell (Eds.), Essays from e‐xcellence in teaching (Vol. 9, pp. 48–54). Published on the Society for the Teaching of Psychology Web site. Retrieved from http://www.teachpsych.org/resources/e‐books/eit2009/index.php
  • Saville, B. K., Lambert, T., & Robertson, S. (2011). Interteaching: Bringing behavioral education into the 21st century. Psychological Record, 61, 153–166.
  • Saville, B. K., Pope, D., Truelove, J., & Williams, J. (in press). The relation between GPA and exam performance during interteaching and lecture. Behavior Analyst Today.
  • Saville, B. K., & Zinn, T. E. (2009). Interteaching: The effects of quality points on exam scores. Journal of Applied Behavior Analysis, 42, 369–374. doi:https://doi.org/10.1901/jaba.2009.42‐369
  • Saville, B. K., Zinn, T. E., & Elliott, M. P. (2005). Interteaching versus traditional methods of instruction: A preliminary analysis. Teaching of Psychology, 32, 161–163.
  • Saville, B. K., Zinn, T. E., Lawrence, N. K., Barron, K. E., & Andre, J. (2008). Teaching critical thinking in statistics and research methods. In D. S. Dunn, J. S. Halonen, & R. A. Smith (Eds.), Teaching critical thinking in psychology: A handbook of best practices (pp. 149–160). Malden, MA: Wiley‐Blackwell.
  • Saville, B. K., Zinn, T. E., Neef, N. A., Van Norman, R., & Ferreri, S. J. (2006). A comparison of interteaching and lecture in the college classroom. Journal of Applied Behavior Analysis, 39, 49–61.
  • Schwartz, B. M., & Gurung, R. A. R. (Eds.) (2012). Evidence‐based teaching for higher education. Washington, DC: American Psychological Association.
  • Scoboria, A., & Pascual‐Leone, A. (2009). An ‘interteaching’ informed approach to instructing large undergraduate classes. Journal of the Scholarship of Teaching and Learning, 9, 29–37.
  • Sisti, H. M., Glass, A. L., & Shors, T. J. (2007). Neurogenesis and the spacing effect: Learning over time enhances memory and the survival of new neurons. Learning and Memory, 14, 368–375.
  • Skinner, B. F. (1954). The science of learning and the art of teaching. Harvard Educational Review, 24, 86–97. (Reprinted in Cumulative record, definitive edition, pp. 179‐191, 1999, Cambridge, MA: B. F. Skinner Foundation).
  • Squire, L., & Alvarez, P. (1995). Retrograde amnesia and memory consolidation: A neurobiological perspective. Current Opinion in Neurobiology, 5, 169–177.
  • Stewart, T. L., Myers, A. C., & Culley, M. R. (2010). Enhanced learning and retention through ‘writing to learn’ in the psychology classroom. Teaching of Psychology, 37, 46–49. doi: https://doi.org/10.1080/00986280903425813
  • Toppino, T. C., & Cohen, M. S. (2010). Metacognitive control and spaced practice: Clarifying what people do and why. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36, 1480–1491. doi:https://doi.org/10.1037/a0020949
  • Tsui, M. (2010). Interteaching: Students as teachers in lower‐division sociology courses. Teaching Sociology, 38, 28–34.
  • Verkoeijen, P. L., Rikers, R. P., & Özsoy, B. (2008). Distributed rereading can hurt the spacing effect in text memory. Applied Cognitive Psychology, 22, 685–695. doi:https://doi.org/10.1002/acp.1388
  • Vlach, H. A., & Sandhofer, C. M. (2012). Distributing learning over time: The spacing effect in children's acquisition and generalization of science concepts. Child Development, 83, 1137–1144. doi:https://doi.org/10.1111/j.1467‐8624.2012.01781.x
  • Vojdanoska, M., Cranney, J., & Newell, B. R. (2010). The testing effect: The role of feedback and collaboration in a tertiary classroom setting. Applied Cognitive Psychology, 24, 1183–1195. doi:https://doi.org/10.1002/acp.1630
  • Voltaire (1950). Philosophical dictionary (Vol. 1). New York: Carlton House. (Original work published 1764).
  • Wahlheim, C. N., Dunlosky, J., & Jacoby, L. L. (2011). Spacing enhances the learning of natural concepts: An investigation of mechanisms, metacognition, and aging. Memory and Cognition, 39, 750–763. doi:https://doi.org/10.3758/s13421‐010‐0063‐y
  • Walker, M. P., & Stickgold, R. (2004). Sleep dependent learning and memory consolidation. Neuron, 44, 121–133. doi: https://doi.org/10.1016/j.neuron.2004.08.031
  • Yandell, L. R., & Bailey, W. N. (2011). Online quizzes: Improving reading compliance and student learning. In D. S. Dunn, J. H. Wilson, J. E. Freeman, & J. R. Stowell (Eds.), Best practices for technology‐enhanced teaching and learning (pp. 271–282). New York: Oxford.
  • Zeller, B. E. (2010). ‘We learned so much when you weren't there!’: Reflections on the interteach method and the acephalous classroom. Teaching Theology and Religion, 13, 270–271.
