
Making insights from educational psychology and educational technology research more useful for practice


Abstract

Articles in this special issue on “Diverse Lenses on Improving Online Learning Theory, Research, and Practice” begin to address the gap between (1) research on psychological constructs that are too abstract to guide many instructional decisions and (2) empirically derived guidance that is quite concrete but limited in explanatory value and generalizability. Needed now is a multi-level framework for online learning that offers specific guidance for practitioners’ instructional decisions while also supporting a conceptual organization of accumulated research findings that fosters new insights and research questions. In this commentary, I describe a framework that would encompass multiple kinds of learning; different learning goals; discipline-specific ways of knowing and demonstrating knowledge; key technology features; and learner differences.

The five lenses on online learning in this issue—community, engagement, pedagogy, equity, and design-based research—reveal the diversity and complexity of online teaching and learning as practiced today (Greenhow et al., Citation2022/this issue). These articles address the fundamental challenge of integrating conceptually driven psychological research and empirically driven educational technology research. In this commentary, I reflect on these contributions and inquire into how the field might build on this work to help educators and instructional designers improve the quality, equity, and effectiveness of online learning.

The crux of this challenge is moving beyond the division of the scholarly literature into (1) broad psychology-based theoretical constructs that are too abstract to guide many practical decisions and (2) empirically derived guidance that is connected to messy reality but has less explanatory value and generalizability than the field needs. The authors in this special issue sought to address the current gap between research and practice to enable educators and design-researchers to develop more engaging, equitable, and effective online learning experiences for a wider range of students (Greenhow et al., Citation2022/this issue). Building on these ideas, I propose a multi-level framework linking theoretical constructs to specific practices and empirical findings described at a level of detail suitable for informing practice.

Gap between theory and practice

Educational and cognitive psychology research addresses abstract psychological constructs that are not directly observable. Researchers design studies and measures with care to elicit evidence of the operation of these psychological processes, and experimental manipulations of single variables under controlled conditions have been important for identifying principles of learning (Mandler, Citation2011). Educational technology research, by contrast, has been conducted mostly in situ, in the variable and complicated world of courses and classrooms. Educational technology researchers look to the psychology research literature for inspiration, but typically do not try to isolate the effects of individual online teaching practices or design features. Rather, they have adopted a design approach (see Hoadley & Campos, Citation2022/this issue), manipulating many aspects of practice and design simultaneously in search of a combination that improves online learning engagement, equity, or effectiveness. Such studies may reveal the efficacy and effectiveness of a complex intervention for the learners and contexts where it is tested, but they do not support attributing outcomes to individual underlying psychological mechanisms (Means et al., Citation2017).

Martin and Borup (Citation2022/this issue) made a similar argument, emphasizing that educational technology researchers tend to frame their studies of student interactions online in behavioral terms (i.e., describing interactions with technology, with peers, and with the instructor) whereas educational psychologists typically conceptualize engagement in more motivational terms (i.e., with cognitive, affective, and behavioral dimensions). Martin and Borup observed further that the field of online learning has lacked an explicit mapping of learner and teacher behaviors and specific affordances of digital technologies to a conceptual framing of psychological constructs in the cognitive and motivational realms. Such a mapping will be essential to making progress. Without it, teachers, curriculum developers, and instructional designers face a choice between two daunting tasks—either (1) try to draw inferences for how to teach particular content and students in their local context from general abstract statements about psychological constructs or general principles of learning or (2) try to locate and decipher often inconsistent empirical findings from studies conducted in different settings, using different combinations of practices, and different design elements.

Existing online learning frameworks

Three articles in this issue (Archambault et al., Citation2022/this issue; Martin & Borup, Citation2022/this issue; Shea et al., Citation2022/this issue) discussed existing frameworks for online instruction, highlighting their connections to foundational learning research and theory, and offering an entry point for dialogue among educational psychologists, learning scientists, and educational technology researchers. These online learning frameworks provide an orientation to what quality online learning activities look like, advancing the field toward a more complex, systems view.

In the first of these three articles, Shea et al. (Citation2022/this issue) provided a critical review of one of the most widely cited frameworks, the Community of Inquiry (COI) model developed by Garrison et al. (Citation1999). This framing, consisting of social, cognitive, and teaching “presences” online, has inspired much of the professional development and academic writing on online learning, particularly within higher education (Valverde-Berrocoso et al., Citation2020). Shea et al. explicated the roots of the COI model in the psychological construct of the collaborative construction of knowledge (Scardamalia & Bereiter, Citation2014) and the model’s original purpose of guiding practice within text-based, asynchronous online courses. The COI model has sensitized instructors to the need to attend to motivational and social aspects of learning and has catalyzed the design and testing of different approaches to instantiating the three presences. Shea et al. pointed to research documenting relationships between measures of students’ perception of the three presences in the COI model and both students’ satisfaction with their course and their perception of the quality of their learning in it. However, as Shea et al. noted, there is a dearth of evidence relating student perceptions of the three COI presences to objective measures of course learning. Shea et al. (Citation2022/this issue) cited another limitation of the COI model—its lack of “explanatory mechanisms of explicit cognitive or socio-cultural processes grounded in contemporary learning science” (p. 157). These authors suggested that the COI framework could be expanded to incorporate contemporary understanding of learning sciences constructs. I would argue, however, that the model fundamentally reflects the emergent collaborative environments that gave rise to it and is more suited to post hoc explanation of observed differences in engagement than to serving as an all-purpose guiding framework for online teaching and learning. Moreover, retrofitting an old model with a new set of considerations invites force-fitting and overcomplication (Gierl & Cui, Citation2008). Building a framework explicitly for the purpose of guiding online teaching and learning design, implementation, and research should be a priority. Later, I suggest qualities that a new framework should embody.

Martin and Borup (Citation2022/this issue) offered the Academic Communities of Engagement (ACE) framework as an alternative to the COI model. The ACE framework highlights cognitive, affective, and behavioral engagement in a course and brings in psychological concepts, such as self-regulation skills, transactional distance, social presence, and off-task behavior. In addition to having a stronger connection to the psychological research and reflecting the diversity of today’s online learning options, the ACE framework is notable for highlighting the role of a student’s personal community as well as the course community in supporting engagement in an online course. ACE’s treatment of learning ecosystems (i.e., personal and course communities) is more encompassing than that of COI, and the ACE framework provides a useful perspective for investigating equity issues in online and blended learning, such as those described by Tate and Warschauer (Citation2022/this issue). Tate and Warschauer went beyond documenting differences in access to broadband internet and computing devices to describe subtler equity issues such as those stemming from differences in the personal communities learners can tap to help them prepare for and participate in online learning (e.g., older siblings or friends who can help troubleshoot software problems).

Archambault et al. (Citation2022/this issue) offered yet another framework for online learning using a pedagogy lens and proposing five “pillars” of online pedagogy: (1) build relationships and community, (2) incorporate active learning, (3) leverage learner agency, (4) embrace mastery learning, and (5) personalize the learning process. This conceptual framework connects more overtly to the concerns of teachers and instructional designers because it is framed as a prescription rather than a description of processes that occur online. The five pillars have roots in multiple learning theories. The building relationships and community pillar echoes the COI and ACE frameworks, described earlier, which derive from constructivist and sociocultural perspectives. In contrast, the mastery and personalized learning pillars have roots in theory and research on individual cognition (Anderson, Citation1983). The teacher or instructional designer seeking to implement the mastery and personalized learning pillars, which by definition have learners working at different paces and on different content independently, would need to balance implementing these pillars with building relationships and community (the first pillar) through group-based activities. This is not to say these different approaches cannot or should not be combined, but rather that the pillars themselves provide little or no guidance about when, why, or how to combine them.

Practitioners must make specific decisions, such as whether to allocate time for students to interact with a simulation or whether to put students in online breakout rooms for groupwork. Encouragement to apply one of the abstract dimensions in existing online learning models, such as “active learning” (Archambault et al., Citation2022/this issue, p. 180), may be helpful in moving instructors away from lecturing online for 50 minutes straight, but it does not give them adequate guidance about alternatives to lecturing or about how to create coherence among the learning activities in a course. Linking broadly applicable learning principles to concrete descriptions of course design and instructor actions is a first step. Tate and Warschauer (Citation2022/this issue) offered examples of such linkages, describing studies that demonstrated the positive effects of scaffolding students’ self-regulation, operationalized as giving students the opportunity to schedule their lecture watching in advance (Baker et al., Citation2019), and of increasing student-instructor interaction online in the form of optional in-person meeting hours and frequent instructor emails (Cung et al., Citation2018). Linking psychology constructs to concrete descriptions of practice helps instructional designers and practitioners fill in the kinds of details instructors must attend to. Existing online learning frameworks are helpful in building awareness of important aspects of online teaching and learning; the next step is evolving toward frameworks that include a more concrete layer of specific guidance backed by empirical findings.

Moving beyond current online learning frameworks

Existing online learning frameworks focus on knowledge construction, which enables reasoning in new situations and with new content. Knowledge construction is central to acquiring expertise in many subject domains (e.g., math, science, social science), but other kinds of learning are also important in schooling and in life (National Academies of Sciences et al., Citation2018, chapter 3). Perceptual, procedural, and information learning are key instructional objectives in many subject areas. Acquiring the vocabulary of a new language requires information learning; becoming fluent in recognizing an abnormal electrocardiogram requires perceptual learning. Knowledge construction is fostered by explaining one’s ideas in dialogue with others (Chi & Wylie, Citation2014), but these other kinds of learning are better supported through different experiences, such as deliberate individual practice (Ericsson, Citation1996). Current digital learning systems are extremely well-suited to providing some of these conditions—for example, systematic practice regimens with numerous examples—but not necessarily to providing other conditions, such as interactions with peers. Another important type of learning involves metacognition and self-regulation skills, which depend on cognitive, affective, and behavioral engagement (Martin & Borup, Citation2022/this issue). Ideally, frameworks for online pedagogy would either describe how to promote these different kinds of learning or be explicit about the limits of their applicability.

The online learning frameworks described in this issue are domain neutral. They do not incorporate discipline-specific aspects of inquiry, discourse, and evidence (Brown, Citation1990; Goldman et al., Citation2018). Research on teacher professional development has found that it is more effective when tied to a particular discipline (Darling-Hammond et al., Citation2017), and it is reasonable to conjecture that online learning frameworks incorporating a disciplinary dimension would exert more influence on teacher practice than frameworks that are domain neutral.

The kind of framework that would help integrate educational psychology and educational technology research would have several levels, starting with a top level of general principles for supporting different kinds of learning and then linking those principles to more elaborated levels with specific practices and related empirical findings. The organizational scheme developed by Ken Koedinger and colleagues (Citation2013) illustrates some elements of this approach. These learning scientists presented sets of learning principles that apply to different kinds of learning and outcomes. For each principle, they also provided educational technology research findings related to that principle. Examples of principles and “typical effects” cited by Koedinger et al. (Citation2013, p. 936) are:

  • Memory for information can be supported by spacing practice across time (rather than providing it all at once).

  • More accurate applications of perceptual or conceptual categories are facilitated by practice involving varied instances rather than similar instances.

  • Understanding and sense making can be promoted by presenting real-world problems rather than abstract problems.

Koedinger and colleagues (Citation2012) have linked their typical effects to particular studies and noted subject matter domain differences in the effects of applying a principle-based technique (e.g., incorporating student collaboration improved learning of the pressure concept but not learning of the rules of algebraic equations). My point here is not that the Koedinger et al. framework is the “correct” one (and certainly it is more oriented around mastery and personalized learning than sociocultural aspects of knowledge construction), but rather that it has some properties that would be desirable in the next generation of online learning frameworks: consideration of the type of learning involved; organization of the research literature by the type of outcome addressed; and specification of an instructional practice and its typical effects at a concrete enough level to serve as a template for teachers and designers of online courses. One could go further to map specific research studies to the typical findings to give online learning practitioners and designers the option of drilling down to ascertain how an instructional practice has been operationalized in the research.

A framework incorporating specific, concrete instructional practices along with their conditions of applicability would enable tagging studies so practitioners and instructional designers could easily locate research relevant to their instructional decisions. A useful framework also would allow multiple entry points into the research literature (e.g., exploring findings for middle school or other specific contexts or looking for different practices to stimulate active learning) and views at different levels of granularity (e.g., allowing views organized by general learning principles or by specific practices) to help users see trends and make sense of the complexity of research findings without losing access to their details. The top level of a hierarchically organized framework would consist of statements of learning principles along with tags for the kinds of learning and learning outcomes to which they are applicable. For each principle, users would be able to drill down to greater levels of detail with examples of instructional practices designed to embody the principle and specific empirical findings with respect to those practices. In this way, principles from educational psychology and learning sciences research and empirical findings for specific instructional interventions in educational technology research would be brought together in a manner that invites inspection, revision, and application.
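
To make this organization concrete, the sketch below shows one way such a multi-level structure could be represented in software: principles tagged by kind of learning, linked to concrete practices, which are in turn linked to empirical findings tagged by context. It is a minimal illustration under my own assumptions; the class names, tags, and the single example entry are hypothetical and are not drawn from any existing database or from the frameworks discussed above.

```python
# Hypothetical sketch of a principles -> practices -> findings hierarchy
# with tags supporting multiple entry points and drill-down queries.
from dataclasses import dataclass, field


@dataclass
class Finding:
    study: str          # citation for the empirical study
    context: set[str]   # context tags, e.g., {"middle school", "math"}
    result: str         # plain-language summary of the observed effect


@dataclass
class Practice:
    description: str    # concrete, implementable description of the practice
    findings: list[Finding] = field(default_factory=list)


@dataclass
class Principle:
    statement: str              # top-level learning principle
    learning_kinds: set[str]    # e.g., {"information", "perceptual"}
    practices: list[Practice] = field(default_factory=list)


def relevant_principles(framework: list[Principle], *, kind: str, context_tag: str):
    """One practitioner entry point: drill down from a kind of learning and a
    context tag to matching principles, practices, and supporting findings."""
    for principle in framework:
        if kind not in principle.learning_kinds:
            continue
        for practice in principle.practices:
            matches = [f for f in practice.findings if context_tag in f.context]
            if matches:
                yield principle.statement, practice.description, matches


# Illustrative content echoing one of the Koedinger et al. "typical effects."
framework = [
    Principle(
        statement="Memory for information is supported by spacing practice over time.",
        learning_kinds={"information"},
        practices=[
            Practice(
                description="Schedule vocabulary review sessions across several weeks.",
                findings=[Finding(study="(hypothetical study)",
                                  context={"middle school", "language"},
                                  result="Spaced review improved retention.")],
            )
        ],
    )
]

for statement, practice, findings in relevant_principles(
        framework, kind="information", context_tag="middle school"):
    print(statement, "->", practice, "->", [f.result for f in findings])
```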

The need for a principles-to-practices-to-evidence framework that supports knowledge management is pressing now, when educational technologists and others can perform rapid-cycle experiments within online learning systems using system log data (Krumm et al., Citation2018). Developers of learning platforms are using log data from thousands of learning system users to test the effects of different instructional practices with different kinds of learners, content, and conditions. Using a common nomenclature for practices, learners, and learning contexts, to the extent possible, would enhance the utility of such a framework (Ferber et al., Citation2019). To benefit from the burgeoning set of empirical findings on learning with digital systems, educational psychology researchers and practitioners alike need a framework that relates research findings not just to high-level general principles but also to specific practices and conditions. Such a framework would incorporate dimensions of social engagement, as highlighted in the COI and ACE frameworks, as well as findings around the mastery learning and personalization pillars described by Archambault et al. (Citation2022/this issue).
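
As a toy illustration of what such a rapid-cycle comparison on log data might look like, the sketch below contrasts mastery rates under two hypothetical practice conditions using a standard two-proportion z-test. The log format and all numbers are invented for illustration; a real platform experiment would also require randomization checks, covariate adjustment, and safeguards against multiple comparisons.

```python
# Minimal sketch of a rapid-cycle comparison drawn from (hypothetical)
# learning platform logs: did spaced practice beat massed practice?
import math

# Hypothetical per-condition log summaries: learners exposed to each
# practice and the number who mastered the target skill.
logs = {
    "massed_practice": {"n": 1200, "successes": 708},
    "spaced_practice": {"n": 1180, "successes": 779},
}

n1, n2 = logs["massed_practice"]["n"], logs["spaced_practice"]["n"]
p1 = logs["massed_practice"]["successes"] / n1
p2 = logs["spaced_practice"]["successes"] / n2

# Two-proportion z-test on mastery rates, using the pooled proportion.
pooled = (logs["massed_practice"]["successes"]
          + logs["spaced_practice"]["successes"]) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se

print(f"mastery rates: {p1:.3f} vs {p2:.3f}, difference z = {z:.2f}")
```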

Implications of complex systems for research approaches

The multi-level framework described above could help integrate educational psychology and educational technology research but would be unlikely to bridge the gulf between research and practice by itself. That bridging will require involving practitioners in producing new online learning approaches and knowledge about their effectiveness for particular outcomes, kinds of students, and settings. Hoadley and Campos (Citation2022/this issue) argued for the necessity of such collaborations and described how they can be realized through design-based research. These authors asserted that the possible combinations of learning materials, instructor actions, and students within learning activity systems are so numerous and varied that system outcomes are not reliably predictable in advance. An implication of this complexity for policymakers is that they should refrain from mandating presumed “best” instructional practices for online learning. There are multiple ways to offer effective and equitable learning experiences, and outcomes vary too much across approaches assumed to be equivalent to justify blanket prescriptions. Hoadley and Campos proposed that rather than relying on “best practices,” instructional technology designers and educators should join in a creative partnership with researchers and become co-producers, not just consumers, of research knowledge. They made the case for design-based research as an approach to generating findings and improving instruction concurrently through iterations of designing, implementing, and testing different learning activity systems (see also Jackson, Citation2022).

An essential element of the approach described by Hoadley and Campos (Citation2022/this issue) is the design pattern—a “known partial solution to a category of problems” (p. 210). Design-based research results in design patterns and tentative prescriptions about how to apply them in a specified range of contexts, while maintaining the expectation that designs will always need to be adapted to specific circumstances. From the designer’s or practitioner’s standpoint, the typical effects identified by Koedinger et al. imply design patterns for instruction. Presenting knowledge derived from research in online learning as a set of design patterns could provide a common language accessible to instructional designers and teachers while also making connections to fundamental learning principles. Further, Hoadley and Campos suggested that designers encountering new combinations of constraints may combine design patterns in new ways or develop entirely new patterns, thereby adding to the knowledge base. This perspective resonates with the current movement toward “evidence-based practice” that integrates individual teaching and learning expertise with “the best available external evidence from systematic research” (Davies, Citation1999, p. 117).
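
One could imagine recording design patterns in the structured form sketched below, with explicit conditions of applicability and linked evidence, so that patterns accumulate in the same knowledge base as principles and findings. The field names and the example entry are my own hypothetical illustration; the Cung et al. citation is used here purely because that intervention was described earlier in this commentary.

```python
# Hypothetical record for a design pattern in the Hoadley and Campos
# sense: a known partial solution to a category of problems, with
# applicability conditions and linked evidence. Illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class DesignPattern:
    problem_category: str           # the class of instructional problems addressed
    partial_solution: str           # the reusable core of the design
    applicability: tuple[str, ...]  # contexts where the pattern is expected to hold
    adaptations_needed: str         # what must be tailored to local circumstances
    evidence: tuple[str, ...]       # citations or findings supporting the pattern


pattern = DesignPattern(
    problem_category="Low interaction in asynchronous online courses",
    partial_solution="Frequent instructor emails plus optional live meeting hours",
    applicability=("higher education", "introductory courses"),
    adaptations_needed="Message cadence and meeting format tuned to course size",
    evidence=("Cung et al., 2018",),
)
print(pattern.problem_category, "->", pattern.partial_solution)
```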

Conclusion

A major challenge for efforts to improve online instruction through design-based research is how to implement such labor-intensive activities on a broad scale. Leveraging educational psychology research to improve the quality of online teaching and learning requires new organizational and career models that embed design research and continuous improvement activities into schools and expectations for teachers—transformational changes that would require the support of policymakers. The ideal is a system in which educators have incentives and supports for engaging in the process of data-informed reflection on their practice without fear of external sanctions when they uncover areas where their teaching could be improved (Penuel, Citation2019). Policymakers, including district superintendents and higher education leaders, should consider setting up systems that provide educators with the know-how, tools, and supports for examining student engagement and learning outcomes in their online classes on an ongoing basis. Ideally, participating in design-based research activities would become an organizational norm for schools and colleges, with teachers and faculty engaging with research findings, sometimes in collaboration with external researchers, and adding to the body of online learning findings and design guidance themselves.

The articles in this issue provide a foundation for such educational improvement efforts. They make a strong case that the online environment needs to be studied and understood as having affordances and limitations distinct from those of classroom-based teaching and learning (Greenhow et al., Citation2022/this issue). Both Shea et al. (Citation2022/this issue) and Martin and Borup (Citation2022/this issue) reviewed research on the importance of social engagement as a motivator and support for learning online. In a similar vein, Tate and Warschauer (Citation2022/this issue) reviewed research on interventions designed to heighten interaction and student self-regulation when learning online and suggested that the practices used in these interventions may improve online learning outcomes for student groups that have done less well learning online in the past. Archambault et al. (Citation2022/this issue) explicated ways in which online teaching calls on new instructional skills and often places the teacher in the role of instructional designer. The role of the online instructor was expanded further in the design research model of Hoadley and Campos (Citation2022/this issue), which calls for collaborations of researchers and practitioners engaged in exploring and testing new online learning designs and approaches.

Such work can be supported by building on the foundations described in this issue to develop a multi-level framework to support knowledge management. Such a framework would link abstract principles from learning research to specific concrete practices and associated empirical findings discoverable by practitioners, designers, and policymakers as well as researchers. Both research and practice would benefit from an organizational structure for research findings that recognizes multiple kinds of learning; different learning goals (e.g., retention versus application or transfer); discipline-specific ways of knowing and demonstrating one’s knowledge; key technology features; and learner differences.

References

  • Anderson, J. R. (1983). The architecture of cognition. Harvard University Press.
  • Archambault, L., Leary, H., & Rice, K. (2022/this issue). Pillars of online pedagogy: A framework for teaching in online learning environments. Educational Psychologist, 57(3), 178–191. https://doi.org/10.1080/00461520.2022.2051513
  • Baker, R., Evans, B., Li, Q., & Cung, B. (2019). Does inducing students to schedule lecture watching in online classes improve their academic performance? An experimental analysis of a time management intervention. Research in Higher Education, 60(4), 521–552. https://doi.org/10.1007/s11162-018-9521-3
  • Brown, A. L. (1990). Domain-specific principles affect learning and transfer in children. Cognitive Science, 14(1), 107–133. https://doi.org/10.1207/s15516709cog1401_6
  • Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243. https://doi.org/10.1080/00461520.2014.965823
  • Cung, B., Xu, D., & Eichhorn, S. (2018). Increasing interpersonal interactions in an online course: Does increased instructor email activity and voluntary meeting time in a physical classroom facilitate student learning? Online Learning, 22(3), 193–215. https://doi.org/10.24059/olj.v22i3.1322
  • Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional development. Learning Policy Institute. https://doi.org/10.54300/122.311
  • Davies, P. (1999). What is evidence-based teaching? British Journal of Educational Studies, 47(2), 108–121. https://doi.org/10.1111/1467-8527.00106
  • Ericsson, K. A. (1996). The acquisition of expert performance: An introduction to some of the issues. In K. A. Ericsson (Ed.), The road to excellence: The acquisition of expert performance in the arts and sciences, sports, and games (pp. 1–50). Erlbaum.
  • Ferber, T., Wiggins, M. E., & Sileo, A. (2019). Advancing the use of core components of effective programs. Forum for Youth Investment. https://forumfyi.org/knowledgecenter/advancing-core-components/
  • Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105. https://doi.org/10.1016/S1096-7516(00)00016-6
  • Gierl, M. J., & Cui, Y. (2008). Defining characteristics of diagnostic classification models and the problem of retrofitting in cognitive diagnostic assessment. Measurement: Interdisciplinary Research & Perspectives, 6(4), 263–268. https://doi.org/10.1080/15366360802497762
  • Goldman, S., Ko, M.-L. M., Greenleaf, C., & Brown, W. (2018). Domain-specificity in the practices of explanation, modeling, and argument in the sciences. Routledge. https://doi.org/10.4324/9780203731826
  • Greenhow, C., Graham, C. R., & Koehler, M. J. (2022/this issue). Foundations of online learning: Challenges and opportunities. Educational Psychologist, 57(3), 131–147. https://doi.org/10.1080/00461520.2022.2090364
  • Hoadley, C., & Campos, F. C. (2022/this issue). Design-based research: What it is and why it matters to studying online learning. Educational Psychologist, 57(3), 207–220. https://doi.org/10.1080/00461520.2022.2079128
  • Jackson, C. (2022). Democratizing the development of evidence. Educational Researcher, 51(3), 209–215. https://doi.org/10.3102/0013189X211060357
  • Koedinger, K. R., Booth, J. L., & Klahr, D. (2013). Instructional complexity and the science to constrain it. Science, 342(6161), 935–937. https://doi.org/10.1126/science.1238056
  • Koedinger, K. R., Corbett, A. T., & Perfetti, C. (2012). The Knowledge-Learning-Instruction Framework: Bridging the science-practice chasm to enhance robust student learning. Cognitive Science, 36(5), 757–798. https://doi.org/10.1111/j.1551-6709.2012.01245.x
  • Krumm, A., Means, B., & Bienkowski, M. (2018). Learning analytics goes to school. Routledge. https://doi.org/10.4324/9781315650722
  • Mandler, G. (2011). A history of modern experimental psychology: From James and Wundt to cognitive science. MIT Press.
  • Martin, F., & Borup, J. (2022/this issue). Online learner engagement: Conceptual definition, research themes, and supportive practices. Educational Psychologist, 57(3), 162–177. https://doi.org/10.1080/00461520.2022.2089147
  • Means, B., Shear, L., & Murphy, R. (2017). Understand, implement, & evaluate. Pearson-SRI series on building efficacy in learning technologies (Vol. 1). Pearson.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. National Academies Press.
  • Penuel, W. R. (2019). Co-design as infrastructuring with attention to power: Building collective capacity for equitable teaching and learning through design-based implementation research. In J. Pieters, J. Voogt, & N. Pareja Roblin (Eds.), Collaborative curriculum design for sustainable innovation and teacher learning. Springer. https://doi.org/10.1007/978-3-030-20062-6_21
  • Scardamalia, M., & Bereiter, C. (2014). Knowledge building and knowledge creation: Theory, pedagogy, and technology. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 397–417). Cambridge University Press.
  • Shea, P., Richardson, J., & Swan, K. (2022/this issue). Building bridges to advance the community of inquiry framework for online learning. Educational Psychologist, 57(3), 148–161. https://doi.org/10.1080/00461520.2022.2089989
  • Tate, T., & Warschauer, M. (2022/this issue). Equity in online learning. Educational Psychologist, 57(3), 192–206. https://doi.org/10.1080/00461520.2022.2062597
  • Valverde-Berrocoso, J., Garrido-Arroyo, M., Burgos-Videla, C., & Morales-Cevallos, M. B. (2020). Trends in educational research about e-learning: A systematic literature review (2009–2018). Sustainability, 12(12), 5153. https://doi.org/10.3390/su12125153