Abstract
Social learning, simply defined as learning from others, is valuable as a modality that provides quick, informal education. Augmented reality (AR) may provide a framework for human-machine teaming paradigms that integrate both virtual Pedagogical Agents as Learning Companions (PALs) and human learning collaborators. This article details the results of three collaborative AR experiments exploring social learning with PALs and humans. Our use case focuses on medical school students learning how to interview a patient with stroke symptoms. Despite noted challenges with rapidly advancing technology, specifically natural language processing (NLP), the research produced many significant results in self-efficacy and in conceptual and procedural learning. Findings are presented along with a way-ahead perspective on key focus areas to advance human-machine teaming in collaborative AR for learning.
Ethical approval
This study was reviewed and approved by the IRB at the University of Texas at Dallas—IRB-22-136: Social Learning in Augmented Reality. Participant consent was obtained.
Authors' contributions
Marjorie Zielke, Djakhangir Zakhidov, and Tiffany Lo wrote the main manuscript. Scotty D. Craig led the learning science analysis and provided general critique. Nolan Kuo contributed technical expertise and writing to the manuscript. Robert Rege, Hunter Pyle, and Nina Velasco Meer provided subject matter expertise, interpretation of results, and future research directions. All authors reviewed the manuscript.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Data availability statement
The data set associated with this research is available from the corresponding author upon request.
Additional information
Funding
Notes on contributors
Marjorie A. Zielke
Marjorie A. Zielke is the director and founder of the Center for Simulations and Synthetic Humans at the University of Texas at Dallas. Zielke’s multidisciplinary research focuses on new teaching frameworks that consist of immersive learning environments, AI-driven conversational virtual humans, and novel types of interactivity.
Djakhangir Zakhidov
Djakhangir Zakhidov is the associate director of the Center for Simulations and Synthetic Humans at the University of Texas at Dallas. Zakhidov’s research explores human-computer teaming paradigms that optimize educational objectives and provide access to new types of learning with virtual and real collaborators in mixed-reality environments.
Tiffany Lo
Tiffany Lo is a research analyst at the Center for Simulation and Synthetic Humans at the University of Texas at Dallas. She has expertise in empirical research, statistical analysis, and exploratory data analysis. Lo’s research interests encompass augmented and virtual reality, social learning, and medical technology.
Scotty D. Craig
Scotty D. Craig is an Associate Professor of Human Systems Engineering and Director of Research & Evaluation for the Learning Engineering Institute at Arizona State University. He has expertise in cognitive psychology, usability, and learning science and has published widely in the intersecting areas of Psychology, Education, and Technology.
Robert Rege
Robert Rege is the associate dean of undergraduate and continuing medical education at the University of Texas Southwestern Medical School. Rege’s research focuses on utilizing advanced clinical simulation methods, such as robotic surgery trainers and conversational virtual patients, for medical education.
Hunter Pyle
Hunter Pyle is a resident physician at the University of Texas Southwestern Medical Center interested in the use of simulation in medical education. He has a special interest in the use of virtual technologies to increase pre-clerkship medical student exposure to clinical medicine.
Nina Velasco Meer
Nina Velasco Meer is a third-year medical student at the University of Texas Southwestern Medical School. Meer is interested in the integration of artificial intelligence into current teaching models for medical students.
Nolan Kuo
Nolan Kuo is a software developer in the Center for Simulation and Synthetic Humans at the University of Texas at Dallas. His research interests include virtual reality, human-computer interaction, and computer graphics. Kuo is deeply experienced in the design of virtual and augmented reality interfaces.