Abstract
Gamified and immersive educational interventions have proven to be powerful motivators for encouraging participation in unattractive activities, and human–computer interaction (HCI) is crucial to the successful development of such interventions. However, existing interaction methods suffer from either visuo-tactile inconsistency or unintuitive interaction, resulting in an unenjoyable user experience. Hence, in this article, a visuo-tactile interaction design based on Norman’s design principles (visibility, feedback, constraint, consistency, affordance, and mapping) and spatial augmented reality (SAR) for intuitive cultural education is proposed for the first time, and an application for Chinese chess, SARChess, is implemented as a case study. Players interact with tangible chess pieces rather than through a digital interface, and the visual output and tangible input are integrated seamlessly to achieve a high level of visuo-tactile consistency. Furthermore, a mixed investigation method based on a pretest–posttest design, subjective questionnaires, and theme-based interviews was designed and performed to evaluate the user experience, learning performance, and player motivation of SARChess from both objective and subjective perspectives. The results show that SARChess yields significantly better user experience, learning performance, and player motivation than conventional chess and mouse-interaction-based chess, and that the unique gameplay and visual cues in SARChess make it more engaging to participants. Additionally, this study identifies and summarizes design implications for applying design principles to traditional cultural education applications.
Disclosure statement
We declare that we have no conflict of interest. The results, data, and figures in this manuscript have not been published elsewhere, nor are they under consideration by another publisher. We have read the journal policies on author responsibilities and submitted this manuscript in accordance with those policies. All of the material is owned by the authors, and no permissions are required.
Additional information
Funding
Notes on contributors
Qingshu Yuan
Qingshu Yuan received his BEng and PhD degrees in computer science and technology from Zhejiang University in 2002 and 2009, respectively. He is currently an assistant professor at the School of Information Science and Technology, Hangzhou Normal University. His research interests include virtual reality, augmented reality, and human–computer interaction.
Keming Chen
Keming Chen received his BEng degree in software engineering from Hangzhou Normal University, Hangzhou, China, in 2022. He is currently a master’s student at the School of Information Science and Technology, Hangzhou Normal University, Hangzhou, China. His research interests include virtual reality, augmented reality, and human–computer interaction.
Qihao Yang
Qihao Yang is currently an undergraduate at the School of Information Science and Technology, Hangzhou Normal University, Hangzhou, China. His research interests include virtual reality, augmented reality, and human–computer interaction.
Zhigeng Pan
Zhigeng Pan received his PhD degree in computer science and technology from Zhejiang University in 1993. He is currently a professor at the School of Artificial Intelligence, Nanjing University of Information Science and Technology. His research interests include virtual reality, augmented reality, and human–computer interaction.
Jin Xu
Jin Xu received her BEng degree in computer software from Northeast Normal University in 2013 and her PhD degree in computer science and technology from Zhejiang University in 2018. She is currently an assistant professor at the Alibaba Business School, Hangzhou Normal University. Her research interests include information visualization.
Zhengwei Yao
Zhengwei Yao received his PhD degree in computer science and technology from Shanghai University in 2010. He is currently an associate professor at the School of Information Science and Technology, Hangzhou Normal University. His research interests include augmented reality and human–computer interaction.