Abstract
Consistent visual and haptic feedback is an important way to improve the user experience when interacting with virtual objects. However, perception in Augmented Reality (AR) comes mainly from visual cues and amorphous tactile feedback. This work explores how to simulate positional vibrotactile feedback (PVF) with multiple vibration motors when colliding with virtual objects in AR. By attaching spatially distributed vibration motors to a physical haptic proxy, users can obtain an augmented collision experience with positional vibration sensations at the point of contact with virtual objects. We first developed a prototype system and conducted a user study to optimize the design parameters. We then investigated the effect of PVF on user performance and experience in a virtual-and-real object alignment task in an AR environment. We found that this approach significantly reduced the alignment offset between virtual and physical objects at the cost of a tolerable increase in task completion time. With the PVF cue, participants obtained a more comprehensive perception of the offset direction, more useful information, and a more authentic AR experience.
Disclosure statement
The authors declare no potential conflicts of interest.
Notes on contributors
Li Zhang
Li Zhang received the bachelor’s degree in Advanced Manufacturing Engineering from Northwestern Polytechnical University in 2015. He is currently pursuing the Ph.D. degree from the Cyber-Physical Interaction Lab, Northwestern Polytechnical University. His research interests include VR/AR, haptics, and human-computer interaction.
Weiping He
Weiping He received his B.Eng. degree in 1985, master's degree in 1988, and Ph.D. in 1998, all in advanced manufacturing, from Northwestern Polytechnical University. His research interests cover direct part marking, smart manufacturing, empathic computing, and multimodal interaction. He has spent decades working on advanced manufacturing and applying the technology to practical applications.
Zhiwei Cao
Zhiwei Cao received the B.Eng. degree in Advanced Manufacturing Engineering from Northwestern Polytechnical University in 2019. He is currently pursuing a master's degree at the Cyber-Physical Interaction Lab, Northwestern Polytechnical University. His research focuses on AR-based assembly and digital twins.
Shuxia Wang
Shuxia Wang received her Ph.D. degree in Mechanical Engineering and Automation from Northwestern Polytechnical University in 2008. She is currently a professor in the Department of Mechanical Engineering, Northwestern Polytechnical University. Her research interests include engineering graphics, computer-aided design, computer vision, and intelligent manufacturing.
Huidong Bai
Huidong Bai received his Ph.D. from the HIT Lab NZ in 2016, supervised by Prof. Mark Billinghurst and Prof. Ramakrishnan Mukundan. He is currently a Research Fellow at the Empathic Computing Laboratory, The University of Auckland. His research areas include remote collaborative Mixed Reality (MR) and empathic interfaces.
Mark Billinghurst
Mark Billinghurst received the Ph.D. degree in electrical engineering from the University of Washington under the supervision of Prof. T. Furness III and Prof. L. Shapiro. He is currently a Professor leading the Empathic Computing Laboratory at the Auckland Bioengineering Institute, The University of Auckland.