
HapticProxy: Providing Positional Vibrotactile Feedback on a Physical Proxy for Virtual-Real Interaction in Augmented Reality

Li Zhang, Weiping He, Zhiwei Cao, Shuxia Wang, Huidong Bai & Mark Billinghurst
Pages 449-463 | Received 05 Feb 2021, Accepted 10 Feb 2022, Published online: 18 Apr 2022

Abstract

Consistent visual and haptic feedback is an important way to improve the user experience when interacting with virtual objects. However, the perception provided in Augmented Reality (AR) mainly comes from visual cues and amorphous tactile feedback. This work explores how to simulate positional vibrotactile feedback (PVF) with multiple vibration motors when colliding with virtual objects in AR. By attaching spatially distributed vibration motors on a physical haptic proxy, users can obtain an augmented collision experience with positional vibration sensations from the contact point with virtual objects. We first developed a prototype system and conducted a user study to optimize the design parameters. Then we investigated the effect of PVF on user performance and experience in a virtual and real object alignment task in the AR environment. We found that this approach could significantly reduce the alignment offset between virtual and physical objects with tolerable task completion time increments. With the PVF cue, participants obtained a more comprehensive perception of the offset direction, more useful information, and a more authentic AR experience.
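For illustration only, the sketch below shows one simple way such a contact-to-motor mapping could be realized: each motor on the physical proxy is driven with an amplitude weighted by its inverse distance to the virtual contact point, so the motor nearest the contact vibrates strongest. The motor layout, falloff exponent, and function name are assumptions made for this sketch and are not taken from the paper (Python):

import numpy as np

def motor_amplitudes(contact_point, motor_positions, falloff=2.0, eps=1e-6):
    """Map a virtual contact point to per-motor vibration amplitudes.

    contact_point   : (3,) array, contact location on the proxy surface.
    motor_positions : (N, 3) array, positions of the N vibration motors.
    falloff         : exponent controlling how quickly amplitude decays
                      with distance from the contact point (assumed value).

    Returns an (N,) array of amplitudes in [0, 1], largest for the motor
    closest to the contact point.
    """
    d = np.linalg.norm(motor_positions - contact_point, axis=1)
    w = 1.0 / (d + eps) ** falloff   # inverse-distance weighting
    return w / w.max()               # normalize so the nearest motor runs at full strength

# Example: four motors on the corners of a 10 cm square proxy face,
# with the contact point near the first corner.
motors = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.1, 0.1, 0.0]])
print(motor_amplitudes(np.array([0.01, 0.02, 0.0]), motors))

In practice the resulting amplitudes would be sent to the motor drivers (e.g., as PWM duty cycles); the actual mapping and hardware interface used in the study may differ.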

Disclosure statement

The authors declared no potential conflict of interest.

Additional information

Funding

This work was partly supported by the National Key Research and Development Program of China [Grant No. 2019YFB1703800], the Natural Science Basic Research Program of Shaanxi Province [Grant No. 2016JM6054], and the Higher Education Discipline Innovation Project [Grant No. B13044].

Notes on contributors

Li Zhang

Li Zhang received the bachelor’s degree in Advanced Manufacturing Engineering from Northwestern Polytechnical University in 2015. He is currently pursuing the Ph.D. degree from the Cyber-Physical Interaction Lab, Northwestern Polytechnical University. His research interests include VR/AR, haptics, and human-computer interaction.

Weiping He

Weiping He received his B.Eng. degree in 1985, master's degree in 1988, and Ph.D. in 1998, all in advanced manufacturing from Northwestern Polytechnical University. His research interests cover direct part marking, smart manufacturing, empathic computing, and multimodal interaction. He has spent decades working on advanced manufacturing and applying these techniques to practical applications.

Zhiwei Cao

Zhiwei Cao received the B.Eng. degree in Advanced Manufacturing Engineering from Northwestern Polytechnical University in 2019. He is currently pursuing a master's degree in the Cyber-Physical Interaction Lab, Northwestern Polytechnical University. His research interests include AR-based assembly and digital twins.

Shuxia Wang

Shuxia Wang received her Ph.D. degree in Mechanical Engineering and Automation from Northwestern Polytechnical University in 2008. She is currently a professor in the Department of Mechanical Engineering, Northwestern Polytechnical University. Her research interests include engineering graphics, computer-aided design, computer vision, and intelligent manufacturing.

Huidong Bai

Huidong Bai received his Ph.D. from the HIT Lab NZ in 2016, supervised by Prof. Mark Billinghurst and Prof. Ramakrishnan Mukundan. He is currently a Research Fellow at the Empathic Computing Laboratory established within The University of Auckland. His research areas include remote collaborative Mixed Reality (MR) and empathic interfaces.

Mark Billinghurst

Mark Billinghurst received the Ph.D. degree in electrical engineering from the University of Washington under the supervision of Prof. T. Furness III and Prof. L. Shapiro. He is currently a Professor leading the Empathic Computing Laboratory, Auckland Bioengineering Institute, The University of Auckland.
