Original Articles

Wayfinding of Users With Visual Impairments in Haptically Enhanced Virtual Environments

Chang S. Nam, Mincheol Whang, Shijing Liu & Matthew Moore
Abstract

As a powerful interaction technology, haptically enhanced virtual environments (VEs) have found many useful applications. However, few studies have examined how the wayfinding of users with visual impairments is affected by VE characteristics. An empirical experiment was conducted to investigate how different environmental characteristics (number of objects inside the environment, layout of the objects, and density) affect task performance (completion time, completion ratio, and travel distance), perceived task difficulty, and behavior patterns (short and long pauses) of users with visual impairments when they perform a wayfinding task in a desktop-based haptically enhanced VE. The present study found that the number of objects inside the environment and the layout of the objects play a significant role in determining the completion time and distance traveled. Layout type also greatly affected users' behavioral patterns in terms of frequency of pauses. Finally, perceived task difficulty varied with different environmental characteristics. The study results should provide insight into future research and development of haptically enhanced VEs for people with visual impairments.

Additional information

Funding

The research described here has been supported by the National Science Foundation (NSF) under Grant Numbers DRL-0953772 and DRL-1203450. This work was also supported in part by the National Research Foundation of Korea (Global Frontier R&D Program on “Human-centered Interaction for Coexistence,” 2014-0029756). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF or the National Research Foundation of Korea.

Notes on contributors

Chang S. Nam

Chang S. Nam is currently an associate professor in the Edward P. Fitts Department of Industrial and Systems Engineering at North Carolina State University. He is also an associate faculty member in the UNC/NCSU Joint Department of Biomedical Engineering, as well as the Department of Psychology. His primary research interests are brain–computer interfaces, neuroergonomics, rehabilitation engineering, and affective computing.

Mincheol Whang

Mincheol Whang is currently a professor of Media Software at Sangmyung University, Seoul, Korea. He received his M.S. and Ph.D. in Biomedical Engineering from the Georgia Institute of Technology in 1990 and 1994, respectively. He was a senior researcher at the Ergonomics Laboratory of the Korea Research Institute of Science and Standardization from 1994 to 1998. His interests are human–computer interfaces, brain–computer interfaces, and emotion engineering.

Shijing Liu

Shijing Liu is a Ph.D. student in the Edward P. Fitts Department of Industrial and Systems Engineering at North Carolina State University. She obtained her bachelor’s degree in Industrial and Systems Engineering from the China University of Mining and Technology, Beijing, China, and her M.S. from Ohio University. Her research interests include brain–computer interfaces, haptic user interfaces, and smart healthcare engineering.

Matthew Moore

Matthew Moore is a master’s student in the Psychology Department at the University of North Carolina at Wilmington. He obtained his bachelor’s degree in psychology from North Carolina State University in 2012. His research interests include brain–computer interfaces, haptic system design, cognition with respect to working memory and central processing, and physiological and psychological conditions such as fatigue and stress.
