Research Article

Design Guidelines for Schematizing and Rendering Haptically Perceivable Graphical Elements on Touchscreen Devices

Hari P. Palani, Paul D. S. Fink & Nicholas A. Giudice
Pages 1393-1414 | Published online: 29 Apr 2020

ABSTRACT

This paper explores the viability of new touchscreen-based haptic/vibrotactile interactions as a primary modality for perceiving visual graphical elements in eyes-free situations. For touchscreen-based haptic information extraction to be both accurate and meaningful, the onscreen graphical elements should be schematized and downsampled to: (1) maximize the perceptual specificity of touch-based sensing and (2) account for the technical characteristics of touchscreen interfaces. To this end, six human behavioral studies were conducted with 64 blind and 105 blindfolded-sighted participants. Experiments 1–3 evaluated three key rendering parameters that are necessary for supporting touchscreen-based vibrotactile perception of graphical information, with results providing empirical guidance on both minimally detectable and functionally discriminable line widths, inter-line spacing, and angular separation that should be maintained. Experiments 4–6 evaluated perceptually-motivated design guidelines governing visual-to-vibrotactile schematization required for tasks involving information extraction, learning, and cognition of multi-line paths (e.g., transit-maps and corridor-intersections), with results providing clear guidance as to the stimulus parameters maximizing accuracy and temporal performance. The six empirically-validated guidelines presented here, based on results from 169 participants, provide designers and content providers with much-needed guidance on effectively incorporating perceptually-salient touchscreen-based haptic feedback as a primary interaction style for interfaces supporting nonvisual and eyes-free information access.
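As a rough illustration of the interaction style the abstract describes, and not the authors' implementation, the following Kotlin sketch for Android vibrates the device while the touch point lies within a schematized horizontal line's width. The HapticLineView class, the line positions, and the pixel values are hypothetical placeholders; only the Vibrator/VibrationEffect calls are standard Android API.

import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator
import android.view.MotionEvent
import android.view.View
import kotlin.math.abs

// Hypothetical view exposing three horizontal vibrotactile lines.
// All geometry values are illustrative placeholders, not the paper's
// empirically validated thresholds.
class HapticLineView(context: Context) : View(context) {

    private val vibrator =
        context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    private val lineWidthPx = 24f                          // rendered line width
    private val lineYPositions = listOf(300f, 600f, 900f)  // line centers (y, px)

    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_MOVE) {
            // Fire a short pulse while the finger is within half a line
            // width of any line center, i.e., "on" the schematized line.
            val onLine = lineYPositions.any { abs(event.y - it) <= lineWidthPx / 2 }
            if (onLine) {
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
                    vibrator.vibrate(
                        VibrationEffect.createOneShot(20L, VibrationEffect.DEFAULT_AMPLITUDE)
                    )
                } else {
                    @Suppress("DEPRECATION")
                    vibrator.vibrate(20L)
                }
            }
        }
        return true
    }
}

Keying the pulse to ACTION_MOVE means feedback tracks the exploring finger continuously, which is the interaction the vibrotactile line rendering presupposes; in practice the line width, spacing, and pulse parameters would be tuned against the detectability and discriminability thresholds reported in Experiments 1-3.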

Acknowledgments

This project was supported by NSF grants CHS-1425337, ECR/DCL Level 2 1644471, and IIS-1822800.

Additional information

Notes on contributors

Hari P. Palani

Hari P. Palani received his PhD in Spatial Informatics from the University of Maine. He is the Founder and Chief Executive Officer of UNAR Labs. His research interests include haptic perception, multimodal interface design, spatial cognition, and nonvisual graphic accessibility.

Paul D. S. Fink

Paul D. S. Fink is a PhD student in Spatial Information Science and Engineering at the University of Maine. His research intersects user experience, technology education, and accessibility. His current work includes designing a virtual learning platform for autonomous vehicle AI research and development.

Nicholas A. Giudice

Nicholas A. Giudice is a Professor in the School of Computing and Information Science, University of Maine. His research combines techniques from Experimental Psychology and Human-Computer Interaction, with expertise in spatial learning and navigation and in the design and evaluation of multimodal information-access technologies for blind and visually impaired users.
