
Controlling a Smartphone Using Gaze Gestures as the Input Mechanism

D. Rozado, T. Moreno, J. San Agustin, F. B. Rodriguez & P. Varona
Pages 34-63 | Published online: 30 Sep 2014
 

Abstract

The emergence of small handheld devices such as tablets and smartphones, often with touch-sensitive surfaces as their only input modality, has spurred a growing interest in the subject of gestures for human–computer interaction (HCI). Previous work has shown that humans can consciously control their eye movements to the extent of performing sequences of predefined movement patterns, or “gaze gestures,” which can be used for HCI purposes on desktop computers. Gaze gestures can be tracked noninvasively using a video-based eye-tracking system. We propose here that gaze gestures can also be an effective input paradigm for interacting with handheld electronic devices. We show through a pilot user study how gaze gestures can be used to interact with a smartphone, how easily potential users assimilate them, and how the Needleman–Wunsch algorithm can effectively discriminate intentional gaze gestures from the typical gaze activity performed during standard interaction with a small smartphone screen. Hence, reliable gaze–smartphone interaction is possible, with accuracy rates above 80 to 90% (depending on whether gaze gestures are performed with or without dwell), negligible false positive rates, and completion times below 1 to 1.5 s per gesture. These encouraging results, together with the low-cost eye-tracking equipment used, suggest the potential of this new HCI modality for interaction with small-screen handheld devices.
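As a high-level illustration of the recognition step, the following minimal sketch shows how a Needleman–Wunsch global alignment score can be used to match a tokenized gaze sample against gesture templates. The gesture alphabet, scoring values, threshold, and template definitions here are illustrative assumptions for exposition, not the parameters or implementation used in the study.

# Minimal Needleman-Wunsch sketch for matching gaze-gesture token sequences.
# Gesture alphabet, scoring values, and threshold are illustrative assumptions.

MATCH, MISMATCH, GAP = 2, -1, -2  # assumed scoring scheme

def needleman_wunsch(seq_a, seq_b):
    """Return the global alignment score of two token sequences."""
    rows, cols = len(seq_a) + 1, len(seq_b) + 1
    # dp[i][j] holds the best score aligning seq_a[:i] with seq_b[:j].
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * GAP
    for j in range(1, cols):
        dp[0][j] = j * GAP
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (MATCH if seq_a[i - 1] == seq_b[j - 1] else MISMATCH)
            dp[i][j] = max(diag, dp[i - 1][j] + GAP, dp[i][j - 1] + GAP)
    return dp[-1][-1]

def recognize(sample, templates, threshold=4):
    """Return the best-scoring gesture name, or None if no score reaches threshold."""
    best_name, best_score = None, threshold - 1
    for name, template in templates.items():
        score = needleman_wunsch(sample, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical templates: tokens are saccade directions (U/D/L/R).
templates = {
    "back": ["L", "R"],
    "scroll": ["U", "D", "U"],
    "select": ["R", "D", "L"],
}

sample = ["R", "D", "D", "L"]        # observed sequence with one spurious token
print(recognize(sample, templates))  # prints 'select' despite the insertion

Because global alignment tolerates insertions and mismatches at a known cost, noisy fixation sequences that only approximately match a template can still be recognized, while unrelated gaze activity falls below the acceptance threshold.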

Notes

Background. This article is based on the bachelor's thesis of the second author and the Ph.D. thesis of the first author.

Funding. This work has been supported by grants MINECO TIN2012-30883 and TIN2010-19607.

HCI Editorial Record. First received August 22, 2012. Revisions received January 7, 2013, and June 10, 2013. Accepted by Brad Myers. Final manuscript received October 24, 2013. — Editor

Additional information

Notes on contributors

D. Rozado

D. Rozado ([email protected], http://www.davidrozado.com/) is a computer scientist with an interest in human–computer interaction, gaze tracking, and brain–computer interfaces. He is a Ph.D. graduate from the Computational Neuroscience group in the Computer Science Department of Universidad Autonoma de Madrid.

T. Moreno

T. Moreno ([email protected]) is a Computer Science student at Universidad Autonoma de Madrid with an interest in gaze tracking.

J. San Agustin

J. San Agustin ([email protected]) is a telecommunications engineer with an interest in gaze tracking. He is a Ph.D. graduate from the Pervasive Computing group at the IT University of Copenhagen.

F. B. Rodriguez

F. B. Rodriguez ([email protected], http://arantxa.ii.uam.es/~frodrig/) is a Professor of Computer Science at Universidad Autonoma de Madrid with an interest in computational neuroscience.

P. Varona

P. Varona ([email protected], http://arantxa.ii.uam.es/~pvarona/) is a Professor of Computer Science at Universidad Autonoma de Madrid with an interest in computational neuroscience.

