Tutorial and Synthesis Article

Eye Tracking Research to Answer Questions about Augmentative and Alternative Communication Assessment and Intervention

Pages 106-119 | Received 02 Jul 2013, Accepted 02 Jan 2014, Published online: 23 Apr 2014
 

Abstract

Recently, eye tracking technologies (i.e., technologies that automatically track the point of an individual's gaze while that person views or interacts with a visual image) have become available for research purposes. Based on the sampling of the orientation of the individual's eyes, researchers can quantify which locations within the visual image were fixated (viewed), for how long, and how many times. These automated eye tracking research technologies open up a wealth of avenues for investigating how individuals with developmental or acquired communication disabilities may respond to aided augmentative and alternative communication (AAC) systems. In this paper, we introduce basic terminology and explore some of the special challenges of conducting eye tracking research with populations with disabilities who might use AAC, including challenges of inferring attention from the presence of fixation and challenges related to calibration that may result from participant characteristics, behavioral idiosyncrasies, and/or the number of calibration points. We also examine how the technology can be applied to ask well-structured experimental questions that have direct clinical relevance, with a focus on the unique contributions that eye tracking research can provide by (a) allowing evaluation of skills in individuals who are difficult to assess via traditional methods, and (b) facilitating access to information on underlying visual cognitive processes that is not accessible via traditional behavioral measures.
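As an informal illustration of the kind of fixation summary described above (which locations were viewed, for how long, and how many times), the following Python sketch tallies dwell time and the number of separate looks to rectangular areas of interest (AOIs) from a stream of gaze samples. The AOI names, coordinates, 60 Hz sampling rate, and the simple "new look on AOI entry" rule are assumptions made for illustration only; they are not details taken from the article or from any particular eye tracker's software.

```python
# Illustrative sketch only: AOI names, coordinates, sample data, and the
# 60 Hz sampling rate are hypothetical, not drawn from the article.
from dataclasses import dataclass


@dataclass
class AOI:
    """A rectangular area of interest in screen coordinates."""
    name: str
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def summarize_gaze(samples, aois, sample_interval_s=1 / 60):
    """Tally dwell time (s) and number of separate looks per AOI.

    `samples` is a sequence of (x, y) gaze coordinates recorded at a fixed
    sampling rate; a new "look" is counted each time gaze enters an AOI.
    """
    dwell = {a.name: 0.0 for a in aois}
    looks = {a.name: 0 for a in aois}
    previous_hit = None
    for x, y in samples:
        hit = next((a.name for a in aois if a.contains(x, y)), None)
        if hit is not None:
            dwell[hit] += sample_interval_s
            if hit != previous_hit:
                looks[hit] += 1
        previous_hit = hit
    return dwell, looks


if __name__ == "__main__":
    aois = [
        AOI("target symbol", 0, 0, 200, 200),
        AOI("foil symbol", 300, 0, 500, 200),
    ]
    # A fabricated gaze trace: half a second on the target, a brief shift
    # between the two AOIs, then a quarter second on the foil.
    gaze = [(50.0, 60.0)] * 30 + [(260.0, 100.0)] * 5 + [(350.0, 80.0)] * 15
    dwell, looks = summarize_gaze(gaze, aois)
    for name in dwell:
        print(f"{name}: {dwell[name]:.2f} s across {looks[name]} look(s)")
```

In practice, commercial eye tracking software applies more sophisticated fixation-detection algorithms (e.g., dispersion- or velocity-based filters) before assigning fixations to AOIs; the sketch above is only meant to make the quantities discussed in the abstract concrete.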


Acknowledgements

Many people have contributed to the authors' ever-growing knowledge of eye tracking, including William Dube and Wilkie Wong. The eye tracking research activities of both authors have been supported by NIH P01 HD25995. Support for the first author has also been provided by the Rehabilitation Engineering Research Center on Communication Enhancement, a virtual research center funded by the National Institute on Disability and Rehabilitation Research under Grant H133E030018, and by the Hintz Family Endowed Chair in Children's Communicative Competence at the Pennsylvania State University.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this paper.

Notes

1. Apple Inc., Cupertino, CA, USA. http://www.apple.com/ipad/

2. Tobii Technology, Inc., 510 N. Washington Street, Suite 200, Falls Church, VA 22046, USA.

3. SensoMotoric Instruments (SMI), 28 Atlantic Avenue, Boston, MA 02110, USA. http://www.smivision.com/en/gaze-and-eye-tracking-systems/home.html

4. Applied Science Laboratory, 175 Middlesex Turnpike, Bedford, MA 01730, USA.

5. ISCAN, Inc., 21 Cabot Road, Woburn, MA 01801, USA.
