Review Article

A psychotechnological review on eye-tracking systems: towards user experience

Pages 261-281 | Received 03 Jan 2011, Accepted 22 Oct 2011, Published online: 26 Nov 2011

