Abstract
This article presents a series of user studies conducted to develop a new eye-gaze tracking–based pointing system. We developed a new target prediction model that works across different input modalities and combined eye-gaze tracking–based pointing with a joystick controller to reduce pointing and selection times. The system finds important applications in the cockpits of combat aircraft and for novice computer users. User studies confirmed that users can point and select significantly faster with this new eye-gaze tracking–based system than with existing input devices, for both military and everyday computing tasks. The study also found that the amplitude of the maximum power component obtained through a Fourier Transform of the pupil signal correlates significantly with selection times and with users' perceived cognitive load in terms of Task Load Index scores.
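The pupil-based measure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a uniformly sampled pupil-diameter trace and takes the amplitude of the strongest non-DC frequency component of its spectrum. The function name, sample rate, and synthetic trace are all hypothetical.

```python
import numpy as np

def dominant_power_amplitude(pupil_signal, sample_rate_hz):
    """Return (amplitude, frequency) of the maximum-power component
    of a pupil-diameter signal, excluding the DC (zero-frequency) bin."""
    signal = np.asarray(pupil_signal, dtype=float)
    signal = signal - signal.mean()            # remove the DC offset
    spectrum = np.fft.rfft(signal)
    power = np.abs(spectrum) ** 2
    peak = int(np.argmax(power[1:])) + 1       # skip the zero-frequency bin
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate_hz)
    return np.abs(spectrum[peak]), freqs[peak]

# Hypothetical example: a 60 Hz pupil trace with a 1.5 Hz oscillation
t = np.arange(0, 10, 1 / 60)
trace = 4.0 + 0.3 * np.sin(2 * np.pi * 1.5 * t)
amp, freq = dominant_power_amplitude(trace, 60)
```

In the reported study, this amplitude would be computed per trial and correlated against selection time and TLX score; the correlation step itself is standard and omitted here.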
Notes on contributors
Pradipta Biswas
Pradipta Biswas, Ph.D., is a Senior Research Associate at the University of Cambridge. His research addresses a wide range of human–machine interaction issues, ranging from designing user models for people with severe physical impairment to proposing new interaction techniques for Eurofighter Typhoon pilots.
Pat Langdon
Pat Langdon is a Principal Research Associate in the Inclusive Design Group of the Engineering Design Centre at the University of Cambridge. His research has contributed to cognitive science, artificial intelligence, robotics, and psychophysical studies of the human visual system.