Robust Eye-Based Dwell-Free Typing