Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection
