Original Article

Evaluating the performance of gaze interaction for map target selection

Received 22 Mar 2023, Accepted 20 Mar 2024, Published online: 09 Apr 2024

References

  • Abran, A., Khelifi, A., Suryn, W., & Seffah, A. (2003). Usability meanings and interpretations in ISO standards. Software Quality Journal, 11(4), 325–338. https://doi.org/10.1023/A:1025869312943
  • Alinaghi, N., Kattenbeck, M., Golab, A., & Giannopoulos, I. (2021). Will you take this turn? Gaze-based turning activity recognition during navigation. In K. Janowicz & J. A. Verstegen (Eds.), 11th international conference on Geographic Information Science (pp. 5:1–5:16). Schloss Dagstuhl - Leibniz-Zentrum für Informatik. https://doi.org/10.4230/LIPIcs.GIScience.2021.II.5
  • APA. (2023). Interaction effect. In APA dictionary of psychology. Retrieved October 3, 2023, from https://dictionary.apa.org/interaction-effect
  • Bektaş, K., Çöltekin, A., Krüger, J., Duchowski, A. T., & Fabrikant, S. I. (2019). GeoGCD: Improved visual search via gaze-contingent display. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, Article 84. New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3317959.3321488
  • Blascheck, T., Kurzhals, K., Raschke, M., Burch, M., Weiskopf, D., & Ertl, T. (2017). Visualization of eye tracking data: A taxonomy and survey. Computer Graphics Forum, 36(8), 260–284. https://doi.org/10.1111/cgf.13079
  • Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741–753. https://doi.org/10.1109/tpami.2010.86
  • Burch, M., Kurzhals, K., & Weiskopf, D. (2014). Visual task solution strategies in public transport maps. In P. Kiefer, I. Giannopoulos, M. Raubal, & A. Krüger (Eds.), Proceedings of the 2nd International Workshop on Eye Tracking for Spatial Research co-located with the 8th International Conference on Geographic Information Science (GIScience 2014) (pp. 284–291). Vienna, Austria.
  • Card, S. K., Moran, T. P., & Newell, A. (1983). The psychology of human-computer interaction. L. Erlbaum Associates.
  • Chuang, L., Duchowski, A. T., Qvarfordt, P., & Weiskopf, D. (2019). Ubiquitous gaze sensing and interaction (Dagstuhl seminar 18252). Dagstuhl Reports, 8(6), 77–148. https://doi.org/10.4230/DagRep.8.6.77
  • Ware, C., & Mikaelian, H. H. (1986). An evaluation of an eye tracker as a device for computer input. In Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface, (pp. 183–188). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/29933.275627
  • Çöltekin, A., Fabrikant, S. I., & Lacayo, M. (2010). Exploring the efficiency of users’ visual analytics strategies based on sequence analysis of eye movement recordings. International Journal of Geographical Information Science, 24(10), 1559–1575. https://doi.org/10.1080/13658816.2010.511718
  • Creed, C., Frutos-Pascual, M., & Williams, I. (2020). Multimodal gaze interaction for creative design. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, (pp. 1–13). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3313831.3376196
  • Cybulski, P., & Krassanakis, V. (2022). The effect of map label language on the visual search of cartographic point symbols. Cartography and Geographic Information Science, 49(3), 189–204. https://doi.org/10.1080/15230406.2021.2007419
  • Souto, D., Marsh, O., Hutchinson, C., Judge, S., & Paterson, K. B. (2021). Cognitive plasticity induced by gaze-control technology: Gaze-typing improves performance in the antisaccade task. Computers in Human Behavior, 122, 106831. https://doi.org/10.1016/j.chb.2021.106831
  • Drewes, H., & Schmidt, A. (2007). Interacting with the computer using gaze gestures. In C. Baranauskas, P. Palanque, J. Abascal, & S. D. J. Barbosa (Eds.), Human-computer interaction – INTERACT 2007 (pp. 475–488). Springer. https://doi.org/10.1007/978-3-540-74800-7_43
  • Duchowski, A. T. (2018). Gaze-based interaction: A 30 year retrospective. Computers & Graphics, 73, 59–69. https://doi.org/10.1016/j.cag.2018.04.002
  • Feit, A. M., Williams, S., Toledo, A., Paradiso, A., Kulkarni, H., Kane, S., & Ringel Morris, M. (2017). Toward everyday gaze input: Accuracy and precision of eye tracking and implications for design. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 1118–1130). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3025453.3025599
  • Friedman, M. B. (1983). Eyetracker communication system. In The Seventh Annual Symposium on Computer Applications in Medical Care, (pp. 895–896). Washington, USA: IEEE. https://doi.org/10.1109/SCAMC.1983.764801
  • Giannopoulos, I., Kiefer, P., & Raubal, M. (2012). GeoGazemarks: Providing gaze history for the orientation on small display maps. In Proceedings of the 14th ACM International Conference on Multimodal Interaction (pp. 165–172). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/2388676.2388711
  • Giannopoulos, I., Kiefer, P., & Raubal, M. (2015). GazeNav: Gaze-based pedestrian navigation. In S. Boring, H. Gellersen, E. Rukzio, & K. Hinckley (Eds.), Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 337–346). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/2785830.2785873
  • Göbel, F., Bakogioannis, N., Henggeler, K., Tschümperlin, R., Xu, Y., Kiefer, P., & Raubal, M. (2018). A public gaze-controlled campus map. In Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop (ET4S). ETH Zurich. https://doi.org/10.3929/ETHZ-B-000222491
  • Göbel, F., Kiefer, P., Giannopoulos, I., Duchowski, A. T., & Raubal, M. (2018). Improving map reading with gaze-adaptive legends. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, (pp. 1–9). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3204493.3204544
  • Goldberg, J. H., & Kotval, X. P. (1999). Computer interface evaluation using eye movements: Methods and constructs. International Journal of Industrial Ergonomics, 24(6), 631–645. https://doi.org/10.1016/S0169-8141(98)00068-7
  • Hansen, J. P., Rajanna, V., MacKenzie, I. S., & Bækgaard, P. (2018). A Fitts’ law study of click and dwell interaction by gaze, head and mouse with a head-mounted display. In Proceedings of the Workshop on Communication by Gaze Interaction, Article 7. New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3206343.3206344
  • Harezlak, K., Duliban, A., & Kasprowski, P. (2021). Eye movement-based methods for human-system interaction. A comparison of different approaches. Procedia Computer Science, 192, 3099–3108. https://doi.org/10.1016/j.procs.2021.09.082
  • Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In P. A. Hancock & N. Meshkati (Eds.), Advances in psychology (pp. 139–183). North Holland. https://doi.org/10.1016/S0166-4115(08)62386-9
  • Hinderks, A., Schrepp, M., Domínguez Mayo, F. J., Escalona, M. J., & Thomaschewski, J. (2019). Developing a UX KPI based on the user experience questionnaire. Computer Standards & Interfaces, 65, 38–44. https://doi.org/10.1016/j.csi.2019.01.007
  • Hoppe, S., Loetscher, T., Morey, S. A., & Bulling, A. (2018). Eye movements during everyday behavior predict personality traits. Frontiers in Human Neuroscience, 12, Article 105. https://doi.org/10.3389/fnhum.2018.00105
  • Hou, W. J., & Chen, X. L. (2021). Comparison of eye-based and controller-based selection in virtual reality. International Journal of Human-Computer Interaction, 37(5), 484–495. https://doi.org/10.1080/10447318.2020.1826190
  • Hyrskykari, A., Istance, H., & Vickers, S. (2012). Gaze gestures or dwell-based interaction? In Proceedings of the Symposium on Eye Tracking Research and Applications, (pp. 229–232). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/2168556.2168602
  • ISO 9241-11. (1998). Ergonomic requirements for office work with visual display terminals (VDTs). Part 11, guidance on usability. Retrieved March 27, 2024, from https://www.iso.org/standard/16883.html
  • Isomoto, T., Yamanaka, S., & Shizuki, B. (2023). Exploring dwell-time from human cognitive processes for dwell selection. Proceedings of the ACM on Human-Computer Interaction, 7(ETRA), Article 159. https://doi.org/10.1145/3591128
  • Jacob, R. J. K. (1991). The use of eye movements in human-computer interaction techniques: What you look at is what you get. ACM Transactions on Information Systems, 9(2), 152–169. https://doi.org/10.1145/123078.128728
  • Just, M. A., & Carpenter, P. A. (1976). Eye fixations and cognitive processes. Cognitive Psychology, 8(4), 441–480. https://doi.org/10.1016/0010-0285(76)90015-3
  • Just, M. A., & Carpenter, P. A. (1984). Using eye fixations to study reading comprehension. In D. E. Kieras & M. A. Just (Eds.), New methods in reading comprehension research (pp. 151–182). Routledge.
  • Kasprowski, P., Harezlak, K., & Niezabitowski, M. (2016). Eye movement tracking as a new promising modality for human computer interaction. In 17th International Carpathian Control Conference (ICCC), (pp. 314–318). High Tatras, Slovakia: Institute of Electrical and Electronics Engineers (IEEE). https://doi.org/10.1109/CarpathianCC.2016.7501115
  • Keskin, M., & Kettunen, P. (2023). Potential of eye-tracking for interactive geovisual exploration aided by machine learning. International Journal of Cartography, 9(2), 150–172. https://doi.org/10.1080/23729333.2022.2150379
  • Keskin, M., Ooms, K., Dogru, A. O., & De Maeyer, P. (2020). Exploring the cognitive load of expert and novice map users using EEG and eye tracking. ISPRS International Journal of Geo-Information, 9(7), 429. https://doi.org/10.3390/ijgi9070429
  • Kiefer, P., Giannopoulos, I., Raubal, M., & Duchowski, A. (2017). Eye tracking for spatial research: Cognition, computation, challenges. Spatial Cognition and Computation, 17(1–2), 1–19. https://doi.org/10.1080/13875868.2016.1254634
  • Klamka, K., Siegel, A., Vogt, S., Göbel, F., Stellmach, S., & Dachselt, R. (2015). Look & pedal: Hands-free navigation in zoomable information spaces through gaze-supported foot input. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, (pp. 123–130). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/2818346.2820751
  • Krassanakis, V., & Cybulski, P. (2021). Eye tracking research in cartography: Looking into the future. ISPRS International Journal of Geo-Information, 10(6), 411. https://doi.org/10.3390/ijgi10060411
  • Królak, A., & Strumiłło, P. (2012). Eye-blink detection system for human–computer interaction. Universal Access in the Information Society, 11(4), 409–419. https://doi.org/10.1007/s10209-011-0256-6
  • Kubíček, P., Šašinka, Č., Stachoň, Z., Štěrba, Z., Apeltauer, J., & Urbánek, T. (2017). Cartographic design and usability of visual variables for linear features. The Cartographic Journal, 54(1), 91–102. https://doi.org/10.1080/00087041.2016.1168141
  • Kytö, M., Ens, B., Piumsomboon, T., Lee, G., & Billinghurst, M. (2018). Pinpointing: Precise head- and eye-based target selection for augmented reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, (pp. 1–14). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3173574.3173655
  • Laugwitz, B., Held, T., & Schrepp, M. (2008). Construction and evaluation of a user experience questionnaire. In A. Holzinger (Ed.), HCI and usability for education and work (pp. 63–76). Springer. https://doi.org/10.1007/978-3-540-89350-9_6
  • Liao, H., Dong, W. H., Huang, H. S., Gartner, G., & Liu, H. P. (2019). Inferring user tasks in pedestrian navigation from eye movement data in real-world environments. International Journal of Geographical Information Science, 33(4), 739–763. https://doi.org/10.1080/13658816.2018.1482554
  • Liao, H., Dong, W., Peng, C., & Liu, H. (2017). Exploring differences of visual attention in pedestrian navigation when using 2D maps and 3D geo-browsers. Cartography and Geographic Information Science, 44(6), 474–490. https://doi.org/10.1080/15230406.2016.1174886
  • Liao, H., Wang, X., Dong, W., & Meng, L. (2019). Measuring the influence of map label density on perceived complexity: A user study using eye tracking. Cartography and Geographic Information Science, 46(3), 210–227. https://doi.org/10.1080/15230406.2018.1434016
  • Liao, H., Zhang, C. B., Zhao, W. D., & Dong, W. H. (2022). Toward gaze-based map interactions: Determining the dwell time and buffer size for the gaze-based selection of map features. ISPRS International Journal of Geo-Information, 11(2), 127. https://doi.org/10.3390/ijgi11020127
  • Liao, H., Zhao, W., Zhang, C., Dong, W., & Huang, H. (2022). Detecting individuals’ spatial familiarity with urban environments using eye movement data. Computers, Environment and Urban Systems, 93, 101758. https://doi.org/10.1016/j.compenvurbsys.2022.101758
  • Libet, B., Gleason, C. A., Wright, E. W., & Pearl, D. K. (1983). Time of conscious intention to act in relation to onset of cerebral activity (readiness-potential): The unconscious initiation of a freely voluntary act. Brain: A Journal of Neurology, 106(Pt 3), 623–642. https://doi.org/10.1093/brain/106.3.623
  • Luro, F. L., & Sundstedt, V. (2019). A comparative study of eye tracking and hand controller for aiming tasks in virtual reality. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, Article 68. New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3317956.3318153
  • Majaranta, P., Ahola, U.-K., & Špakov, O. (2009). Fast gaze typing with an adjustable dwell time. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, (pp. 357–360). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/1518701.1518758
  • Majaranta, P., & Räihä, K.-J. (2002). Twenty years of eye typing: Systems and design issues. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, (pp. 15–22). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/507072.507076
  • Møllenbach, E., Hansen, J. P., Lillholm, M., & Gale, A. G. (2009). Single stroke gaze gestures. In CHI ’09 extended abstracts on human factors in computing systems (pp. 4555–4560). Association for Computing Machinery. https://doi.org/10.1145/1520340.1520699
  • Mott, M. E., Williams, S., Wobbrock, J. O., & Ringel Morris, M. (2017). Improving dwell-based gaze typing with dynamic, cascading dwell times. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, (pp. 2558–2570). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3025453.3025517
  • Murata, A. (2006). Eye-gaze input versus mouse: Cursor control as a function of age. International Journal of Human-Computer Interaction, 21(1), 1–14. https://doi.org/10.1207/s15327590ijhc2101_1
  • Najjar, A., Benabid, A. A.-W., Hosny, M., Alrashed, W., & Alrubaian, A. (2021). Usability evaluation of optimized single-pointer Arabic keyboards using eye tracking. Advances in Human-Computer Interaction, 2021, Article 6657155. https://doi.org/10.1155/2021/6657155
  • Niu, Y. F., Gao, Y., Zhang, Y. T., Xue, C. Q., & Yang, L. X. (2019). Improving eye-computer interaction interface design: Ergonomic investigations of the optimum target size and gaze-triggering dwell time. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.8
  • Ober, J. (1997). Application of eye movement measuring system OBER 2 to medicine and technology. Proceedings of SPIE, 3061(1), 327–336. https://doi.org/10.1117/12.280352
  • Ooms, K., De Maeyer, P., Fack, V., Van Assche, E., & Witlox, F. (2012). Interpreting maps through the eyes of expert and novice users. International Journal of Geographical Information Science, 26(10), 1773–1788. https://doi.org/10.1080/13658816.2011.642801
  • Ooms, K., & Krassanakis, V. (2018). Measuring the spatial noise of a low-cost eye tracker to enhance fixation detection. Journal of Imaging, 4(8), 96. https://doi.org/10.3390/jimaging4080096
  • Pai, Y. S., Dingler, T., & Kunze, K. (2019). Assessing hands-free interactions for VR using eye gaze and electromyography. Virtual Reality, 23(2), 119–131. https://doi.org/10.1007/s10055-018-0371-2
  • Paulus, Y. T., & Remijn, G. B. (2021). Usability of various dwell times for eye-gaze-based object selection with eye tracking. Displays, 67, Article 101997. https://doi.org/10.1016/j.displa.2021.101997
  • Penkar, A. M., Lutteroth, C., & Weber, G. (2012). Designing for the eye: Design parameters for dwell in gaze interaction. In Proceedings of the 24th Australian Computer-Human Interaction Conference, (pp. 479–488). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/2414536.2414609
  • Pfeuffer, K., Mayer, B., Mardanbegi, D., & Gellersen, H. (2017). Gaze + pinch interaction in virtual reality. In Proceedings of the 5th Symposium on Spatial User Interaction, (pp. 99–108). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3131277.3132180
  • Kowalczyk, P., & Sawicki, D. J. (2019). Blink and wink detection as a control tool in multimodal interaction. Multimedia Tools and Applications, 78(10), 13749–13765. https://doi.org/10.1007/s11042-018-6554-8
  • Popelka, S., Burian, J., & Beitlova, M. (2022). Swipe versus multiple view: A comprehensive analysis using eye-tracking to evaluate user interaction with web maps. Cartography and Geographic Information Science, 49(3), 252–270. https://doi.org/10.1080/15230406.2021.2015721
  • Rayner, K. (2009). Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology, 62(8), 1457–1506. https://doi.org/10.1080/17470210902816461
  • Riegler, A., Aksoy, B., Riener, A., & Holzmann, C. (2020). Gaze-based interaction with windshield displays for automated driving: Impact of dwell time and feedback design on task performance and subjective workload. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, (pp. 151–160). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3409120.3410654
  • Roth, R. E. (2013a). An empirically-derived taxonomy of interaction primitives for interactive cartography and geovisualization. IEEE Transactions on Visualization and Computer Graphics, 19(12), 2356–2365. https://doi.org/10.1109/TVCG.2013.130
  • Roth, R. E. (2013b). Interactive maps: What we know and what we need to know. Journal of Spatial Information Science, (6), 59–115. https://doi.org/10.5311/JOSIS.2013.6.105
  • Rozado, D., San Agustin, J., Rodriguez, F. B., & Varona, P. (2012). Gliding and saccadic gaze gesture recognition in real time. ACM Transactions on Interactive Intelligent Systems, 1(2), 1–27. https://doi.org/10.1145/2070719.2070723
  • Rudnicki, T. (2014). Eye-control empowers people with disabilities. Tobii ATI. Retrieved April 14, 2022, from https://www.abilities.com/community/assistive-eye-control.html
  • Sarsam, S. M., Al-Samarraie, H., & Alzahrani, A. I. (2021). Influence of personality traits on users’ viewing behaviour. Journal of Information Science, 49(1), 233–247. https://doi.org/10.1177/0165551521998051
  • Schmidbauer-Wolf, G. M., & Guder, M. (2019). Usability and UX of a gaze interaction tool for front seat passengers: Evaluation of a gaze controlled optical feedback system in a car. In Proceedings of Mensch Und Computer 2019, (pp. 677–681). New York, USA: Association for Computing Machinery. https://doi.org/10.1145/3340764.3344890
  • Soon, C. S., Brass, M., Heinze, H.-J., & Haynes, J.-D. (2008). Unconscious determinants of free decisions in the human brain. Nature Neuroscience, 11(5), 543–545. https://doi.org/10.1038/nn.2112
  • Wolfe, J. M., & Horowitz, T. S. (2004). What attributes guide the deployment of visual attention and how do they do it? Nature Reviews Neuroscience, 5(6), 495–501. https://doi.org/10.1038/nrn1411
  • Yi, J. S., Ah Kang, Y., Stasko, J. T., & Jacko, J. A. (2007). Toward a deeper understanding of the role of interaction in information visualization. IEEE Transactions on Visualization and Computer Graphics, 13(6), 1224–1231. https://doi.org/10.1109/tvcg.2007.70515
  • Zhu, L., Wang, S., Yuan, W., Dong, W., & Liu, J. (2020). An interactive map based on gaze control [In Chinese]. Geomatics and Information Science of Wuhan University, 45(5), 736–743. https://doi.org/10.13203/j.whugis20190165
