References
- Andrews, T. J., & Coppola, D. M. (1999). Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments. Vision Research, 39(17), 2947–2953. https://doi.org/10.1016/S0042-6989(99)00019-X
- Bargary, G., Bosten, J. M., Goodbourn, P. T., Lawrance-Owen, A. J., Hogg, R. E., & Mollon, J. (2017). Individual differences in human eye movements: An oculomotor signature? Vision Research, 141, 157–169. https://doi.org/10.1016/j.visres.2017.03.001
- Bar-Zeev, A. (2019). The Eyes Are the Prize: Eye-Tracking Technology Is Advertising’s Holy Grail. Vice. https://www.vice.com/en/article/bj9ygv/the-eyes-are-the-prize-eye-tracking-technology-is-advertisings-holy-grail
- Bayat, A., & Pomplun, M. (2017). Biometric identification through eye-movement patterns. In D. N. Cassenti (Ed.), Advances in human factors in simulation and modeling (Advances in Intelligent Systems and Computing, Vol. 591, pp. 583–594). Springer International Publishing.
- Bednarik, R., Kinnunen, T., Mihaila, A., & Fränti, P. (2005). Eye-Movements as a biometric. In H. Kalviainen, J. Parkkinen, & A. Kaarna (Eds.), SCIA: Scandinavian Conference on Image Analysis (pp. 780–789). Springer Berlin Heidelberg.
- Bertin, J. (1983). Semiology of graphics: Diagrams, networks, maps. University of Wisconsin Press. (Original work published in French, 1967).
- Boisvert, J. F. G., & Bruce, N. D. B. (2016). Predicting task from eye movements: On the importance of spatial distribution, dynamics, and image features. Neurocomputing, 207, 653–668. https://doi.org/10.1016/j.neucom.2016.05.047
- Borji, A., & Itti, L. (2014). Defending Yarbus: Eye movements reveal observers’ task. Journal of Vision, 14(3), 1–22. https://doi.org/10.1167/14.3.29
- Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5–32. https://doi.org/10.1023/A:1010933404324
- Bulling, A., Ward, J. A., Gellersen, H., & Troster, G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741–753. https://doi.org/10.1109/TPAMI.2010.86
- Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2009). Eye movement analysis for activity recognition. Proceedings of the 11th International Conference on Ubiquitous Computing (pp. 41–50). ACM.
- Cantoni, V., Galdi, C., Nappi, M., Porta, M., & Riccio, D. (2015). GANT: Gaze analysis technique for human identification. Pattern Recognition, 48(4), 1027–1038. https://doi.org/10.1016/J.PATCOG.2014.02.017
- Castelhano, M. S., & Henderson, J. M. (2008). Stable individual differences across images in human saccadic eye movements. Canadian Journal of Experimental Psychology, 62(1), 1–14. https://doi.org/10.1037/1196-1961.62.1.1
- Chen, T., & Guestrin, C. (2016). XGBoost: A scalable tree boosting system. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 785–794). Association for Computing Machinery.
- Christ, M., Kempa-Liehr, A. W., & Feindt, M. (2017). Distributed and parallel time series feature extraction for industrial big data applications. https://arxiv.org/pdf/1610.07717
- Chuang, L. L., Duchowski, A. T., Qvarfordt, P., & Weiskopf, D. (2019). Ubiquitous Gaze Sensing and Interaction (Dagstuhl Seminar 18252). Schloss Dagstuhl – Leibniz-Zentrum für Informatik. https://drops.dagstuhl.de/opus/volltexte/2019/10057/
- Çöltekin, A., Fabrikant, S. I., & Lacayo, M. (2010). Exploring the efficiency of users’ visual analytics strategies based on sequence analysis of eye movement recordings. International Journal of Geographical Information Science, 24(10), 1559–1575. https://doi.org/10.1080/13658816.2010.511718
- Davies, C., & Peebles, D. (2010). Spaces or scenes: Map-based orientation in urban environments. Spatial Cognition and Computation, 10(2–3), 135–156. https://doi.org/10.1080/13875861003759289
- DELL. (2021). Alienware m17 Gaming Laptop with Tobii Eye Tracking. https://www.dell.com/en-us/shop/dell-laptops/alienware-m17-r2-gaming-laptop/spd/alienware-m17-r2-laptop
- Dong, W., Liao, H., Liu, B., Zhan, Z., Liu, H., Meng, L., & Liu, Y. (2020a). Comparing pedestrians’ gaze behavior in desktop and in real environments. Cartography and Geographic Information Science, 47(5), 432–451. https://doi.org/10.1080/15230406.2020.1762513
- Dong, W., Zhan, Z., Liao, H., Meng, L., & Liu, J. (2020b). Assessing similarities and differences between males and females in visual behaviors in spatial orientation tasks. ISPRS International Journal of Geo-Information, 9(2), 115. https://doi.org/10.3390/ijgi9020115
- Dong, W., Zheng, L., Liu, B., & Meng, L. (2018). Using eye tracking to explore differences in map-based spatial ability between geographers and non-geographers. ISPRS International Journal of Geo-Information, 7(9), 337. https://doi.org/10.3390/ijgi7090337
- Duchowski, A. T., Krejtz, K., Krejtz, I., Biele, C., Niedzielska, A., Kiefer, P., Raubal, M., & Giannopoulos, I. (2018). The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation. In R. Mandryk & M. Hancock (Eds.), Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Paper 282). Association for Computing Machinery.
- Taylor & Francis. (2020). Data Sharing Policy. https://www.tandfonline.com/action/authorSubmission?show=instructions&journalCode=tcag20#dsp
- George, A., & Routray, A. (2016). A score level fusion method for eye movement biometrics. Pattern Recognition Letters, 82, 207–215. https://doi.org/10.1016/j.patrec.2015.11.020
- Göbel, F., Kurzhals, K., Raubal, M., & Schinazi, V. R. (2020). Gaze-aware mixed-reality: Addressing privacy issues with eye tracking. In R. Bernhaupt, F. F. Mueller, D. Verweij, & J. Andres (Eds.), CHI 2020 Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI. ACM.
- Griffith, H. K., Lohr, D. J., Abdulin, E., & Komogortsev, O. (2020). GazeBase: A large-scale, multi-stimulus, longitudinal eye movement dataset. Scientific Data, 8(184), 1–9. https://doi.org/10.1038/s41597-021-00959-y
- Holland, C., & Komogortsev, O. V. (2011). Biometric identification via eye movement scanpaths in reading. 2011 International Joint Conference on Biometrics (IJCB) (pp. 1–8). IEEE.
- Hoppe, S., Loetscher, T., Morey, S. A., & Bulling, A. (2018). Eye movements during everyday behavior predict personality traits. Frontiers in Human Neuroscience, 12(105), 1–8. https://doi.org/10.3389/FNHUM.2018.00105
- HTC. (2021). VIVE Pro Eye Office. Retrieved May 15, 2021, from https://business.vive.com/us/product/vive-pro-eye-office/
- Jain, A. K., Nandakumar, K., & Ross, A. (2016). 50 years of biometric research: Accomplishments, challenges, and opportunities. Pattern Recognition Letters, 79(79), 80–105. https://doi.org/10.1016/J.PATREC.2015.12.013
- Kasprowski, P. (2013). The impact of temporal proximity between samples on eye movement biometric identification. In K. Saeed, R. Chaki, A. Cortesi, & S. Wierzchoń (Eds.), IFIP International Conference on Computer Information Systems and Industrial Management (pp. 77–87). Springer Berlin Heidelberg.
- Kasprowski, P., & Ober, J. (2004). Eye movements in biometrics. In D. Maltoni & A. K. Jain (Eds.), Biometric authentication (pp. 248–258). Springer Berlin Heidelberg.
- Kasprowski, P., & Rigas, I. (2013). The influence of dataset quality on the results of behavioral biometric experiments. In A. Brömme & C. Busch (Eds.), 2013 International Conference of the BIOSIG Special Interest Group (BIOSIG) (pp. 1–8). IEEE.
- Katsini, C., Abdrabou, Y., Raptis, G. E., Khamis, M., & Alt, F. (2020). The role of eye gaze in security and privacy applications: Survey and future HCI research directions. In R. Bernhaupt, F. F. Mueller, D. Verweij, & J. Andres (Eds.), Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–21). ACM.
- Keskin, M., Ooms, K., Dogru, A. O., & Maeyer, P. D. (2020). Exploring the cognitive load of expert and novice map users using EEG and eye tracking. ISPRS International Journal of Geo-Information, 9(7), 429. https://doi.org/10.3390/IJGI9070429
- Kiefer, P., Giannopoulos, I., Duchowski, A., & Raubal, M. (2016). Measuring cognitive load for map tasks through pupil diameter. In J. A. Miller, D. O’Sullivan, & N. Wiegand (Eds.), Geographic information science. GIScience 2016. Lecture notes in computer science (pp. 323–337). Springer.
- Kiefer, P., Giannopoulos, I., & Raubal, M. (2013). Using eye movements to recognize activities on cartographic maps. In C. Knoblock & M. Schneider (Eds.), Proceedings of the 21st ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (pp. 478–481). ACM.
- Kiefer, P., Giannopoulos, I., Raubal, M., & Duchowski, A. T. (2017). Eye tracking for spatial research: Cognition, computation, challenges. Spatial Cognition and Computation, 17(1–2), 1–19. https://doi.org/10.1080/13875868.2016.1254634
- Kiefer, P., Zhang, Y., & Bulling, A. (2015). The 5th international workshop on pervasive eye tracking and mobile eye-based interaction. In K. Mase, M. Langheinrich, & D. Gatica-Perez (Eds.), International Symposium on Wearable Computers (pp. 825–828). ACM.
- Kim, J., Kwan, M. P., Levenstein, M. C., & Richardson, D. B. (2021). How do people perceive the disclosure risk of maps? Examining the perceived disclosure risk of maps and its implications for geoprivacy protection. Cartography and Geographic Information Science, 48(1), 2–20. https://doi.org/10.1080/15230406.2020.1794976
- Kinnunen, T., Sedlak, F., & Bednarik, R. (2010). Towards task-independent person authentication using eye movement signals. In C. H. Morimoto & H. Istance (Eds.), Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (pp. 187–190). ACM.
- Krejtz, K., Duchowski, A. T., Niedzielska, A., Biele, C., & Krejtz, I. (2018). Eye tracking cognitive load using pupil diameter and microsaccades with fixed gaze. PloS One, 13(9), e0203629. https://doi.org/10.1371/journal.pone.0203629
- Kwan, M.-P., Casas, I., & Schmitz, B. (2004). Protection of geoprivacy and accuracy of spatial information: How effective are geographical masks? Cartographica: The International Journal for Geographic Information and Geovisualization, 39(2), 15–28. https://doi.org/10.3138/x204-4223-57mk-8273
- Kwok, T. C. K., Kiefer, P., Schinazi, V. R., Adams, B., & Raubal, M. (2019). Gaze-guided narratives: Adapting audio guide content to gaze in virtual and real environments. In S. Brewster & G. Fitzpatrick (Eds.), CHI ’19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. ACM.
- Li, C., Xue, J., Quan, C., Yue, J., & Zhang, C. (2018). Biometric recognition via texture features of eye movement trajectories in a visual searching task. PloS One, 13(4), e0194475. https://doi.org/10.1371/journal.pone.0194475
- Liang, Z., Tan, F., & Chi, Z. (2012). Video-based biometric identification using eye tracking technique. In K. K. M. Lam & J. Huang (Eds.), 2012 IEEE International Conference on Signal Processing, Communication and Computing (ICSPCC 2012) (pp. 728–733). IEEE.
- Liao, H., & Dong, W. (2017). An exploratory study investigating gender effects on using 3D maps for spatial orientation in wayfinding. ISPRS International Journal of Geo-Information, 6(3), 1–19. https://doi.org/10.3390/ijgi6030060
- Liao, H., Dong, W., Huang, H., Gartner, G., & Liu, H. (2019). Inferring user tasks in pedestrian navigation from eye movement data in real-world environments. International Journal of Geographical Information Science, 33(4), 739–763. https://doi.org/10.1080/13658816.2018.1482554
- Liao, H., Dong, W., Peng, C., & Liu, H. (2017). Exploring differences of visual attention in pedestrian navigation when using 2D maps and 3D geo-browsers. Cartography and Geographic Information Science, 44(6), 474–490. https://doi.org/10.1080/15230406.2016.1174886
- Libet, B., Gleason, C. A., Wright, E. W., & Pearl, D. K. (1983). Time of conscious intention to act in relation to onset of cerebral activity (readiness-potential): The unconscious initiation of a freely voluntary act. Brain, 106(3), 623–642. https://doi.org/10.1093/BRAIN/106.3.623
- MacEachren, A. M. (1995). How maps work: Representation, visualization, and design. Guilford Press.
- Madsen, J., Júlio, S. U., Gucik, P. J., Steinberg, R., & Parra, L. C. (2021). Synchronized eye movements predict test scores in online video education. Proceedings of the National Academy of Sciences of the United States of America, 118(5), e2016980118. https://doi.org/10.1073/PNAS.2016980118
- Maggi, S., Fabrikant, S. I., Imbert, J.-P., & Hurter, C. (2015). How do display design and user characteristics matter in animations?: An empirical study with air traffic control displays. Cartographica: The International Journal for Geographic Information and Geovisualization, 51(1), 25–37. https://doi.org/10.3138/CART.3176
- Microsoft. (2020). HoloLens 2: A new reality for computing. See new ways to work better together with the ultimate mixed reality device. https://www.microsoft.com/en-us/hololens
- Montello, D. R., Lovelace, K. L., Golledge, R. G., & Self, C. M. (1999). Sex-related differences and similarities in geographic and environmental spatial abilities. Annals of the Association of American Geographers, 89(3), 515–534. https://doi.org/10.1111/0004-5608.00160
- Ooms, K., De Maeyer, P., Fack, V., Van Assche, E., & Witlox, F. (2012). Interpreting maps through the eyes of expert and novice users. International Journal of Geographical Information Science, 26(10), 1773–1788. https://doi.org/10.1080/13658816.2011.642801
- Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., & Duchesnay, É. (2011). Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12(85), 2825–2830.
- Popelka, S., & Brychtova, A. (2013). Eye-tracking study on different perception of 2D and 3D terrain visualisation. The Cartographic Journal, 50(3), 240–246. https://doi.org/10.1179/1743277413Y.0000000058
- Poynter, W., Barber, M., Inman, J., & Wiggins, C. (2013). Individuals exhibit idiosyncratic eye-movement behavior profiles across tasks. Vision Research, 89, 32–38. https://doi.org/10.1016/j.visres.2013.07.002
- Rao, J., Gao, S., Li, M., & Huang, Q. (2021). A privacy-preserving framework for location recommendation using decentralized collaborative machine learning. Transactions in GIS, 25, 1153–1175. https://doi.org/10.1111/tgis.12769
- Rayner, K. (2009). Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology, 62(8), 1457–1506. https://doi.org/10.1080/17470210902816461
- Rigas, I., & Komogortsev, O. V. (2014). Biometric recognition via probabilistic spatial projection of eye movement trajectories in dynamic visual environments. IEEE Transactions on Information Forensics and Security, 9(10), 1743–1754. https://doi.org/10.1109/TIFS.2014.2350960
- Rigas, I., & Komogortsev, O. V. (2017). Current research in eye movement biometrics: An analysis based on BioEye 2015 competition. Image and Vision Computing, 58, 129–141. https://doi.org/10.1016/j.imavis.2016.03.014
- Saeed, U. (2016). Eye movements during scene understanding for biometric identification. Pattern Recognition Letters, 82(2), 190–195. https://doi.org/10.1016/j.patrec.2015.06.019
- Salvucci, D. D., & Goldberg, J. H. (2000). Identifying fixations and saccades in eye-tracking protocols. In A. T. Duchowski (Ed.), Proceedings of the 2000 Symposium on Eye Tracking Research & Applications (pp. 71–78). ACM.
- Schröder, C., Zaidawi, S. M. K. A., Prinzler, M. H. U., Maneth, S., & Zachmann, G. (2020). Robustness of eye movement biometrics against varying stimuli and varying trajectory length. In R. Bernhaupt & F. F. Mueller (Eds.), Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–7). ACM.
- Seidl, D. E., Jankowski, P., & Tsou, M.-H. (2016). Privacy and spatial pattern preservation in masked GPS trajectory data. International Journal of Geographical Information Science, 30(4), 785–800. https://doi.org/10.1080/13658816.2015.1101767
- Soon, C. S., Brass, M., Heinze, H.-J., & Haynes, J.-D. (2008). Unconscious determinants of free decisions in the human brain. Nature Neuroscience, 11(5), 543–545. https://doi.org/10.1038/nn.2112
- Steil, J., Hagestedt, I., Huang, M. X., & Bulling, A. (2019). Privacy-aware eye tracking using differential privacy. In K. Krejtz & B. Sharif (Eds.), ETRA ’19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. ACM.
- Tavenard, R., Faouzi, J., Vandewiele, G., Divo, F., Androz, G., Holtz, C., Payne, M., Yurchak, R., Rußwurm, M., Kolar, K., & Woods, E. (2020). Tslearn, a machine learning toolkit for time series data. Journal of Machine Learning Research, 21(118), 1–6. https://jmlr.org/papers/v21/20-091.html
- Tobii. (2012). The Tobii I-VT Fixation Filter: Algorithm description. https://www.tobiipro.com/siteassets/tobii-pro/learn-and-support/analyze/how-do-we-classify-eye-movements/tobii-pro-i-vt-fixation-filter.pdf
- Tobii. (2020). Tobii Eye Tracker 5 - The Next Generation of Head and Eye Tracking. https://gaming.tobii.com/product/eye-tracker-5/
- Tobii. (2021). Are pupil size calculations possible with Tobii Eye Trackers? https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/is-pupil-size-calculations-possible-with-tobii-eye-trackers/
- Wang, C., Chen, Y., Zheng, S., & Liao, H. (2018). Gender and age differences in using indoor maps for wayfinding in real environments. ISPRS International Journal of Geo-Information, 8(1), 11. https://doi.org/10.3390/ijgi8010011
- White, J. P., Dennis, S., Tomko, M., Bell, J., & Winter, S. (2021). Paths to social licence for tracking-data analytics in university research and services. PloS One, 16(5), e0251964. https://doi.org/10.1371/journal.pone.0251964
- Yarbus, A. L. (1967). Eye movements and vision. Plenum Press.
- Ying, S., Zhuang, Y., Huang, L., Wang, H., & Yin, Z. (2020). Analysis of the correlation between spatial cognitive abilities and wayfinding decisions in 3D digital environments. Behaviour & Information Technology, 40(8), 809–820. https://doi.org/10.1080/0144929X.2020.1726468
- Zheng, W.-L., Dong, B.-N., & Lu, B.-L. (2014). Multimodal emotion recognition using EEG and eye tracking data. International Conference of the IEEE Engineering in Medicine and Biology Society (pp. 5040–5043). IEEE.