Special Section: Social Media and Tracking Data

Inferring user tasks in pedestrian navigation from eye movement data in real-world environments

Pages 739-763 | Received 20 Aug 2017, Accepted 27 May 2018, Published online: 26 Jun 2018

References

  • Anagnostopoulos, V., et al., 2017. Gaze-informed location-based services. International Journal of Geographical Information Science, 1–28. doi:10.1080/13658816.2017.1334896
  • Bednarik, R., et al., 2012. What do you want to do next: a novel approach for intent prediction in gaze-based interaction. In: Stephen N. Spencer, ed. Proceedings of the symposium on eye tracking research and applications. Santa Barbara, CA, 83–90.
  • Bednarik, R., et al., 2013. A computational approach for prediction of problem-solving behavior using support vector machines and eye-tracking data. In: Y.I. Nakano, et al., eds. Eye Gaze in Intelligent User Interfaces: gaze-based Analyses, Models and Applications. London: Springer-Verlag London, 111–134.
  • Boisvert, J.F.G. and Bruce, N.D.B., 2016. Predicting task from eye movements: on the importance of spatial distribution, dynamics, and image features. Neurocomputing, 207, 653–668. doi:10.1016/j.neucom.2016.05.047
  • Borji, A. and Itti, L., 2014. Defending Yarbus: eye movements reveal observers’ task. Journal of Vision, 14 (3), 1–22. doi:10.1167/14.3.29
  • Breiman, L., 2001. Random forests. Machine Learning, 45 (1), 5–32. doi:10.1023/A:1010933404324
  • Bulling, A., et al., 2011. Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33 (4), 741–753. doi:10.1109/TPAMI.2010.86
  • Bulling, A., et al., 2013. EyeContext: recognition of high-level contextual cues from human visual behaviour. In: Susanne Bødker, et al., eds. Proceedings of the SIGCHI conference on human factors in computing systems. Paris, France, 305–308.
  • Delikostidis, I., et al., 2015. Overcoming challenges in developing more usable pedestrian navigation systems. Cartography and Geographic Information Science, 43 (3), 189–207. doi:10.1080/15230406.2015.1031180
  • Dong, W., et al., 2014. Eye tracking to explore the potential of enhanced imagery basemaps in web mapping. The Cartographic Journal, 51 (4), 313–329. doi:10.1179/1743277413Y.0000000071
  • Doshi, A. and Trivedi, M.M., 2009. On the roles of eye gaze and head dynamics in predicting driver’s intent to change lanes. IEEE Transactions on Intelligent Transportation Systems, 10 (3), 453–462. doi:10.1109/TITS.2009.2026675
  • Downs, R.M. and Stea, D., 1977. Maps in minds: reflections on cognitive mapping. New York, NY: HarperCollins Publishers.
  • Fabrikant, S.I., et al., 2010. Cognitively inspired and perceptually salient graphic displays for efficient spatial inference making. Annals of the Association of American Geographers, 100 (1), 13–29. doi:10.1080/00045600903362378
  • Franke, C. and Schweikart, J., 2017. Mental representation of landmarks on maps – investigating cartographic visualization methods with eye tracking technology. Spatial Cognition & Computation, 17 (1–2), 20–38. doi:10.1080/13875868.2016.1219912
  • Friedman, J., et al., 2001. The elements of statistical learning. New York, NY: Springer (Springer Series in Statistics).
  • Garreta, R. and Moncecchi, G., 2013. Learning scikit-learn: machine learning in Python. Birmingham, UK: Packt Publishing Ltd.
  • Giannopoulos, I., et al., 2012. GeoGazemarks: providing gaze history for the orientation on small display maps. In: Proceedings of the 14th ACM international conference on multimodal interaction. Santa Monica, CA, 165–172.
  • Giannopoulos, I., et al., 2015. GazeNav: gaze-based pedestrian navigation. In: Proceedings of the 17th international conference on human-computer interaction with mobile devices and services. Copenhagen, Denmark. New York: ACM, 337–346.
  • Gkonos, C., et al., 2017. Maps, vibration or gaze? Comparison of novel navigation assistance in indoor and outdoor environments. Journal of Location Based Services, 1–21. doi:10.1080/17489725.2017.1323125
  • Goldberg, J.H., et al., 2002. Eye tracking in web search tasks: design implications. In: Proceedings of the 2002 symposium on Eye tracking research & applications. New Orleans, LA, 51–58.
  • Goldberg, J.H. and Kotval, X.P., 1999. Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics, 24 (6), 631–645. doi:10.1016/S0169-8141(98)00068-7
  • Golledge, R.G., 1999. Human wayfinding and cognitive maps. In: R.G. Golledge, ed. Wayfinding behavior: cognitive mapping and other spatial processes. Baltimore, MD: The Johns Hopkins University Press, 5–45.
  • Gong, L., et al., 2015. Inferring trip purposes and uncovering travel patterns from taxi trajectory data. Cartography and Geographic Information Science, 43 (2), 103–114. doi:10.1080/15230406.2015.1014424
  • Greene, M.R., et al., 2012. Reconsidering Yarbus: A failure to predict observers’ task from eye movement patterns. Vision Research, 62, 1–8. doi:10.1016/j.visres.2012.03.019
  • Haji-Abolhassani, A. and Clark, J.J., 2014. An inverse Yarbus process: predicting observers’ task from eye movement patterns. Vision Research, 103, 127–142. doi:10.1016/j.visres.2014.08.014
  • Hart, S.G. and Staveland, L.E., 1988. Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. Advances in Psychology, 52, 139–183.
  • Hegarty, M., et al., 2002. Development of a self-report measure of environmental spatial ability. Intelligence, 30 (5), 425–447. doi:10.1016/S0160-2896(02)00116-2
  • Henderson, J.M., 2003. Human gaze control during real-world scene perception. Trends in Cognitive Sciences, 7 (11), 498–504. doi:10.1016/j.tics.2003.09.006
  • Huang, H., et al., 2014. AffectRoute–considering people’s affective responses to environments for enhancing route-planning services. International Journal of Geographical Information Science, 28 (12), 2456–2473. doi:10.1080/13658816.2014.931585
  • Huang, H. and Gartner, G., 2012. Collective intelligence-based route recommendation for assisting pedestrian wayfinding in the era of Web 2.0. Journal of Location Based Services, 6 (1), 1–21. doi:10.1080/17489725.2011.625302
  • Jiang, B. and Yao, X., 2006. Location-based services and GIS in perspective. Computers, Environment and Urban Systems, 30 (6), 712–725. doi:10.1016/j.compenvurbsys.2006.02.003
  • Just, M.A. and Carpenter, P.A., 1976. Eye fixations and cognitive processes. Cognitive Psychology, 8 (4), 441–480. doi:10.1016/0010-0285(76)90015-3
  • Kanan, C., et al., 2014. Predicting an observer’s task using multi-fixation pattern analysis. In: Proceedings of the symposium on eye tracking research and applications. Safety Harbor, FL, 287–290.
  • Kiefer, P., et al., 2013. Using eye movements to recognize activities on cartographic maps. In: Proceedings of the 21st ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems. Orlando, FL, 478–481.
  • Kiefer, P., et al., 2017. Eye tracking for spatial research: cognition, computation, challenges. Spatial Cognition & Computation, 17 (1–2), 1–19. doi:10.1080/13875868.2016.1254634
  • Komogortsev, O.V., et al., 2010. Standardization of automated analyses of oculomotor fixation and saccadic behaviors. IEEE Transactions on Biomedical Engineering, 57 (11), 2635–2645. doi:10.1109/TBME.2010.2057429
  • Lander, C., et al., 2017. Inferring landmarks for pedestrian navigation from mobile eye-tracking data and Google street view. In: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. Denver, CO, 2721–2729.
  • Liao, H., et al., 2017b. Exploring differences of visual attention in pedestrian navigation when using 2D maps and 3D geo-browsers. Cartography and Geographic Information Science, 44 (6), 474–490. doi:10.1080/15230406.2016.1174886
  • Liao, H., et al., 2018. Measuring the influence of map label density on perceived complexity: a user study using eye tracking. Cartography and Geographic Information Science, 1–19. doi:10.1080/15230406.2018.1434016
  • Liao, H. and Dong, W., 2017a. An exploratory study investigating gender effects on using 3D maps for spatial orientation in wayfinding. ISPRS International Journal of Geo-Information, 6 (3), 1–19. doi:10.3390/ijgi6030060
  • Liu, B., et al., 2013. Learning geographical preferences for point-of-interest recommendation. In: Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining. Chicago, Illinois, 1043–1051.
  • Montello, D. and Raubal, M., 2012. Functions and applications of spatial cognition. In: D.W.A.L. Nadel, ed. Handbook of Spatial Cognition. Washington, DC: APA, 249–264.
  • Montello, D.R., 2005. Navigation. In: P. Shah and A. Miyake, eds. The Cambridge handbook of visuospatial thinking. New York, NY: Cambridge University Press, 257–294.
  • Ohm, C., et al., 2014. Where is the landmark? Eye tracking studies in large-scale indoor environments. In: Peter Kiefer, et al., eds. Proceedings of the 2nd international workshop on eye tracking for spatial research (in conjunction with GIScience 2014). Vienna, Austria, 47–51.
  • Ohm, C., et al., 2017. Evaluating indoor pedestrian navigation interfaces using mobile eye tracking. Spatial Cognition & Computation, 17 (1–2), 89–120. doi:10.1080/13875868.2016.1219913
  • Ooms, K., et al., 2012. Investigating the effectiveness of an efficient label placement method using eye movement data. The Cartographic Journal, 49 (3), 234–246. doi:10.1179/1743277412Y.0000000010
  • Rayner, K., 2009. Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology, 62 (8), 1457–1506.
  • Salvucci, D.D. and Goldberg, J.H., 2000. Identifying fixations and saccades in eye-tracking protocols. In: Proceedings of the 2000 symposium on Eye tracking research & applications. 71–78.
  • Schrom-Feiertag, H., et al., 2017. Evaluation of indoor guidance systems using eye tracking in an immersive virtual environment. Spatial Cognition & Computation, 17 (1–2), 163–183. doi:10.1080/13875868.2016.1228654
  • Silverman, B.W., 1986. Density estimation for statistics and data analysis. Boca Raton, FL: CRC Press.
  • Spiers, H.J. and Maguire, E.A., 2008. The dynamic nature of cognition during wayfinding. Journal of Environmental Psychology, 28 (3), 232–249. doi:10.1016/j.jenvp.2008.02.006
  • Steil, J. and Bulling, A., 2015. Discovery of everyday human activities from long-term visual behaviour using topic models. In: Proceedings of the 2015 ACM international joint conference on pervasive and ubiquitous computing. Osaka, Japan, 75–85.
  • Viaene, P., et al., 2016. Examining the validity of the total dwell time of eye fixations to identify landmarks in a building. Journal of Eye Movement Research, 9 (3), 1–11.
  • Wang, S., et al., 2016. Visualizing the intellectual structure of eye movement research in cartography. ISPRS International Journal of Geo-Information, 5 (10), 1–22. doi:10.3390/ijgi5100168
  • Wolfe, J.M., et al., 2003. Changing your mind: on the contributions of top-down and bottom-up guidance in visual search for feature singletons. Journal of Experimental Psychology: Human Perception and Performance, 29 (2), 483–502.
  • Wu, X., et al., 2008. Top 10 algorithms in data mining. Knowledge & Information Systems, 14 (1), 1–37. doi:10.1007/s10115-007-0114-2
  • Yarbus, A.L., 1967. Eye movements and vision. New York, NY: Plenum Press.
