Research Article

User-defined mid-air gestures for multiscale GIS interface interaction

Pages 481–494 | Received 06 Aug 2022, Accepted 20 Feb 2023, Published online: 02 Mar 2023

References

  • Anagun, Y., Isik, S., & Seke, E. (2019). SRLibrary: Comparing different loss functions for super-resolution over various convolutional architectures. Journal of Visual Communication and Image Representation, 61, 178–187. https://doi.org/10.1016/j.jvcir.2019.03.027
  • Arefin Shimon, S. S., Lutton, C., Xu, Z., Morrison-Smith, S., Boucher, C., & Ruiz, J. (2016). Exploring non-touchscreen gestures for smartwatches. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/2858036.2858385
  • Austin, C. R., Ens, B., Satriadi, K. A., & Jenny, B. (2020). Elicitation study investigating hand and foot gesture interaction for immersive maps in augmented reality. Cartography and Geographic Information Science, 47(3), 214–228. https://doi.org/10.1080/15230406.2019.1696232
  • Bartoschek, T., Pape, G., Kray, C., Jones, J., & Kauppinen, T. (2013). Gestural interaction with spatiotemporal linked open data. Free and Open Source Software for Geospatial (FOSS4G) Conference Proceedings, 13(11), 9. https://doi.org/10.7275/r57d2sbm
  • Cai, G., Sharma, R., MacEachren, A. M., & Brewer, I. (2006). Human-GIS interaction issues in crisis response. International Journal of Risk Assessment and Management, 6(4–6), 388. https://doi.org/10.1504/ijram.2006.009538
  • Chen, Z., Ma, X., Peng, Z., Zhou, Y., Yao, M., Ma, Z., Wang, C., Gao, Z., & Shen, M. (2017). User-defined gestures for gestural interaction: Extending from hands to other body parts. International Journal of Human–Computer Interaction, 34(3), 238–250. https://doi.org/10.1080/10447318.2017.1342943
  • Filho, J. A. W., Stuerzlinger, W., & Nedel, L. (2020). Evaluating an immersive space-time cube geovisualization for intuitive trajectory data exploration. IEEE Transactions on Visualization and Computer Graphics, 26(1), 514–524. https://doi.org/10.1109/tvcg.2019.2934415
  • Florence, J., Hornsby, K., & Egenhofer, M. J. (1998). The GIS wallboard: Interactions with spatial information on large-scale displays.
  • Furnas, G. W., Landauer, T. K., Gomez, L. M., & Dumais, S. T. (1987). The vocabulary problem in human-system communication. Communications of the ACM, 30(11), 964–971. https://doi.org/10.1145/32206.32212
  • Giannopoulos, I., Komninos, A., & Garofalakis, J. (2017). Natural interaction with large map interfaces in VR. Proceedings of the 21st Pan-Hellenic Conference on Informatics. https://doi.org/10.1145/3139367.3139424
  • Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action. Journal of Motor Behavior, 19(4), 486–517. https://doi.org/10.1080/00222895.1987.10735426
  • Huang, X., Song, Y., & Hu, X. (2021). Deploying spatial data for coastal community resilience: A review from the managerial perspective. International Journal of Environmental Research and Public Health, 18(2), 830. https://doi.org/10.3390/ijerph18020830
  • Johnson-Glenberg, M. C. (2018). Immersive VR and education: Embodied design principles that include gesture and hand controls. Frontiers in Robotics and AI, 5, Article 81. https://doi.org/10.3389/frobt.2018.00081
  • Kelly, J. W., Cherep, L. A., Klesel, B., Siegel, Z. D., & George, S. (2018). Comparison of two methods for improving distance perception in virtual reality. ACM Transactions on Applied Perception, 15(2), 1–11. https://doi.org/10.1145/3165285
  • Kenyon, R. V., Sandin, D., Smith, R. C., Pawlicki, R., & DeFanti, T. (2007). Size-constancy in the CAVE. Presence: Teleoperators and Virtual Environments, 16(2), 172–187. https://doi.org/10.1162/pres.16.2.172
  • Lee, D. S. (2012). Preferred viewing distance of liquid crystal high-definition television. Applied Ergonomics, 43(1), 151–156. https://doi.org/10.1016/j.apergo.2011.04.007
  • Lee, Y., Choi, W., & Sohn, B. S. (2018). Immersive gesture interfaces for 3D map navigation in HMD-based virtual environments. 2018 International Conference on Information Networking (ICOIN). https://doi.org/10.1109/icoin.2018.8343267
  • Lee, D. S., & Shen, I. H. (2012). Effects of illumination conditions on preferred viewing distance of portable liquid-crystal television. Journal of the Society for Information Display, 20(7), 360–366. https://doi.org/10.1002/jsid.95
  • Lu, W., Tong, Z., & Chu, J. (2016). Dynamic hand gesture recognition with Leap Motion controller. IEEE Signal Processing Letters, 23(9), 1188–1192. https://doi.org/10.1109/lsp.2016.2590470
  • Małecki, K., Nowosielski, A., & Kowalicki, M. (2020). Gesture-based user interface for vehicle on-board system: A questionnaire and research approach. Applied Sciences, 10(18), 6620. https://doi.org/10.3390/app10186620
  • Morris, M. R., Wobbrock, J. O., & Wilson, A. D. (2010). Understanding users’ preferences for surface gestures. In Proceedings of Graphics Interface 2010 (pp. 261–268). https://doi.org/10.5555/1839214.1839260
  • Nancel, M., Wagner, J., Pietriga, E., Chapuis, O., & Mackay, W. (2011). Mid-air pan-and-zoom on wall-sized displays. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/1978942.1978969
  • Nanjappan, V., Liang, H. N., Lu, F., Papangelis, K., Yue, Y., & Man, K. L. (2018). User-elicited dual-hand interactions for manipulating 3D objects in virtual reality environments. Human-Centric Computing and Information Sciences, 8(1). https://doi.org/10.1186/s13673-018-0154-5
  • Nestorov, N., Hughes, P., Healy, N., Sheehy, N., & O’Hare, N. (2016). Application of natural user interface devices for touch-free control of radiological images during surgery. 2016 IEEE 29th International Symposium on Computer-Based Medical Systems (CBMS). https://doi.org/10.1109/cbms.2016.20
  • Oviatt, S. (1996). Multimodal interfaces for dynamic interactive maps. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Common Ground (CHI ’96). https://doi.org/10.1145/238386.238438
  • Pereira, A., Wachs, J. P., Park, K., & Rempel, D. (2014). A user-developed 3-D hand gesture set for human–computer interaction. Human Factors: The Journal of the Human Factors and Ergonomics Society, 57(4), 607–621. https://doi.org/10.1177/0018720814559307
  • Piumsomboon, T., Clark, A., Billinghurst, M., & Cockburn, A. (2013). User-defined gestures for augmented reality. CHI ’13 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’13). https://doi.org/10.1145/2468356.2468527
  • Rauschert, I., Agrawal, P., Sharma, R., Fuhrmann, S., Brewer, I., & MacEachren, A. (2002). Designing a human-centered, multimodal GIS interface to support emergency management. Proceedings of the Tenth ACM International Symposium on Advances in Geographic Information Systems (GIS ’02). https://doi.org/10.1145/585147.585172
  • Renner, R. S., Velichkovsky, B. M., & Helmert, J. R. (2013). The perception of egocentric distances in virtual environments - a review. ACM Computing Surveys, 46(2), 1–40. https://doi.org/10.1145/2543581.2543590
  • Rossol, N., Cheng, I., Shen, R., & Basu, A. (2014). Touchfree medical interfaces. 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. https://doi.org/10.1109/embc.2014.6945140
  • Rudi, D., Giannopoulos, I., Kiefer, P., Peier, C., & Raubal, M. (2016). Interacting with maps on optical head-mounted displays. Proceedings of the 2016 Symposium on Spatial User Interaction. https://doi.org/10.1145/2983310.2985747
  • Sampson, H., Kelly, D., Wünsche, B. C., & Amor, R. (2018). A hand gesture set for navigating and interacting with 3D virtual environments. 2018 International Conference on Image and Vision Computing New Zealand (IVCNZ). https://doi.org/10.1109/ivcnz.2018.8634656
  • Sathiyanarayanan, M., & Mulling, T. (2015). Map navigation using hand gesture recognition: A case study using MYO connector on Apple Maps. Procedia Computer Science, 58, 50–57. https://doi.org/10.1016/j.procs.2015.08.008
  • Satriadi, K. A., Ens, B., Cordeil, M., Jenny, B., Czauderna, T., & Willett, W. (2019). Augmented reality map navigation with freehand gestures. 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). https://doi.org/10.1109/vr.2019.8798340
  • Stanciulescu, A., Vanderdonckt, J., & Macq, B. (2008). Automatic usability assessment of multimodal user interfaces based on ergonomic rules.
  • Stephen, D. K., & David, R. G. (2004). Pencil out, stylus in: Geospatial technologies give coastal fieldwork a new dimension. Geography, 89, 58–70. https://www.jstor.org/stable/40573912
  • Sun, F. (2015). Exploring intent-driven multimodal interface for geographical information system. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction. https://doi.org/10.1145/2818346.2823304
  • Sun, Y., Li, C., Li, G., Jiang, G., Jiang, D., Liu, H., Zheng, Z., & Shu, W. (2018). Gesture recognition based on Kinect and sEMG signal fusion. Mobile Networks and Applications, 23(4), 797–805. https://doi.org/10.1007/s11036-018-1008-0
  • Trong, K. N., Bui, H., & Pham, C. (2019). Recognizing hand gestures for controlling home appliances with mobile sensors. 2019 11th International Conference on Knowledge and Systems Engineering (KSE). https://doi.org/10.1109/kse.2019.8919419
  • Urakami, J. (2014). Cross-cultural comparison of hand gestures of Japanese and Germans for tabletop systems. Computers in Human Behavior, 40, 180–189. https://doi.org/10.1016/j.chb.2014.08.010
  • Vaidya, O., Jadhav, K., Ingale, L., & Chaudhari, R. (2019). Hand gesture based music player control in vehicle. 2019 IEEE 5th International Conference for Convergence in Technology (I2CT). https://doi.org/10.1109/i2ct45611.2019.9033708
  • Vanderdonckt, J., Magrofuoco, N., Kieffer, S., Pérez, J., Rase, Y., Roselli, P., & Villarreal, S. (2019). Head and shoulders gestures: Exploring user-defined gestures with upper body. Design, User Experience, and Usability. User Experience in Advanced Technological Environments, 192–213. https://doi.org/10.1007/978-3-030-23541-3_15
  • Vogiatzidakis, P., & Koutsabasis, P. (2019). Frame-based elicitation of mid-air gestures for a smart home device ecosystem. Informatics, 6(2), 23. https://doi.org/10.3390/informatics6020023
  • Wachs, J. P., Kölsch, M., Stern, H., & Edan, Y. (2011). Vision-based hand-gesture applications. Communications of the ACM, 54(2), 60–71. https://doi.org/10.1145/1897816.1897838
  • Wittorf, M. L., & Jakobsen, M. R. (2016). Eliciting mid-air gestures for wall-display interaction. Proceedings of the 9th Nordic Conference on Human-Computer Interaction. https://doi.org/10.1145/2971485.2971503
  • Wobbrock, J. O., Morris, M. R., & Wilson, A. D. (2009). User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/1518701.1518866
  • Wu, H., Fu, S., Yang, L., & Zhang, X. L. (2020). Exploring frame-based gesture design for immersive VR shopping environments. Behaviour & Information Technology, 41(1), 96–117. https://doi.org/10.1080/0144929x.2020.1795261
  • Wu, H., Wang, Y., Liu, J., Qiu, J., & Zhang, X. L. (2019). User-defined gesture interaction for in-vehicle information systems. Multimedia Tools and Applications, 79(1–2), 263–288. https://doi.org/10.1007/s11042-019-08075-1
  • Wu, H., Wang, Y., Qiu, J., Liu, J., & Zhang, X. L. (2018). User-defined gesture interaction for immersive VR shopping applications. Behaviour & Information Technology, 38(7), 726–741. https://doi.org/10.1080/0144929x.2018.1552313
  • Yee, W. (2009). Potential limitations of multi-touch gesture vocabulary: Differentiation, adoption, fatigue. Human-Computer Interaction. Novel Interaction Methods and Techniques, 291–300. https://doi.org/10.1007/978-3-642-02577-8_32
  • Zaiţi, I. A., Pentiuc, Ş. G., & Vatavu, R. D. (2015). On free-hand TV control: Experimental results on user-elicited gestures with Leap Motion. Personal and Ubiquitous Computing, 19(5–6), 821–838. https://doi.org/10.1007/s00779-015-0863-y
  • Zerger, A., & Smith, D. I. (2003). Impediments to using GIS for real-time disaster decision support. Computers, Environment and Urban Systems, 27(2), 123–141. https://doi.org/10.1016/s0198-9715(01)00021-7
  • Zhang, N., Wang, W. X., Huang, S. Y., & Luo, R. M. (2022). Mid-air gestures for in-vehicle media player: Elicitation, segmentation, recognition, and eye-tracking testing. SN Applied Sciences, 4(4), 109. https://doi.org/10.1007/s42452-022-04992-3
