Research Article

Aladdin’s magic carpet: Navigation by in-air static hand gesture in autonomous vehicles


References

  • Angelini, L., Carrino, F., Carrino, S., Caon, M., Khaled, O. A., Baumgartner, J., Sonderegger, A., Lalanne, D., & Mugellini, E. (2014). Gesturing on the steering wheel: A user-elicited taxonomy. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 1–8). New York, NY: ACM.
  • Archer, D. (1997). Unspoken diversity: Cultural differences in gestures. Qualitative Sociology, 20(1), 79–105. https://doi.org/10.1023/A:1024716331692
  • Austin, E. E., & Sweller, N. (2014, June). Presentation and production: The role of gesture in spatial communication. Journal of Experimental Child Psychology, 122, 92–103. https://doi.org/10.1016/j.jecp.2013.12.008
  • Austin, E. E., & Sweller, N. (2017). Getting to the elephants: Gesture and preschoolers’ comprehension of route direction information. Journal of Experimental Child Psychology, 163, 1–14. https://doi.org/10.1016/j.jecp.2017.05.016
  • Azevedo, T. M., Volchan, E., Imbiriba, L. A., Rodrigues, E. C., Oliveira, J. M., Oliveira, L. F., Lutterbach, L. G., & Vargas, C. D. (2005). A freezing-like posture to pictures of mutilation. Psychophysiology, 42(3), 255–260. https://doi.org/10.1111/j.1469-8986.2005.00287.x
  • Bau, O., & Mackay, W. E. (2008). OctoPocus: A dynamic guide for learning gesture-based command sets. In Proceedings of the 21st annual ACM Symposium on User Interface Software and Technology (pp. 37–46). New York, NY: ACM.
  • Bhuiyan, M., & Picking, R. (2011). A gesture controlled user interface for inclusive design and evaluative study of its usability. Journal of Software Engineering and Applications, 4(9), 513. https://doi.org/10.4236/jsea.2011.49059
  • Bonin-Font, F., Ortiz, A., & Oliver, G. (2008). Visual navigation for mobile robots: A survey. Journal of Intelligent and Robotic Systems, 53(3), 263–296. https://doi.org/10.1007/s10846-008-9235-4
  • Bragdon, A., Nelson, E., Li, Y., & Hinckley, K. (2011). Experimental analysis of touch-screen gesture designs in mobile environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 403–412). New York, NY: ACM.
  • Burnett, G., Crundall, E., Large, D., Lawson, G., & Skrypchuk, L. (2013). A study of unidirectional swipe gestures on in-vehicle touch screens. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 22–29). New York, NY: ACM.
  • Chandarana, M., Meszaros, E. L., Trujillo, A., & Allen, B. D. (2017). ‘Fly like this’: Natural language interface for UAV mission planning. In International Conference on Advances in Computer-Human Interactions (ACHI). NASA STI.
  • Chung, C., & Rantanen, E. (2010). Gestural interaction with in-vehicle audio and climate controls. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 54(19), 1406–1410. https://doi.org/10.1177/154193121005401911
  • Dawes, J. (2008). Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales. International Journal of Market Research, 50(1), 61–104. https://doi.org/10.1177/147078530805000106
  • Demiris, G., Oliver, D. P., Giger, J., Skubic, M., & Rantz, M. (2009). Older adults’ privacy considerations for vision based recognition methods of eldercare applications. Technology and Health Care, 17(1), 41–48. https://doi.org/10.3233/THC-2009-0530
  • Deo, N., Rangesh, A., & Trivedi, M. (2016). In-vehicle hand gesture recognition using hidden Markov models. In 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC) (pp. 2179–2184). Rio de Janeiro, Brazil: IEEE.
  • Dovidio, J. F., Brown, C. E., Heltman, K., Ellyson, S. L., & Keating, C. F. (1988). Power displays between women and men in discussions of gender-linked tasks: A multichannel study. Journal of Personality and Social Psychology, 55(4), 580. https://doi.org/10.1037/0022-3514.55.4.580
  • Feng, J., & Donmez, B. (2013). Designing feedback to induce safer driving behaviors: A literature review and a model of driver-feedback interaction. Technical report submitted to Toyota Collaborative Safety Research Center (CSRC). Human Factors and Applied Statistics Laboratory, University of Toronto. Retrieved from https://hfast.mie.utoronto.ca/wp-content/uploads/Publications/CSRC_UofT_Report_Literature_review_and_driver_feedback_model.pdf
  • Fong, T. W., Conti, F., Grange, S., & Baur, C. (2001). Novel interfaces for remote driving: Gesture, haptic, and PDA. In Mobile Robots XV and Telemanipulator and Telepresence Technologies VII (Vol. 4195, pp. 300–311). International Society for Optics and Photonics. https://doi.org/10.1117/12.417314
  • Forster, Y., Hergeth, S., Naujoks, F., & Krems, J. F. (2018). How usability can save the day: Methodological considerations for making automated driving a success story. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 278–290). New York, NY: ACM.
  • Goldin-Meadow, S. (1999). The role of gesture in communication and thinking. Trends in Cognitive Sciences, 3(11), 419–429. https://doi.org/10.1016/S1364-6613(99)01397-2
  • González, I. E., Wobbrock, J. O., Chau, D. H., Faulring, A., & Myers, B. A. (2007). Eyes on the road, hands on the wheel: Thumb-based interaction techniques for input on steering wheels. In Proceedings of Graphics Interface 2007 (pp. 95–102). New York, NY: ACM.
  • Harré, N., Field, J., & Kirkwood, B. (1996). Gender differences and areas of common concern in the driving behaviors and attitudes of adolescents. Journal of Safety Research, 27(3), 163–173. https://doi.org/10.1016/0022-4375(96)00013-8
  • Hegarty, M., Montello, D. R., Richardson, A. E., Ishikawa, T., & Lovelace, K. (2006). Spatial abilities at different scales: Individual differences in aptitude-test performance and spatial-layout learning. Intelligence, 34(2), 151–176. https://doi.org/10.1016/j.intell.2005.09.005
  • Holler, J., & Stevens, R. (2007). The effect of common ground on how speakers use gesture and speech to represent size information. Journal of Language and Social Psychology, 26(1), 4–27. https://doi.org/10.1177/0261927X06296428
  • Holler, J., & Wilkin, K. (2009). Communicating common ground: How mutually shared knowledge influences speech and gesture in a narrative task. Language and Cognitive Processes, 24(2), 267–289. https://doi.org/10.1080/01690960802095545
  • Hou, J., Liu, Y., Zheng, T. F., Olsen, J., & Tian, J. (2010). Multi-layered features with SVM for Chinese accent identification. In 2010 International Conference on Audio, Language and Image Processing (pp. 25–30). Shanghai, China: IEEE.
  • Jæger, M. G., Skov, M. B., & Thomassen, N. G. (2008). You can touch, but you can’t look: Interacting with in-vehicle systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1139–1148). New York, NY: ACM.
  • Jahani, H., Alyamani, H. J., Kavakli, M., Dey, A., & Billinghurst, M. (2017). User evaluation of hand gestures for designing an intelligent in-vehicle interface. In International Conference on Design Science Research in Information System and Technology (pp. 104–121). Springer, Cham.
  • Jaschinski, L., Denjean, S., Petiot, J. F., Mars, F., & Roussarie, V. (2016, October). Impact of interface sonification with touchless gesture command in a car. Human Factors and Ergonomics Society Europe Chapter 2016 Annual Conference, Prague, Czech Republic, pp. 35–46.
  • Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge University Press.
  • Kiefer, P., Giannopoulos, I., Anagnostopoulos, V. A., Schöning, J., & Raubal, M. (2017). Controllability matters: The user experience of adaptive maps. GeoInformatica, 21(3), 619–641. https://doi.org/10.1007/s10707-016-0282-x
  • Kokubo, H., Amano, A., & Hataoka, N. (2002). Robust speech recognition for car environment noise. Electronics and Communications in Japan (Part III: Fundamental Electronic Science), 85(11), 65–73. https://doi.org/10.1002/ecjc.10055
  • Lee, S. H., Yoon, S. O., & Shin, J. H. (2015). On-wheel finger gesture control for in-vehicle systems on central consoles. In Adjunct Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 94–99). New York, NY: ACM.
  • Li, G., Zhang, R., Ritchie, M., & Griffiths, H. (2017). Sparsity-based dynamic hand gesture recognition using micro-Doppler signatures. In 2017 IEEE Radar Conference (RadarConf) (pp. 0928–0931). Seattle, WA: IEEE. https://doi.org/10.1109/RADAR.2017.7944336
  • Liben, L. S., Kastens, K. A., & Stevenson, L. M. (2002). Real-world knowledge through real-world maps: A developmental guide for navigating the educational terrain. Developmental Review, 22(2), 267–322. https://doi.org/10.1006/drev.2002.0545
  • Ma, N., Green, P., Barker, J., & Coy, A. (2007). Exploiting correlogram structure for robust speech recognition with multiple speech sources. Speech Communication, 49(12), 874–891. https://doi.org/10.1016/j.specom.2007.05.003
  • MacEachren, A. M. (2015). Distributed cognition: A conceptual framework for understanding map-based reasoning. In Proceedings of the 27th International Cartographic Conference. Rio de Janeiro, Brazil: International Cartographic Association.
  • MacKenzie, I. S. (1992). Fitts’ law as a research and design tool in human-computer interaction. Human-Computer Interaction, 7(1), 91–139. https://doi.org/10.1207/s15327051hci0701_3
  • Mashood, A., Noura, H., Jawhar, I., & Mohamed, N. (2015). A gesture-based Kinect for quadrotor control. In 2015 International Conference on Information and Communication Technology Research (ICTRC) (pp. 298–301). Abu Dhabi: IEEE. https://doi.org/10.1109/ICTRC.2015.7156481
  • McNeill, D. (1992). Hand and mind: What gestures reveal about thought. University of Chicago Press.
  • Millen, D. R. (2000). Rapid ethnography: Time deepening strategies for HCI field research. In Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques (pp. 280–286). New York, NY: ACM.
  • Norman, D. (2013). The design of everyday things: Revised and expanded edition. Basic Books.
  • Ohn-Bar, E., & Trivedi, M. M. (2014). Hand gesture recognition in real time for automotive interfaces: A multimodal vision-based approach and evaluations. IEEE Transactions on Intelligent Transportation Systems, 15(6), 2368–2377. https://doi.org/10.1109/TITS.2014.2337331
  • Pfleging, B., Schneegass, S., & Schmidt, A. (2012). Multimodal interaction in the car: Combining speech and gestures on the steering wheel. In Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 155–162). New York, NY: ACM.
  • Pickering, C. A., Burnham, K. J., & Richardson, M. J. (2007). A review of automotive human machine interface technologies and techniques to reduce driver distraction. In 2nd Institution of Engineering and Technology International Conference on System Safety (pp. 223–228). London, UK: IET. https://doi.org/10.1049/cp:20070468
  • Plouffe, G., Cretu, A. M., & Payeur, P. (2015). Natural human-computer interaction using static and dynamic hand gestures. In 2015 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) (pp. 1–6). Ottawa, Canada: IEEE. https://doi.org/10.1109/HAVE.2015.7359473
  • Reich, D., Weber, M., Terken, J., Riener, A., Schroeter, R., & Osswald, S. (2013). Comparison of different touchless gesture interactions in the car cockpit. In Adjunct Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 119–120). Eindhoven, The Netherlands: ACM.
  • Reifinger, S., Wallhoff, F., Ablassmeier, M., Poitschke, T., & Rigoll, G. (2007). Static and dynamic hand-gesture recognition for augmented reality applications. In International Conference on Human-Computer Interaction (pp. 728–737). Springer, Berlin, Heidelberg.
  • Riener, A., Ferscha, A., Bachmair, F., Hagmüller, P., Lemme, A., Muttenthaler, D., … Weger, F. (2013). Standardization of the in-car gesture interaction space. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 14–21). New York, NY: ACM.
  • Rümelin, S., Marouane, C., & Butz, A. (2013). Free-hand pointing for identification and interaction with distant objects. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 40–47). New York, NY: ACM.
  • Scoditti, A. (2011). A novel taxonomy for gestural interaction techniques: Considerations for automotive environments. In 3rd International Workshop on Multimodal Interfaces for Automotive Applications (MIAA). Palo Alto, CA.
  • Singha, J., Roy, A., & Laskar, R. H. (2018). Dynamic hand gesture recognition using vision-based approach for human–computer interaction. Neural Computing & Applications, 29(4), 1129–1141. https://doi.org/10.1007/s00521-016-2525-z
  • Stern, H. I., Wachs, J. P., & Edan, Y. (2008). Designing hand gesture vocabularies for natural interaction by combining psycho-physiological and recognition factors. International Journal of Semantic Computing, 2(1), 137–160. https://doi.org/10.1142/S1793351X08000385
  • Sun, X., & Li, T. (2014). Survey of studies supporting the use of in-air gesture in car HMI. UXPA (China). Wuxi, Jiangsu, China.
  • Terada, T., Miyamae, M., Kishino, Y., Tanaka, K., Nishio, S., Nakagawa, T., & Yamaguchi, Y. (2006). Design of a car navigation system that predicts user destination. In 7th International Conference on Mobile Data Management (MDM’06) (p. 145). Nara, Japan: IEEE.
  • Wigdor, D., & Wixon, D. (2011). Brave NUI world: Designing natural user interfaces for touch and gesture. Elsevier.
  • Wolf, K., Naumann, A., Rohs, M., & Müller, J. (2011). A taxonomy of microinteractions: Defining microgestures based on ergonomic and scenario-dependent requirements. In IFIP Conference on Human-Computer Interaction (pp. 559–575). Springer, Berlin, Heidelberg.
  • Wu, H., Zhang, S., Liu, J., Qiu, J., & Zhang, X. (2019). The gesture disagreement problem in free-hand gesture interaction. International Journal of Human–Computer Interaction, 35(12), 1102–1114. https://doi.org/10.1080/10447318.2018.1510607
  • Wu, S., Gable, T., May, K., Choi, Y. M., & Walker, B. N. (2016). Comparison of surface gestures and air gestures for in-vehicle menu navigation. Archives of Design Research, 29(4), 65–80. https://doi.org/10.15187/adr.2016.11.29.4.65
  • Yan, Q., & Vaseghi, S. (2002). A comparative analysis of UK and US English accents in recognition and synthesis. In 2002 IEEE International Conference on Acoustics, Speech, and Signal Processing (Vol. 1, pp. I–413). Orlando, FL: IEEE.
  • Yin, Y., & Davis, R. (2014). Real-time continuous gesture recognition for natural human-computer interaction. In 2014 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC) (pp. 113–120). Melbourne, VIC, Australia: IEEE.
  • Zeng, J., Sun, Y., & Wang, F. (2012, November). A natural hand gesture system for intelligent human-computer interaction and medical assistance. In 2012 Third Global Congress on Intelligent Systems (pp. 382–385). Wuhan, China: IEEE.
  • Zhuang, H., Yang, M., Cui, Z., & Zheng, Q. (2017). A method for static hand gesture recognition based on non-negative matrix factorization and compressive sensing. IAENG International Journal of Computer Science, 44(1), 52–59. http://www.iaeng.org/IJCS/issues_v44/issue_1/IJCS_44_1_07.pdf
