
User-Defined Foot Gestures for Eyes-Free Interaction in Smart Shower Rooms

Pages 4139–4161 | Received 07 Mar 2022, Accepted 29 Jul 2022, Published online: 18 Aug 2022

References

  • Alexander, J., Han, T., Judd, W., Irani, P., & Subramanian, S. (2012). Putting your best foot forward: Investigating real-world mappings for foot-based gestures [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, Texas, USA.
  • Ali, A., Morris, M. R., & Wobbrock, J. O. (2021). “I Am Iron Man”: Priming improves the learnability and memorability of user-elicited gestures [Paper presentation]. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan.
  • Austin, C. R., Ens, B., Satriadi, K. A., & Jenny, B. (2020). Elicitation study investigating hand and foot gesture interaction for immersive maps in augmented reality. Cartography and Geographic Information Science, 47(3), 214–228. https://doi.org/10.1080/15230406.2019.1696232.
  • Cafaro, F., Lyons, L., & Antle, A. N. (2018). Framed guessability: Improving the discoverability of gestures and body movements for full-body interaction [Paper presentation]. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, Montreal, QC, Canada.
  • Chen, Z., Ma, X., Peng, Z., Zhou, Y., Yao, M., Ma, Z., Wang, C., Gao, Z., & Shen, M. (2018). User-defined gestures for gestural interaction: Extending from hands to other body parts. International Journal of Human–Computer Interaction, 34(3), 238–250. https://doi.org/10.1080/10447318.2017.1342943.
  • Cohen, P., Swindells, C., Oviatt, S., & Arthur, A. (2008). A high-performance dual-wizard infrastructure for designing speech, pen, and multimodal interfaces [Paper presentation]. Proceedings of the 10th International Conference on Multimodal Interfaces, Chania, Crete, Greece.
  • Dong, H., Danesh, A., Figueroa, N., & Saddik, A. E. (2015). An elicitation study on gesture preferences and memorability toward a practical hand-gesture vocabulary for smart televisions. IEEE Access, 3, 543–555. https://doi.org/10.1109/ACCESS.2015.2432679.
  • Fan, M., Ding, Y., Shen, F., You, Y., & Yu, Z. (2017). An empirical study of foot gestures for hands-occupied mobile interaction [Paper presentation]. Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, Hawaii.
  • Felberbaum, Y., & Lanir, J. (2016). Step by step: Investigating foot gesture interaction [Paper presentation]. Proceedings of the International Working Conference on Advanced Visual Interfaces, Bari, Italy.
  • Felberbaum, Y., & Lanir, J. (2018). Better understanding of foot gestures: An elicitation study [Paper presentation]. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada.
  • Ferati, M., Babar, A., Carine, K., Hamidi, A., & Mörtberg, C. (2018). Participatory design approach to internet of things: Co-designing a smart shower for and with people with disabilities. In M. Antona & C. Stephanidis (Eds.), Universal access in human-computer interaction: Virtual, augmented, and intelligent environments. Springer.
  • Findlater, L., Wobbrock, J. O., & Wigdor, D. (2011). Typing on flat glass: Examining ten-finger expert typing patterns on touch surfaces [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
  • Fukahori, K., Sakamoto, D., & Igarashi, T. (2015). Exploring subtle foot plantar-based gestures with sock-placed pressure sensors [Paper presentation]. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea.
  • Funk, M., Schneegass, S., Behringer, M., Henze, N., & Schmidt, A. (2015). An interactive curtain for media usage in the shower [Paper presentation]. Proceedings of the 4th International Symposium on Pervasive Displays, Saarbruecken, Germany.
  • Gheran, B.-F., Vanderdonckt, J., & Vatavu, R.-D. (2018). Gestures for smart rings: Empirical results, insights, and design implications [Paper presentation]. Proceedings of the 2018 Designing Interactive Systems Conference, Hong Kong, China.
  • Hirai, S., Sakakibara, Y., & Hayakawa, S. (2012). Bathcratch: Touch and sound-based DJ controller implemented on a bathtub. In A. Nijholt, T. Romão, & D. Reidsma (Eds.), Advances in computer entertainment. Springer.
  • Hoshino, K., Koge, M., Hachisu, T., Kodama, R., & Kajimoto, H. (2015). Jorro Beat: Shower tactile stimulation device in the bathroom [Paper presentation]. Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, Seoul, Republic of Korea.
  • Höysniemi, J., Hämäläinen, P., Turkki, L., & Rouvi, T. (2005). Children's intuitive gestures in vision-based action games. Communications of the ACM, 48(1), 44–50. https://doi.org/10.1145/1039539.1039568.
  • Kane, S. K., Morris, M. R., Perkins, A. Z., Wigdor, D., Ladner, R. E., & Wobbrock, J. O. (2011). Access overlays: Improving non-visual access to large touch screens for blind users [Paper presentation]. Proceedings of the 24th annual ACM Symposium on User Interface Software and Technology, Santa Barbara, California, USA.
  • Kane, S. K., Morris, M. R., & Wobbrock, J. O. (2013). Touchplates: Low-cost tactile overlays for visually impaired touch screen users [Paper presentation]. Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, Bellevue, Washington.
  • Karam, M., & schraefel, m. c. (2005). A taxonomy of gestures in human computer interactions (Project report). University of Southampton. https://eprints.soton.ac.uk/261149/.
  • Kawakatsu, R., & Hirai, S. (2018, 19–23 March). Rubbinput: An interaction technique for wet environments utilizing squeak sounds caused by finger-rubbing [Paper presentation]. 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Athens, Greece.
  • Kim, T., Blum, J. R., Alirezaee, P., Arnold, A. G., Fortin, P. E., & Cooperstock, J. R. (2019). Usability of foot-based interaction techniques for mobile solutions. In S. Paiva (Ed.), Mobile solutions and their usefulness in everyday life (pp. 309–329). Springer International Publishing. https://doi.org/10.1007/978-3-319-93491-4_16.
  • Kim, W., & Xiong, S. (2021). User-defined walking-in-place gestures for VR locomotion. International Journal of Human-Computer Studies, 152, 102648. https://doi.org/10.1016/j.ijhcs.2021.102648.
  • Koike, H., Matoba, Y., & Takahashi, Y. (2013). AquaTop display: Interactive water surface for viewing and manipulating information in a bathroom [Paper presentation]. Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces, St. Andrews, Scotland, United Kingdom.
  • Lopes, D., Relvas, F., Paulo, S., Rekik, Y., Grisoni, L., & Jorge, J. (2019). FEETICHE: FEET Input for Contactless Hand gEsture Interaction [Paper presentation]. The 17th International Conference on Virtual-Reality Continuum and its Applications in Industry, Brisbane, QLD, Australia.
  • Maskeliūnas, R., Damaševičius, R., & Segal, S. (2019). A review of internet of things technologies for ambient assisted living environments. Future Internet, 11(12), 259. https://www.mdpi.com/1999-5903/11/12/259.
  • Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81–97. https://doi.org/10.1037/h0043158.
  • Morris, M. R., Danielescu, A., Drucker, S., Fisher, D., Lee, B., schraefel, m. c., & Wobbrock, J. O. (2014). Reducing legacy bias in gesture elicitation studies. Interactions, 21(3), 40–45. https://doi.org/10.1145/2591689.
  • Morris, M. R., Wobbrock, J. O., & Wilson, A. D. (2010). Understanding users' preferences for surface gestures [Paper presentation]. Proceedings of Graphics Interface 2010, Ottawa, Ontario, Canada.
  • Müller, F., McManus, J., Günther, S., Schmitz, M., Mühlhäuser, M., & Funk, M. (2019). Mind the tap: Assessing foot-taps for interacting with head-mounted displays [Paper presentation]. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK.
  • Nacenta, M. A., Kamber, Y., Qiang, Y., & Kristensson, P. O. (2013). Memorability of pre-designed and user-defined gesture sets [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France.
  • Nielsen, M., Störring, M., Moeslund, T. B., & Granum, E. (2004). A procedure for developing intuitive and ergonomic gesture interfaces for HCI. In A. Camurri & G. Volpe (Eds.), Gesture-based communication in human-computer interaction. Springer.
  • Pakkanen, T., & Raisamo, R. (2004). Appropriateness of foot interaction for non-accurate spatial tasks [Paper presentation]. CHI '04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria.
  • Pan, Y., & Steed, A. (2019). How foot tracking matters: The impact of an animated self-avatar on interaction, embodiment and presence in shared virtual environments. Frontiers in Robotics and AI, 6, 104. https://doi.org/10.3389/frobt.2019.00104.
  • Paulo, S. F., Relvas, F., Nicolau, H., Rekik, Y., Machado, V., Botelho, J., Mendes, J. J., Grisoni, L., Jorge, J., & Lopes, D. S. (2019). Touchless interaction with medical images based on 3D hand cursors supported by single-foot input: A case study in dentistry. Journal of Biomedical Informatics, 100, 103316. https://doi.org/10.1016/j.jbi.2019.103316.
  • Piumsomboon, T., Clark, A., Billinghurst, M., & Cockburn, A. (2013). User-defined gestures for augmented reality. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Human-computer interaction – INTERACT 2013. Springer.
  • Roaas, A., & Andersson, G. B. J. (1982). Normal range of motion of the hip, knee and ankle joints in male subjects, 30–40 years of age. Acta Orthopaedica Scandinavica, 53(2), 205–208. https://doi.org/10.3109/17453678208992202.
  • Ruiz, J., Li, Y., & Lank, E. (2011). User-defined motion gestures for mobile interaction [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
  • Saunders, W., & Vogel, D. (2016). Tap-Kick-Click: Foot interaction for a standing desk [Paper presentation]. Proceedings of the 2016 ACM Conference on Designing Interactive Systems, Brisbane, QLD, Australia.
  • Schlömer, I., Klein, B., & Roßberg, H. (2017). A robotic shower system – Evaluation of multimodal human-robot interaction for the elderly. Gesellschaft für Informatik e.V. https://doi.org/10.18420/MUC2017-WS17-0415.
  • Silpasuwanchai, C., & Ren, X. (2015). Designing concurrent full-body gestures for intense gameplay. International Journal of Human-Computer Studies, 80, 1–13. https://doi.org/10.1016/j.ijhcs.2015.02.010.
  • Sumida, T., Hirai, S., Ito, D., & Kawakatsu, R. (2017). RapTapBath: User interface system by tapping on a bathtub edge utilizing embedded acoustic sensors [Paper presentation]. Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces, Brighton, United Kingdom.
  • Takahashi, Y., Matoba, Y., & Koike, H. (2012). Fluid surface: Interactive water surface display for viewing information in a bathroom [Paper presentation]. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, Cambridge, Massachusetts, USA.
  • Tu, H., Huang, Q., Zhao, Y., & Gao, B. (2020). Effects of holding postures on user-defined touch gestures for tablet interaction. International Journal of Human-Computer Studies, 141, 102451. https://doi.org/10.1016/j.ijhcs.2020.102451.
  • Vatavu, R.-D. (2012). User-defined gestures for free-hand TV control [Paper presentation]. Proceedings of the 10th European Conference on Interactive TV and Video, Berlin, Germany.
  • Vatavu, R.-D., & Wobbrock, J. O. (2015). Formalizing agreement analysis for elicitation studies: New measures, significance test, and toolkit [Paper presentation]. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea.
  • Velloso, E., Schmidt, D., Alexander, J., Gellersen, H., & Bulling, A. (2015). The feet in human–computer interaction: A survey of foot-based interaction. ACM Computing Surveys, 48(2), 1–35. https://doi.org/10.1145/2816455.
  • Villarreal-Narvaez, S., Vanderdonckt, J., Vatavu, R.-D., & Wobbrock, J. O. (2020). A systematic review of gesture elicitation studies: What can we learn from 216 studies? [Paper presentation]. Proceedings of the 2020 ACM Designing Interactive Systems Conference, Association for Computing Machinery, Eindhoven, Netherlands, 855–872.
  • Vogiatzidakis, P., & Koutsabasis, P. (2019). Frame-based elicitation of mid-air gestures for a smart home device ecosystem. Informatics, 6(2), 23. https://www.mdpi.com/2227-9709/6/2/23.
  • Vuletic, T., Duffy, A., Hay, L., McTeague, C., Campbell, G., & Grealy, M. (2019). Systematic literature review of hand gestures used in human computer interaction interfaces. International Journal of Human-Computer Studies, 129, 74–94. https://doi.org/10.1016/j.ijhcs.2019.03.011.
  • Willich, J. V., Schmitz, M., Müller, F., Schmitt, D., & Mühlhäuser, M. (2020). Podoportation: Foot-based locomotion in virtual reality [Paper presentation]. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, Honolulu, HI, 1–14.
  • Wobbrock, J. O., Aung, H. H., Rothrock, B., & Myers, B. A. (2005). Maximizing the guessability of symbolic input [Paper presentation]. CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA.
  • Wobbrock, J. O., Morris, M. R., & Wilson, A. D. (2009). User-defined gestures for surface computing [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA.
  • Wu, H., Fu, S., Yang, L., & Zhang, X. (2022). Exploring frame-based gesture design for immersive VR shopping environments. Behaviour & Information Technology, 41(1), 96–117. https://doi.org/10.1080/0144929X.2020.1795261.
  • Wu, H., Huang, K., Deng, Y., & Tu, H. (2021). Exploring the design space of eyes-free target acquisition in virtual environments. Virtual Reality, 26, 513–524. https://doi.org/10.1007/s10055-021-00591-6.
  • Wu, H., & Wang, J. (2016). A visual attention-based method to address the Midas touch problem existing in gesture-based interaction. The Visual Computer, 32(1), 123–136. https://doi.org/10.1007/s00371-014-1060-0.
  • Wu, H., Wang, J., & Zhang, X. (2016). User-centered gesture development in TV viewing environment. Multimedia Tools and Applications, 75(2), 733–760. https://doi.org/10.1007/s11042-014-2323-5.
  • Wu, H., Zhang, S., Liu, J., Qiu, J., & Zhang, X. (2019). The gesture disagreement problem in free-hand gesture interaction. International Journal of Human–Computer Interaction, 35(12), 1102–1114. https://doi.org/10.1080/10447318.2018.1510607.
  • Yan, Y., Yu, C., Ma, X., Huang, S., Iqbal, H., & Shi, Y. (2018). Eyes-free target acquisition in interaction space around the body for virtual reality [Paper presentation]. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, Montreal, QC, Canada, Paper 42.
  • Zhang, T., Song, T., Chen, D., Zhang, T., & Zhuang, J. (2019). WiGrus: A WiFi-based gesture recognition system using software-defined radio. IEEE Access, 7, 131102–131113. https://doi.org/10.1109/ACCESS.2019.2940386.
  • Zhong, K., Tian, F., & Wang, H. (2011, 12–15 June). Foot menu: Using heel rotation information for menu selection [Paper presentation]. 2011 15th Annual International Symposium on Wearable Computers, San Francisco, CA.
