Research Articles

User-Defined Foot Gestures for Eyes-Free Interaction in Smart Shower Rooms

Zhanming Chen, Huawei Tu & Huiyue Wu
Pages 4139-4161 | Received 07 Mar 2022, Accepted 29 Jul 2022, Published online: 18 Aug 2022
 

Abstract

With the rapid development of natural human-computer interaction technologies, gesture-based interfaces have become popular. Although gesture interaction has received extensive attention from both academia and industry, most existing studies focus on hand gesture input, leaving foot-gesture-based interfaces underexplored, especially in scenarios where the user's hands are occupied by other tasks, such as washing the hair in a smart shower room. In such scenarios, users often have to perform interactive tasks (e.g., controlling water volume) with their eyes closed as water and shampoo run from the head down into the eyes. One possible way to address this problem is to use eyes-free (rather than eyes-engaged) foot-gesture-based interaction techniques that allow users to operate the smart shower system without visual involvement. In our online survey, 71.60% of participants (58/81) reported a need for foot-gesture-based eyes-free interaction during showers. To this end, we conducted a three-phase study exploring foot-gesture-based, eyes-free interaction in smart shower rooms. We first derived a set of user-defined foot gestures for eyes-free interaction in smart shower rooms. We then proposed a taxonomy for foot gesture interaction. Our findings indicate that end-users preferred single-foot (76.1%), atomic (73.3%), deictic (65.0%), and dynamic (76.1%) foot gestures, which differs markedly from the results reported in previous studies of user-defined hand gestures. In addition, most user-defined dynamic foot gestures involve atomic movements perpendicular to the ground (40.1%) or parallel to the ground (27.7%). Finally, we distilled a set of concrete guidelines for foot gesture interfaces based on observations of end-users' mental models and behaviors when interacting with foot gestures. Our research can inform the design and development of foot-gesture-based interaction techniques for applications such as smart homes, intelligent vehicles, VR games, and accessibility design.
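To make the reported taxonomy concrete, the following minimal Python sketch encodes its four dimensions (laterality, composition, semantic nature, and form) as a simple record type. All field names, category labels, and the example gesture are illustrative assumptions for this sketch, not the authors' implementation or their elicited gesture set.

```python
from dataclasses import dataclass

# Illustrative encoding of the four taxonomy dimensions reported in the
# abstract. Category labels beyond those the abstract names (e.g.,
# "both-feet", "compound", "static") are assumed complements, not quoted
# from the paper.

@dataclass(frozen=True)
class FootGesture:
    referent: str      # the command the gesture maps to
    laterality: str    # "single-foot" (76.1% preferred) vs. "both-feet"
    composition: str   # "atomic" (73.3%) vs. "compound"
    nature: str        # "deictic" (65.0%) vs. other semantic classes
    form: str          # "dynamic" (76.1%) vs. "static"

# Hypothetical example: a user-defined gesture for raising water volume,
# using an atomic movement perpendicular to the ground (the most common
# kind of dynamic movement reported, at 40.1%).
increase_volume = FootGesture(
    referent="increase water volume",
    laterality="single-foot",
    composition="atomic",
    nature="deictic",
    form="dynamic",
)
```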

Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their insightful comments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Natural Science Foundation of China under Grant No. 61772564, the Guangdong Basic and Applied Basic Research Foundation under Grant No. 2021A1515011990, and the Guangdong Key Laboratory for Big Data Analysis and Simulation of Public Opinion under Grant No. 2017B030301003.

Notes on contributors

Zhanming Chen

Zhanming Chen is a graduate student at the School of Communication and Design, Sun Yat-Sen University, China. His research interests include human-computer interaction, elicitation studies, and usability engineering. He received a bachelor's degree in marketing from Sun Yat-Sen University, Guangzhou, China, in 2019.

Huawei Tu

Huawei Tu is an Assistant Professor at La Trobe University, Australia. His research area is human-computer interaction, with special interests in multimodal interaction and user interface design. He has published more than 30 research papers, including papers in top-tier HCI journals (e.g., ACM TOCHI) and at conferences such as ACM CHI.

Huiyue Wu

Huiyue Wu is a Full Professor at Sun Yat-Sen University, Guangzhou, China, where he is also the director of the HCI Laboratory. He is the author of five books and more than 40 publications in the field of HCI (e.g., IJHCS, IJHCI). His research interests include human-computer interaction and virtual reality.
