Research Articles

Touchless Interfaces in the Operating Room: A Study in Gesture Preferences

Naveen Madapana, Daniela Chanci, Glebys Gonzalez, Lingsong Zhang & Juan P. Wachs
Pages 438-448 | Received 22 Feb 2021, Accepted 10 Feb 2022, Published online: 18 Apr 2022

Abstract

Touchless interfaces allow surgeons to control medical imaging systems without an intermediary while maintaining total asepsis in the operating room. This is especially relevant in light of the recent COVID-19 outbreak. The choice of the best gestures/commands for such interfaces is a critical step that determines the overall efficiency of surgeon-computer interaction. Usability metrics such as task completion time, memorability, and error rate have long been regarded as criteria for determining the best gestures. In addition, previous works on this problem have used qualitative measures to identify the best gestures. In this work, we hypothesize that there is a correlation between gestures’ qualitative properties and their usability metrics. We first conducted a user experiment with language experts to quantify gestures’ properties (v). Next, we developed a gesture-based system that allows surgeons to control medical imaging software in a touchless manner. Then, a usability study was conducted with neurosurgeons, and the standard usability metrics (u) were measured in a systematic manner. Lastly, multivariate correlation analysis was used to find the relations between u and v. Statistical analysis showed that the v scores were significantly correlated with the usability metrics, with an R² of 0.40 and p < 0.05. Once the correlation is established, either gestures’ qualitative properties or usability metrics can be used to identify the best set of gestures.
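For readers who want a concrete picture of the analysis the abstract describes, the following Python sketch shows one way a multivariate correlation analysis between expert-rated gesture properties (v) and a usability metric (u) could be carried out, reporting an R² and per-coefficient p-values. The property count, sample size, coefficients, and synthetic data below are illustrative assumptions, not the study’s actual data or model.

```python
# Minimal sketch of a multivariate correlation analysis between
# gesture property scores (v) and a usability metric (u).
# All names and the synthetic data are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

n_gestures = 30  # hypothetical number of candidate gestures
# v: qualitative property scores from language experts
# (e.g., three properties rated on a 1-7 scale per gesture)
v = rng.uniform(1, 7, size=(n_gestures, 3))
# u: one usability metric per gesture (e.g., task completion time in
# seconds), synthesized here so the example runs end to end
u = 2.0 + v @ np.array([-0.3, -0.1, 0.2]) + rng.normal(0, 0.5, n_gestures)

# Regress the usability metric on the property scores and report fit quality.
X = sm.add_constant(v)  # add an intercept term
model = sm.OLS(u, X).fit()
print(f"R^2 = {model.rsquared:.2f}")
print("coefficient p-values:", np.round(model.pvalues, 4))
```

An ordinary least squares fit is only one reasonable choice here; the same u-versus-v relationship could be probed with canonical correlation or rank-based methods depending on how the metrics are distributed.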

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Institutes of Health (NIH) through the Agency for Healthcare Research and Quality (AHRQ) under [Award No. R18HS024887]. Opinions, interpretations, conclusions, and recommendations are those of the authors and are not necessarily endorsed by the Agency for Healthcare Research and Quality.

Notes on contributors

Naveen Madapana

Naveen Madapana is currently working at Amazon Research. Naveen’s work focuses on computer vision, HCI, and gesture recognition.

Daniela Chanci

Daniela Chanci is a Ph.D. student at the Emory School of Bioengineering. Her work focuses on pattern recognition in medical imaging and human-computer interaction for medical applications.

Glebys Gonzalez

Glebys Gonzalez is a Ph.D. candidate at Purdue University, School of Industrial Engineering. Her work focuses on human-robot teaching and human-robot interaction applied to healthcare.

Lingsong Zhang

Lingsong Zhang is an Associate Professor in the Department of Statistics at Purdue University. Dr. Zhang’s research focuses on bioinformatics and biologically related disciplines (genomics, nutrition, proteomics, statistical genetics) and computational methods for statistical inference.

Juan P. Wachs

Juan Wachs is an Associate Professor in the School of Industrial Engineering at Purdue University, where he holds the James A. and Sharon M. Tompkins Rising Star Professorship and is affiliated with the Regenstrief Center for Healthcare Engineering. He is also an Adjunct Professor of Surgery at the IU School of Medicine and a Professor of Biomedical Engineering (by courtesy). Dr. Wachs’s work focuses on robotics, HCI, and data science, specifically applied to healthcare.
