Original Articles

Effects of User’s Hand Orientation and Spatial Movements on Free Hand Interactions with Large Displays

Xiaolong Lou, Ren Peng, Preben Hansen & Xiangdong A. Li

ABSTRACT

In motion-sensing interaction with large displays through bare hands, users can be observed alternating their hands and moving their bodies frequently. What causes such actions, and how these actions affect free hand interaction results, has been less systematically investigated. To address this gap in knowledge, we conducted studies on Pointer-Acceleration (PA)-based free hand target selection and found that (1) users alternated hands more frequently when selecting small targets with large movement amplitudes; in such cases users were not only affected by observable arm fatigue, but were also motivated to switch hands for higher selection accuracy and convenience; (2) hand alternation led to hand orientation effects: target selection in display areas on the operating hand’s side was more efficient and accurate than selection on the opposite side; (3) large movement amplitudes on the user interface increased users’ physical movements in front of the large display, which harmed selection efficiency; (4) selection of small targets led to a closer interaction distance, while large movement amplitudes led to a larger interaction distance; and (5) selection results were affected by interaction distance: users gained high efficiency but low accuracy at a large distance, and low efficiency but high accuracy at a close distance. Given these results, this article discusses practical implications for applying PA-based free hand interaction techniques and for designing related user interfaces on large displays.
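The Pointer-Acceleration (PA) technique referenced above maps hand speed to a variable cursor gain, so that slow hand motion yields fine control for small targets while fast motion covers large amplitudes. The article does not specify the transfer function used in the study; the sketch below is a hypothetical minimal version with illustrative constants (`g_min`, `g_max`, `v_low`, `v_high` are assumptions, not the study’s parameters):

```python
def pa_gain(speed, g_min=1.0, g_max=8.0, v_low=0.05, v_high=1.0):
    """Map hand speed (m/s) to a cursor gain between g_min and g_max.

    Below v_low the gain is minimal (fine control for small targets);
    above v_high it is maximal (fast traversal of large amplitudes);
    in between it is linearly interpolated. All thresholds here are
    illustrative placeholders, not values from the study.
    """
    if speed <= v_low:
        return g_min
    if speed >= v_high:
        return g_max
    t = (speed - v_low) / (v_high - v_low)  # position within the ramp
    return g_min + t * (g_max - g_min)


def cursor_displacement(hand_dx, hand_dy, dt):
    """Scale a hand displacement (metres over dt seconds) into a
    display-space cursor displacement using the PA gain."""
    speed = (hand_dx ** 2 + hand_dy ** 2) ** 0.5 / dt
    g = pa_gain(speed)
    return g * hand_dx, g * hand_dy
```

Such a velocity-dependent gain is one plausible reason users move closer for small targets (slow, low-gain motion benefits from a shorter viewing distance) and stand farther back for large amplitudes, consistent with findings (4) and (5) above.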

Funding

The research was supported by the Special project on “Cloud-based Natural Interaction Devices and Tools” under the National Key Research and Development Program by Ministry of Science and Technology of the People’s Republic of China (2016YFB1001304), the Zhejiang Provincial Natural Science Funding (LQ15F020002) by Natural Science Foundation of Zhejiang Province, and the project on China Knowledge Centre for Engineering Science and Technology (CKCEST-2017-1-13).

Additional information

Notes on contributors

Xiaolong Lou

Xiaolong Lou is a PhD student in the Department of Digital Media at Zhejiang University, China. His research interests include large display ergonomics, usability and user experience in motion-sensing interaction.

Ren Peng

Ren Peng is an associate professor in the Department of Digital Media at Zhejiang University, China. He is the director of the Department of Digital Media. His research interests are multimedia design, industrial design and user-centered design.

Preben Hansen

Preben Hansen is an associate professor in the Department of Computer and Systems Sciences (DSV) at Stockholm University, Sweden. He is the head of the design and collaborative technologies research group. His research interests include interaction design, HCI, user-centered design methodologies and collaborative information searching.

Xiangdong A. Li

Xiangdong A. Li is a lecturer in the Department of Digital Media at Zhejiang University, China. His research interests include interaction design and evaluation methodology, collaborative interaction, creative multimedia design and interaction.
