
On the Usability and Effectiveness of Different Interaction Types in Augmented Reality

Dragoş Datcu, Stephan Lukosch & Frances Brazier

Abstract

One of the key challenges of augmented reality (AR) interfaces is to design effective hand-based interaction supported by computer vision. Hand-based interaction requires free-hand tracking to support user interaction in AR, for which this article presents a novel approach. This approach makes it possible to compare different types of hand-based interaction in AR for navigation using a spatial user interface. Quantitative and qualitative analyses of a study with 25 subjects indicate that tangible interaction is the preferred type of interaction for determining the position of the user interface in AR and for physically pointing to a preferred navigation option.

Notes

1 No additional hardware, such as data gloves, was used. Voice-based interfaces were also not considered because of their limited performance in noisy environments.

2 Initially, the fist posture detector was mapped to the nondominant hand to control the location of the menu, and the palm posture detector was mapped to the dominant hand to control the position of the on-screen cursor. Feedback from testers in the preexperiment session showed that this setup was highly uncomfortable and tiresome. This was partly caused by the small field of view, which limited the working volume for the hands and required them to be almost fully stretched to interact properly with the AR system.
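To illustrate the initial mapping described in note 2, the following minimal Python sketch routes posture detections to the menu and the on-screen cursor. The detection record and the menu/cursor objects with a move_to method are hypothetical placeholders, not the system's actual API.

```python
# Hypothetical sketch of the initial posture-to-control mapping (note 2):
# a fist on the nondominant hand moves the menu, a palm on the dominant
# hand moves the cursor. Names and types are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class HandDetection:
    posture: str   # "fist" or "palm"
    hand: str      # "dominant" or "nondominant"
    x: float       # normalized image coordinates
    y: float


def apply_mapping(detection: HandDetection, menu, cursor) -> None:
    """Route a single posture detection to the UI element it controls."""
    if detection.posture == "fist" and detection.hand == "nondominant":
        menu.move_to(detection.x, detection.y)      # menu follows the fist
    elif detection.posture == "palm" and detection.hand == "dominant":
        cursor.move_to(detection.x, detection.y)    # cursor follows the palm
```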

3 This resembles real-life situations for which users already know the structure of a menu.

4 Due to time restrictions, the other subjects could not fill in the questionnaire.

5 This distinction makes it possible to derive statistics on the number of inputs that fall into each group, pointing to indicators of low, medium, and high rankings or scores. Because some data sets (about 15%) for the different interaction types show non-normal distributions (according to the Anderson–Darling test), the Mann–Whitney U test is used for comparing populations prior to reporting findings.
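To illustrate the test-selection procedure in note 5, the following minimal Python sketch checks each sample with the Anderson–Darling test and falls back to the non-parametric Mann–Whitney U test when normality is rejected. The sample data and score scale are invented for illustration and are not the study's data.

```python
# Sketch of note 5's procedure: Anderson-Darling normality check, then
# Mann-Whitney U for non-normal samples. Data below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical usability scores for two interaction types (25 subjects each).
tangible = rng.normal(loc=4.0, scale=0.5, size=25)
free_hand = rng.normal(loc=3.2, scale=0.9, size=25)


def looks_normal(sample: np.ndarray) -> bool:
    # Anderson-Darling test; critical_values[2] is the 5% significance level.
    result = stats.anderson(sample, dist="norm")
    return result.statistic < result.critical_values[2]


if looks_normal(tangible) and looks_normal(free_hand):
    # Both samples pass the normality check: a parametric test is acceptable.
    stat, p = stats.ttest_ind(tangible, free_hand)
    test_name = "independent t-test"
else:
    # Non-normal data: compare populations with the Mann-Whitney U test.
    stat, p = stats.mannwhitneyu(tangible, free_hand, alternative="two-sided")
    test_name = "Mann-Whitney U"

print(f"{test_name}: statistic = {stat:.2f}, p = {p:.4f}")
```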

Additional information

Notes on contributors

Dragoş Datcu

Dragoş Datcu is a postdoctoral researcher at the Delft University of Technology. His current research is on complex problem solving in new interaction spaces such as augmented reality. This includes the design of novel augmented reality-driven solutions that enable collaboration among distributed users, also through the use of affective computing technology.

Stephan Lukosch

Stephan Lukosch is an associate professor at the Delft University of Technology. His current research focuses on presence, interaction, and coordination issues around virtual colocation using augmented reality. His research builds upon his recent results on context-adaptive collaboration, collaborative storytelling, and design patterns for computer-mediated interaction.

Frances Brazier

Frances Brazier is a full professor at the Delft University of Technology. Her research focuses on the design of complex, emergent socio-technical participatory systems in merging networked realities. The design of witnessed presence, agency and awareness, self-management, and distributed coordination are core challenges within her research and the Participatory Systems Initiative.
