The Gesture Disagreement Problem in Free-hand Gesture Interaction
