On the Efficiency of a VR Hand Gesture-Based Interface for 3D Object Manipulations in Conceptual Design
