Grasping VR: Presence of Pseudo-Haptic Interface Based Portable Hand Grip System in Immersive Virtual Reality
