
Multi-Finger-Based Arbitrary Region-of-Interest Selection in Virtual Reality

Pages 3969-3983 | Received 24 Dec 2021, Accepted 29 Jul 2022, Published online: 09 Aug 2022

References

  • Akers, D., Sherbondy, A., Mackenzie, R., Dougherty, R., & Wandell, B. (2004). Exploration of the brain’s white matter pathways with dynamic queries. In IEEE Visualization 2004 (pp. 377–384). https://doi.org/10.1109/VISUAL.2004.30
  • Argelaguet, F., & Andujar, C. (2009). Efficient 3D pointing selection in cluttered virtual environments. IEEE Computer Graphics and Applications, 29(6), 34–43.
  • Argelaguet, F., & Andujar, C. (2013). A survey of 3D object selection techniques for virtual environments. Computers & Graphics, 37(3), 121–136. https://doi.org/10.1016/j.cag.2012.12.003
  • Argelaguet, F., Andujar, C., & Trueba, R. (2008). Overcoming eye-hand visibility mismatch in 3D pointing selection. In Proceedings of the 2008 ACM Symposium on Virtual Reality Software and Technology (pp. 43–46). https://doi.org/10.1145/1450579.1450588
  • Bacim, F., Kopper, R., & Bowman, D. A. (2013). Design and evaluation of 3D selection techniques based on progressive refinement. International Journal of Human-Computer Studies, 71(7–8), 785–802. https://doi.org/10.1016/j.ijhcs.2013.03.003
  • Bacim, F., Nabiyouni, M., & Bowman, D. A. (2014). Slice-n-swipe: A free-hand gesture user interface for 3D point cloud annotation. In 2014 IEEE Symposium on 3D User Interfaces (pp. 185–186).
  • Besançon, L., Sereno, M., Yu, L., Ammi, M., & Isenberg, T. (2019). Hybrid touch/tangible spatial 3D data selection. Computer Graphics Forum, 38(3), 553–567. https://doi.org/10.1111/cgf.13710
  • Caligiana, P., Liverani, A., Ceruti, A., Santi, G. M., Donnici, G., & Osti, F. (2020). An interactive real-time cutting technique for 3D models in mixed reality. Technologies, 8(2), 23. https://doi.org/10.3390/technologies8020023
  • Cashion, J., Wingrave, C., & LaViola, J. J. Jr. (2012). Dense and dynamic 3D selection for game-based virtual environments. IEEE Transactions on Visualization and Computer Graphics, 18(4), 634–642. https://doi.org/10.1109/TVCG.2012.40
  • Chen, H.-L. J., Samavati, F. F., Sousa, M. C., & Mitchell, J. R. (2006). Sketch-based volumetric seeded region growing. In Eurographics Workshop on Sketch-Based Interfaces and Modeling (pp. 123–129).
  • Forsberg, A., Herndon, K., & Zeleznik, R. (1996). Aperture based selection for immersive virtual environments. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology (pp. 95–96). https://doi.org/10.1145/237091.237105
  • Enscape GmbH. (2015). Enscape. Retrieved from https://enscape3d.com/
  • Google (2016). Tilt brush. Retrieved from https://www.tiltbrush.com/
  • Gosset, S., Sereno, M., Besançon, L., & Isenberg, T. (2020). Tangible volumetric brushing in augmented reality. In IEEE VIS Posters.
  • Jackson, B., Beckham, K., Cohen, A. K., & Heggeseth, B. C. (2019). Comparing convex region-of-interest selection techniques for surface geometry. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1–5). https://doi.org/10.1145/3359996.3364258
  • Jackson, B., Jelke, B., & Brown, G. (2018). Yea big, yea high: A 3D user interface for surface selection by progressive refinement in virtual environments. In 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 320–326). https://doi.org/10.1109/VR.2018.8447559
  • Jackson, B., & Keefe, D. F. (2016). Lift-off: Using reference imagery and freehand sketching to create 3D models in VR. IEEE Transactions on Visualization and Computer Graphics, 22(4), 1442–1451. https://doi.org/10.1109/TVCG.2016.2518099
  • Jones, K. S., McIntyre, T. J., & Harris, D. J. (2020). Leap motion-and mouse-based target selection: Productivity, perceived comfort and fatigue, user preference, and perceived usability. International Journal of Human–Computer Interaction, 36(7), 621–630. https://doi.org/10.1080/10447318.2019.1666511
  • Kaleja, P., & Kozlovska, M. (2017). Virtual reality as innovative approach to the interior designing. Selected Scientific Papers-Journal of Civil Engineering, 12(1), 109–116. https://doi.org/10.1515/sspjce-2017-0011
  • Keefe, D., Zeleznik, R., & Laidlaw, D. (2007). Drawing on air: Input techniques for controlled 3D line illustration. IEEE Transactions on Visualization and Computer Graphics, 13(5), 1067–1081. https://doi.org/10.1109/TVCG.2007.1060
  • Keefe, D. F., Zeleznik, R. C., & Laidlaw, D. H. (2008). Tech-note: Dynamic dragging for input of 3D trajectories. In 2008 IEEE Symposium on 3D User Interfaces (pp. 51–54).
  • Kopper, R., Bacim, F., & Bowman, D. A. (2011). Rapid and accurate 3D selection by progressive refinement. In 2011 IEEE Symposium on 3D User Interfaces (pp. 67–74).
  • Liang, J., & Green, M. (1994). JDCAD: A highly interactive 3D modeling system. Computers & Graphics, 18(4), 499–506. https://doi.org/10.1016/0097-8493(94)90062-0
  • Lucas, J. F. (2005). Design and evaluation of 3D multiple object selection techniques (Unpublished doctoral dissertation). Virginia Tech.
  • Mine, M. R. (1995). Virtual environment interaction techniques. Technical report, Department of Computer Science, University of North Carolina at Chapel Hill.
  • Mine, M. R., Brooks, F. P., Jr., & Séquin, C. H. (1997). Moving objects in space: Exploiting proprioception in virtual-environment interaction. In Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques (pp. 19–26).
  • Montano-Murillo, R. A., Nguyen, C., Kazi, R. H., Subramanian, S., DiVerdi, S., & Martinez-Plasencia, D. (2020). Slicing-volume: Hybrid 3D/2D multi-target selection technique for dense virtual environments. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 53–62). https://doi.org/10.1109/VR46266.2020.00023
  • Mossel, A., & Koessler, C. (2016). Large scale cut plane: An occlusion management technique for immersive dense 3D reconstructions. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology (pp. 201–210).
  • Paes, D., Arantes, E., & Irizarry, J. (2017). Immersive environment for improving the understanding of architectural 3D models: Comparing user spatial perception between immersive and traditional virtual reality systems. Automation in Construction, 84, 292–303. https://doi.org/10.1016/j.autcon.2017.09.016
  • Ponchio, F., Callieri, M., Dellepiane, M., & Scopigno, R. (2020). Effective annotations over 3D models. Computer Graphics Forum, 39(1), 89–105. https://doi.org/10.1111/cgf.13664
  • Poupyrev, I., Billinghurst, M., Weghorst, S., & Ichikawa, T. (1996). The go-go interaction technique: Non-linear mapping for direct manipulation in VR. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology (pp. 79–80).
  • Poupyrev, I., Ichikawa, T., Weghorst, S., & Billinghurst, M. (1998). Egocentric object manipulation in virtual environments: empirical evaluation of interaction techniques. Computer Graphics Forum, 17(3), 41–52. https://doi.org/10.1111/1467-8659.00252
  • Ramani, K. (2015). Hand grasp and motion for intent expression in mid-air virtual pottery. In Proceedings of the 41st Graphics Interface Conference (pp. 49–57).
  • Ryu, K., Lee, J.-J., & Park, J.-M. (2019). GG interaction: A gaze–grasp pose interaction for 3D virtual object selection. Journal on Multimodal User Interfaces, 13(4), 383–393. https://doi.org/10.1007/s12193-019-00305-y
  • Sereno, M., Ammi, M., Isenberg, T., & Besançon, L. (2016). Tangible brush: Performing 3D selection with portable and position-aware devices. In IEEE VIS 2016.
  • Shan, G., Xie, M., Li, F., Gao, Y., & Chi, X. (2014). Interactive visual exploration of halos in large-scale cosmology simulation. Journal of Visualization, 17(3), 145–156. https://doi.org/10.1007/s12650-014-0206-5
  • Sherbondy, A., Akers, D., Mackenzie, R., Dougherty, R., & Wandell, B. (2005). Exploring connectivity of the brain’s white matter with dynamic queries. IEEE Transactions on Visualization and Computer Graphics, 11(4), 419–430. https://doi.org/10.1109/TVCG.2005.59
  • Gravity Sketch (2006). Gravity sketch. Retrieved from https://www.gravitysketch.com
  • Song, P., Goh, W. B., Hutama, W., Fu, C.-W., & Liu, X. (2012). A handle bar metaphor for virtual object manipulation with mid-air interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1297–1306). https://doi.org/10.1145/2207676.2208585
  • Steed, A., & Parker, C. (2004). 3D selection strategies for head tracked and non-head tracked operation of spatially immersive displays. In 8th International Immersive Projection Technology Workshop (Vol. 2).
  • Tukey, J. W. (1977). Exploratory data analysis (Vol. 2). Addison-Wesley Publishing Company.
  • Ulinski, A., Zanbaka, C., Wartell, Z., Goolkasian, P., & Hodges, L. F. (2007). Two handed selection techniques for volumetric data. In 2007 IEEE Symposium on 3D User Interfaces. https://doi.org/10.1109/3DUI.2007.340782
  • Wilches, D., & Banic, A. (2016). VolSelectAware: Visual-cognition coupled non-visual data-driven volume selection for immersive scientific visualizations. In Proceedings of the 9th International Symposium on Visual Information Communication and Interaction (pp. 148–149).
  • Yu, L., Efstathiou, K., Isenberg, P., & Isenberg, T. (2012). Efficient structure-aware selection techniques for 3D point cloud visualizations with 2DOF input. IEEE Transactions on Visualization and Computer Graphics, 18(12), 2245–2254. https://doi.org/10.1109/TVCG.2012.217
  • Yuan, X., Zhang, N., Nguyen, M. X., & Chen, B. (2005). Volume cutout. The Visual Computer, 21(8–10), 745–754. https://doi.org/10.1007/s00371-005-0330-2
  • Zhang, Q., Ban, J.-S., Kim, M., Byun, H. W., & Kim, C.-H. (2021). Low-asymmetry interface for multiuser VR experiences with both HMD and Non-HMD users. Sensors, 21(2), 397. https://doi.org/10.3390/s21020397
  • Zhou, W., Correia, S., & Laidlaw, D. H. (2008). Haptics-assisted 3D lasso drawing for tracts-of-interest selection in DTI visualization. In IEEE Visualization.
