Research Article

Enhancing Fixation and Pursuit: Optimizing Field of View and Number of Targets for Selection Performance in Virtual Reality

Received 24 Aug 2023, Accepted 29 Jan 2024, Published online: 15 Feb 2024

