
Effects of auditory, haptic and visual feedback on performing gestures by gaze or by hand

Pages 1044–1062 | Received 02 Jan 2015, Accepted 20 May 2016, Published online: 16 Jun 2016

References

  • Alvarez-Santos, V., R. Iglesias, X. M. Pardo, C. V. Regueiro, and A. Canedo-Rodriguez. 2014. “Gesture-based Interaction with Voice Feedback for a Tour-guide Robot.” Journal of Visual Communication and Image Representation 25 (2): 499–509. doi: 10.1016/j.jvcir.2013.03.017
  • Atia, A., S. Takahashi, K. Misue, and J. Tanaka. 2009. “UbiGesture: Customizing and Profiling Hand Gestures in Ubiquitous Environment.” In Proceedings of the 13th International Conference on Human–Computer Interaction. Part II: Novel Interaction Methods and Techniques, 141–150. Berlin: Springer-Verlag
  • Baudel, T., and M. Beaudouin-Lafon. 1993. “Charade: Remote Control of Objects Using Free-hand Gestures.” Communications of the ACM 36 (7): 28–35.
  • Bhuiyan, M., and R. Picking. 2009. “Gesture-controlled User Interfaces, What Have We Done and What’s Next.” In 5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking
  • Bolt, R. A. 1980. “‘Put-that-there’: Voice and Gesture at the Graphics Interface.” In Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH ‘80), Vol. 14, 262–270. New York, NY: ACM
  • Bulling, A., and H. Gellersen. 2010. “Toward Mobile Eye-based Human–Computer Interaction.” IEEE Pervasive Computing 9 (4): 8–12. doi: 10.1109/MPRV.2010.86
  • Bulling, A., D. Roggen, and G. Tröster. 2008. “EyeMote – Towards Context-aware Gaming Using Eye Movements Recorded from Wearable Electrooculography.” In Proceedings of the 2nd International Conference on Fun and Games, 33–45. Berlin: Springer Berlin Heidelberg
  • Burke, J. L., M. S. Prewett, A. A. Gray, L. Yang, F. R. Stilson, M. D. Coovert, L. R. Elliot, and E. Redden. 2006. “Comparing the Effects of Visual-Auditory and Visual-Tactile Feedback on User Performance: A Meta-analysis.” In Proceedings of the 8th International Conference on Multimodal Interfaces (ICMI ‘06), 108–117. New York, NY: ACM
  • Callahan, J., D. Hopkins, M. Weiser, and B. Shneiderman. 1988. “An Empirical Comparison of Pie vs. Linear Menus.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 95–100. New York, NY: ACM
  • Cassidy, A., D. Hook, and A. Baliga. 2002. “Hand Tracking Using Spatial Gesture Modeling and Visual Feedback for a Virtual DJ System.” In Proceedings of the 4th IEEE International Conference on Multimodal Interfaces (ICMI ‘02), 197–202. Washington, DC: IEEE Computer Society
  • Chen, F., E. Choi, J. Epps, S. Lichman, N. Ruiz, Y. Shi, R. Taib, and M. Wu. 2005. “A Study of Manual Gesture-based Selection for the PEMMI Multimodal Transport Management Interface.” In Proceedings of the 7th International Conference on Multimodal Interfaces (ICMI ‘05), 274–281. New York, NY: ACM
  • Clawson, J., K. Lyons, T. Starner, and E. Clarkson. 2005. “The Impacts of Limited Visual Feedback on Mobile Text Entry for the Twiddler and Mini-QWERTY Keyboards.” In Proceedings of the Ninth IEEE International Symposium on Wearable Computers (ISWC ‘05), 170–177. Washington, DC: IEEE Computer Society
  • Drewes, H., and A. Schmidt. 2007. “Interacting with the Computer Using Gaze Gestures.” In Proceedings of the 11th IFIP TC 13 International Conference on Human–Computer Interaction – Volume Part II (INTERACT ‘07), 475–488. Berlin: Springer-Verlag
  • Dybdal, M. L., J. S. Agustin, and J. P. Hansen. 2012. “Gaze Input for Mobile Devices by Dwell and Gestures.” In Proceedings of the Symposium on Eye Tracking Research and Applications. New York, NY: ACM.
  • Foehrenbach, S., W. König, J. Gerken, and H. Reiterer. 2009. “Tactile Feedback Enhanced Hand Gesture Interaction at Large, High-resolution Displays.” Journal of Visual Languages & Computing 20: 341–351. doi: 10.1016/j.jvlc.2009.07.005
  • Garzotto, F., and M. Valoriani. 2012. “‘Don’t Touch the Oven’: Motion-based Touchless Interaction with Household Appliances.” In Proceedings of the International Working Conference on Advanced Visual Interfaces (AVI ‘12), 721–724. New York, NY: ACM
  • Graupner, S.-T., and S. Pannasch. 2014. “Continuous Gaze Cursor Feedback in Various Tasks: Influence on Eye Movement Behavior, Task Performance and Subjective Distraction.” In HCI International 2014 – Posters’ Extended Abstracts, 323–329. Cham, Switzerland: Springer International Publishing
  • Gustafson, S. 2012. “Imaginary Interfaces: Touchscreen-like Interaction Without the Screen.” In CHI ‘12 Extended Abstracts on Human Factors in Computing Systems (CHI EA ‘12), 927–930. New York, NY: ACM
  • Heikkilä, H. 2013. “Tools for a Gaze-controlled Drawing Application – Comparing Gaze Gestures Against Dwell Buttons.” In Proceedings of INTERACT 2013, Part II, LNCS 8118, 187–201. Berlin: Springer
  • Heikkilä, H., and K.-J. Räihä. 2009. “Speed and Accuracy of Gaze Gestures.” Journal of Eye Movement Research 3 (2): 1–14.
  • Heimonen, T., J. Hakulinen, M. Turunen, J. P. Jokinen, T. Keskinen, and R. Raisamo. 2013. “Designing Gesture-based Control for Factory Automation.” In 14th IFIP TC 13 International Conference on Human–Computer Interaction – INTERACT 2013, 202–209. Berlin: Springer
  • Hinckley, K., R. Pausch, and D. Proffitt. 1997. “Attention and Visual Feedback: The Bimanual Frame of Reference.” In Proceedings of the 1997 Symposium on Interactive 3D Graphics (I3D ‘97), 121–126. New York, NY: ACM
  • Isokoski, P., and M. Käki. 2002. “Comparison of Two Touchpad-based Methods for Numeric Entry.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘02), 25–32. New York, NY: ACM
  • Isokoski, P., and R. Raisamo. 2000. “Device Independent Text Input: A Rationale and an Example.” In Proceedings of the Working Conference on Advanced Visual Interfaces (AVI ‘00), 76–83. New York, NY: ACM
  • Istance, H., A. Hyrskykari, L. Immonen, S. Mansikkamaa, and S. Vickers. 2010. “Designing Gaze Gestures for Gaming: An Investigation of Performance.” In Proceedings of the 2010 Symposium on Eye-Tracking Research and Applications (ETRA). New York, NY: ACM
  • Iwamoto, T., M. Tatezono, and H. Shinoda. 2008. “Non-contact Method for Producing Tactile Sensation Using Airborne Ultrasound.” In Proceedings of EuroHaptics 2008. Haptics: Perception, Devices and Scenarios, 504–513. Berlin: Springer
  • Jacob, R. J. 1991. “The Use of Eye Movements in Human–Computer Interaction Techniques: What You Look At Is What You Get.” ACM Transactions on Information Systems 9 (2): 152–169. doi: 10.1145/123078.128728
  • Kajastila, R., and T. Lokki. 2013. “Eyes-free Interaction with Free-hand Gestures and Auditory Menus.” International Journal of Human–Computer Studies 71 (5): 627–640. doi: 10.1016/j.ijhcs.2012.11.003
  • Kangas, J., D. Akkil, J. Rantala, P. Isokoski, P. Majaranta, and R. Raisamo. 2014a. “Gaze Gestures and Haptic Feedback in Mobile Devices.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2014), 435–438. New York, NY: ACM
  • Kangas, J., J. Rantala, P. Majaranta, P. Isokoski, and R. Raisamo. 2014b. “Haptic Feedback to Gaze Events.” In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ‘14), 11–18. New York, NY: ACM
  • Karam, M., and M. C. Schraefel. 2005. A Taxonomy of Gestures in Human Computer Interactions. Southampton: University of Southampton, Computing Service. http://eprints.soton.ac.uk/261149/.
  • Khoshelham, K., and S. O. Elberink. 2012. “Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications.” Sensors 12: 1437–1454. doi: 10.3390/s120201437
  • Köpsel, A., and A. Huckauf. 2013. “Evaluation of Static and Dynamic Freehand Gestures in Device Control.” In Proceedings of the Tilburg Gesture Research Meeting (TiGeR 2013), Tilburg
  • Kratz, S., and R. Ballagas. 2009. “Unravelling Seams: Improving Mobile Gesture Recognition with Visual Feedback Techniques.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘09), 937–940. New York, NY: ACM
  • Lécuyer, A., C. Megard, J. M. Burkhardt, T. C. Lim, P. Coiffet, and L. Graux. 2002. “The Effect of Haptic, Visual and Auditory Feedback on an Insertion Task on a 2-Screen Workbench.” In Proceedings of Immersive Projection Technology Symposium, 12–18
  • Lee, S. C., B. Li, and T. Starner. 2011. “AirTouch: Synchronizing In-air Hand Gesture and On-body Tactile Feedback to Augment Mobile Gesture Interaction.” In Proceedings of the 2011 15th Annual International Symposium on Wearable Computers (ISWC ‘11), 3–10. Washington, DC: IEEE Computer Society
  • Lenman, S., L. Bretzner, and B. Thuresson. 2002. Computer Vision Based Hand Gesture Interfaces for Human–Computer Interaction. Sweden: Royal Institute of Technology.
  • MacGregor, C., and A. Thomas. 2001. “Does Multi-modal Feedback Help in Everyday Computing Tasks?” In Proceedings of the 8th IFIP International Conference on Engineering for Human–Computer Interaction (EHCI ‘01), 251–262. Berlin: Springer-Verlag
  • MacKenzie, I. S., and S. J. Castellucci. 2012. “Reducing Visual Demand for Gestural Text Input on Touchscreen Devices.” In Extended Abstracts of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ‘12), 2585–2590. New York, NY: ACM
  • Majaranta, P., I. S. MacKenzie, A. Aula, and K. J. Räihä. 2006. “Effects of Feedback and Dwell Time on Eye Typing Speed and Accuracy.” Universal Access in the Information Society 5 (2): 199–208. doi: 10.1007/s10209-006-0034-z
  • Møllenbach, E., J. P. Hansen, and M. Lillholm. 2013. “Eye Movements in Gaze Interaction.” Journal of Eye Movement Research 6 (2): 1–15.
  • Nielsen, J. 1993. Usability Engineering. Boston, MA: Academic Press.
  • Norman, D. A. 2010. “The Way I See It: Natural User Interfaces Are Not Natural.” Interactions 17 (3): 6–10. doi: 10.1145/1744161.1744163
  • Pakkanen, T., R. Raisamo, K. Salminen, and V. Surakka. 2010. “Haptic Numbers: Three Haptic Representation Models for Numbers on a Touch Screen Phone.” In International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI ‘10), Article 35, 4 pages. New York, NY: ACM
  • Piateski, E., and L. Jones. 2005. “Vibrotactile Pattern Recognition on the Arm and Torso.” In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (WHC ‘05), 90–95. Washington, DC: IEEE Computer Society
  • Porta, M., and M. Turina. 2008. “Eye-S: A Full-screen Input Modality for Pure Eye-based Communication.” In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA ‘08), 27–34. New York, NY: ACM
  • Prinz, W. 1997. “Perception and Action Planning.” European Journal of Cognitive Psychology 9: 129–154. doi: 10.1080/713752551
  • Prinz, W., and B. Hommel. 2002. Common Mechanisms in Perception and Action: Attention and Performance XIX. Oxford: Oxford University Press.
  • Rantala, J., J. Kangas, D. Akkil, P. Isokoski, and R. Raisamo. 2014. “Glasses with Haptic Feedback of Gaze Gestures.” In Proceedings of the Extended Abstracts of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI EA ‘14), 1597–1602. New York, NY: ACM
  • Rantala, J., R. Raisamo, J. Lylykangas, T. Ahmaniemi, J. Raisamo, K. Mäkelä, K. Salminen, and V. Surakka. 2011. “The Role of Gesture Types and Spatial Feedback in Haptic Communication.” IEEE Transactions on Haptics 4 (4): 295–306. doi: 10.1109/TOH.2011.4
  • Rozado, D., J. San Agustin, F. B. Rodriguez, and B. Varona. 2012. “Gliding and Saccadic Gaze Gesture Recognition in Real Time.” ACM Transactions on Interactive Intelligent Systems 1 (2): 1–27 (Article No. 1). doi: 10.1145/2070719.2070723
  • Saxen, F., O. Rashid, A. Al-Hamadi, S. Adler, A. Kernchen, and R. Mecke. 2012. “Image-based Gesture Recognition for User Interaction with Mobile Companion-based Assistance Systems.” In Proceedings of the 4th International Conference of Soft Computing and Pattern Recognition (SoCPaR), Brunei
  • Shan, C. 2010. “Gesture Control for Consumer Electronics.” In Multimedia Interaction and Intelligent User Interfaces, edited by L. Shao, C. Shan, J. Luo, and M. Etoh, 107–128. London: Springer.
  • Shneiderman, B. 1998. Designing the User Interface. Reading, MA: Addison-Wesley.
  • Tsukada, K., and M. Yasumura. 2002. “Ubi-Finger: Gesture Input Device for Mobile Use.” In Proceedings of the APCHI 2002 (5th Asia Pacific Conference on Human Computer Interaction): User Interaction Technology in the 21st Century. Beijing: Science Press
  • Turunen, S., H. Vilpponen, A. Vänskä, J. Raisamo, R. Raisamo, and J. Rantala. 2012. US Patent No. 20120206371
  • Urbina, M. H., and A. Huckauf. 2010. “Alternatives to Single Character Entry and Dwell Time Selection on Eye Typing.” In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA ‘10), 315–322. New York, NY: ACM
  • Vickers, S., H. Istance, and A. Hyrskykari. 2013. “Performing Locomotion Tasks in Immersive Computer Games with an Adapted Eye-tracking Interface.” ACM Transactions on Accessible Computing (TACCESS) 5 (1): 1–33 (Article No. 2). doi: 10.1145/2531922.2514856
  • Vidal, M., K. Pfeuffer, A. Bulling, and H. W. Gellersen. 2013. “Pursuits: Eye-based Interaction with Moving Targets.” In CHI ‘13 Extended Abstracts on Human Factors in Computing Systems, 3147–3150. New York, NY: ACM
  • Villamor, C., D. Willis, and L. Wroblewski. 2010. Touch Gesture Reference Guide. LukeW Ideation + Design. http://www.lukew.com/ff/entry.asp?1071
  • Vogel, D., and R. Balakrishnan. 2005. “Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays.” In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST ‘05), 33–42. New York, NY: ACM
  • Ware, C., and H. H. Mikaelian. 1987. “An Evaluation of an Eye Tracker as a Device for Computer Input.” In CHI ‘87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface, 183–188. New York, NY: ACM
  • Winer, B. 1971. Statistical Principles in Experimental Design. New York: McGraw-Hill.
  • Witt, H., M. Lawo, and M. Drugge. 2008. “Visual Feedback and Different Frames of Reference: The Impact on Gesture Interaction Techniques for Wearable Computing.” In Proceedings of the 10th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileCHI ‘08), 293–300. New York, NY: ACM
  • Wobbrock, J. O., M. R. Morris, and A. D. Wilson. 2009. “User-defined Gestures for Surface Computing.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘09), 1083–1092. New York, NY: ACM
  • Wobbrock, J. O., J. Rubinstein, M. W. Sawyer, and A. T. Duchowski. 2008. “Longitudinal Evaluation of Discrete Consecutive Gaze Gestures for Text Entry.” In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA ‘08), 11–18. New York, NY: ACM
  • Wu, H., and J. Wang. 2013. “Understanding User Preferences for Freehand Gestures in the TV Viewing Environment.” International Journal on Advances in Information Sciences and Service Sciences 5 (4): 709–717. doi: 10.4156/aiss.vol5.issue4.86
  • Zhai, S., P. O. Kristensson, C. Appert, T. H. Andersen, and X. Cao. 2012. “Foundational Issues in Touch-surface Stroke Gesture Design – An Integrative Review.” Foundations and Trends in Human–Computer Interaction 5 (2): 97–205. doi: 10.1561/1100000012
  • Zhang, Y., A. Bulling, and H. Gellersen. 2013. “Sideways: A Gaze Interface for Spontaneous Interaction with Situated Displays.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 851–860. New York, NY: ACM
  • Zhao, S., P. Dragicevic, M. Chignell, R. Balakrishnan, and P. Baudisch. 2007. “Earpod: Eyes-free Menu Selection Using Touch Input and Reactive Audio Feedback.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘07), 1395–1404. New York: ACM.
  • Zimmerman, T. G., J. Lanier, C. Blanchard, S. Bryson, and Y. Harvill. 1987. “A Hand Gesture Interface Device.” ACM SIGCHI Bulletin 18 (4): 189–192. New York, NY: ACM
