
A Comparison of Head Pose and Deictic Pointing Interaction Methods for Smart Environments


REFERENCES

  • Abidi, S., Williams, M., & Johnston, B. (2013). Human pointing as a robot directive. In Proceedings of the 8th ACM/IEEE International Conference on Human–Robot Interaction (pp. 67–68). IEEE Press.
  • Bernardos, A. M., Bergesio, L., Iglesias, J., & Casar, J. R. (2013). MECCANO: A mobile-enabled configuration framework to coordinate and augment networks of smart objects. Journal of Universal Computer Science, 19(17), 2503–2525.
  • Bernardos, A. M., Marquínez, I., Gómez, D., Besada, J. A., & Casar, J. R. (2015). A system for multimodal interaction with Kinect-enabled virtual windows. In Proceedings of the International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN).
  • Bevan, N. (2009). What is the difference between the purpose of usability and user experience evaluation methods? In Proceedings of the Workshop UXEM (Vol. 9) (pp. 1–4), co-located with INTERACT 2009, Uppsala, Sweden.
  • Biskupski, A., Fender, A. R., Feuchtner, T. M., Karsten, M., & Willaredt, J. D. (2014). Drunken Ed: A balance game for public large screen displays. In CHI’14 Extended Abstracts on Human Factors in Computing Systems (pp. 289–292). New York: ACM.
  • Bolt, R. A. (1980). “Put-that-there”: Voice and gesture at the graphics interface. In Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques (Vol. 14, No. 3) (pp. 262–270). New York: ACM.
  • Brooke, J. (1996). SUS: A quick and dirty usability scale. In Usability Evaluation in Industry (pp. 189–194). London: Taylor & Francis.
  • Card, S. K., English, W. K., & Burr, B. J. (1978). Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys for text selection on a CRT. Ergonomics, 21(8), 601–613.
  • Carnahan, H., & Marteniuk, R. G. (1991). The temporal organization of hand, eye, and head movements during reaching and pointing. Journal of Motor Behavior, 23(2), 109–119.
  • Cho, K., Lee, J.-H., Lee, B.-T., & Park, E. (2015). Effects of feedforward in in-air remote pointing. International Journal of Human–Computer Interaction, 31(2), 89–100.
  • Chuan, N. K., & Sivaji, A. (2012). Combining eye gaze and hand tracking for pointer control in HCI: Developing a more robust and accurate interaction system for pointer positioning and clicking. In 2012 IEEE Colloquium on Humanities, Science and Engineering (CHUSER) (pp. 172–176). IEEE.
  • Corbett, B., Nam, C. S., & Yamaguchi, T. (2015). The effects of haptic feedback and visual distraction on pointing task performance. International Journal of Human–Computer Interaction, 32(2), 89–102.
  • Cournia, N., Smith, J. D., & Duchowski, A. T. (2003). Gaze- vs. hand-based pointing in virtual environments. In CHI’03 Extended Abstracts on Human Factors in Computing Systems (pp. 772–773). New York: ACM.
  • Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334.
  • Crossman, E. R. F. W. (1956). The measurement of perceptual load in manual operations (Doctoral dissertation). University of Birmingham.
  • Fernández, A., Bergesio, L., Bernardos, A. M., Besada, J. A., & Casar, J. R. (2015). A Kinect-based system to enable interaction by pointing in smart spaces. In Sensors Applications Symposium (SAS) (pp. 1–6). IEEE.
  • Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6), 381–391.
  • Fürntratt, H., & Neuschmied, H. (2014). Evaluating pointing accuracy on Kinect V2 sensor. In International Conference on Multimedia and Human–Computer Interaction (MHCI).
  • Gallo, L., Placitelli, A. P., & Ciampi, M. (2011). Controller-free exploration of medical image data: Experiencing the Kinect. In 24th International Symposium on Computer-Based Medical Systems (pp. 1–6). IEEE.
  • Hansen, D. W., & Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500.
  • Henschke, M., Gedeon, T., & Jones, R. (2013). Extending the index finger is worse than sitting: Posture and minimal objects in 3D pointing interfaces. In 4th International Conference on Cognitive Infocommunications (CogInfoCom) (pp. 797–802). IEEE.
  • Imai, G. (1997). Gestures: Body language and nonverbal communication. TASSI: Asian Gestures – Body Language and Nonverbal Communication. California State University, Pomona, CA. Retrieved from: http://www.coralgablescavaliers.org/ourpages/users/099346/IB%20Theory%20of%20Knowledge/Bastian%20Chapter%2006/Bastian%206/Gestures%20%20Body%20Language%20and%20Nonverbal%20Communication.pdf
  • ISO 9241-9. (2000). Ergonomic requirements for office work with visual display terminals (VDTs), Part 9: Requirements for non-keyboard input devices. FDIS (Final Draft International Standard). International Organization for Standardization.
  • Jacob, R. J. (1990). What you look at is what you get: Eye movement-based interaction techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 11–18). New York: ACM.
  • Kim, H., Kim, Y., & Lee, E. C. (2014). Method for user interface of large displays using arm pointing and finger counting gesture recognition. The Scientific World Journal, 2014, 1–9. Retrieved from http://www.hindawi.com/journals/tswj/2014/683045/
  • König, W. A., Gerken, J., Dierdorf, S., & Reiterer, H. (2009). Adaptive pointing – Design and evaluation of a precision enhancing technique for absolute pointing devices. In T. Gross, J. Gulliksen, P. Kotzé, L. Oestreicher, P. Palanque, R. O. Prates, & M. Winckler (Eds.), Human–Computer Interaction – INTERACT 2009 (pp. 658–671). Springer Berlin Heidelberg.
  • Kouroupetroglou, G., Pino, A., Balmpakakis, A., Chalastanis, D., Golematis, V., Ioannou, N., & Koutsoumpas, I. (2012). Using Wiimote for 2D and 3D pointing tasks: Gesture performance evaluation. In E. Efthimiou, G. Kouroupetroglou, & S.-E. Fotinea (Eds.), Gesture and Sign Language in Human–Computer Interaction and Embodied Communication (Lecture Notes in Computer Science, Vol. 7206) (pp. 13–23). Springer Berlin Heidelberg.
  • Lin, C. J., Ho, S. H., & Chen, Y. J. (2015). An investigation of pointing postures in a 3D stereoscopic environment. Applied Ergonomics, 48, 154–163.
  • MacKenzie, I. S. (1992). Fitts’ law as a research and design tool in human-computer interaction. Human–Computer Interaction, 7(1), 91–139.
  • MacKenzie, I. S., & Teather, R. J. (2012). FittsTilt: The application of Fitts’ law to tilt-based interaction. In Proceedings of the 7th Nordic Conference on Human–Computer Interaction: Making Sense Through Design (pp. 568–577). New York: ACM.
  • Miniotas, D., Špakov, O., Tugoy, I., & MacKenzie, I. S. (2006). Speech-augmented eye gaze interaction with small closely spaced targets. In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (pp. 67–72). New York: ACM.
  • Mora, K. A. F., & Odobez, J. M. (2012). Gaze estimation from multimodal Kinect data. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (pp. 25–30). IEEE.
  • Murata, A., & Iwase, H. (2001). Extending Fitts’ law to a three-dimensional pointing task. Human Movement Science, 20(6), 791–805.
  • Murphy-Chutorian, E., & Trivedi, M. M. (2009). Head pose estimation in computer vision: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(4), 607–626.
  • Neggers, S. F., & Bekkering, H. (2001). Gaze anchoring to a pointing target is present during the entire pointing movement and is driven by a non-visual signal. Journal of Neurophysiology, 86(2), 961–970.
  • Newman, R., Matsumoto, Y., Rougeaux, S., & Zelinsky, A. (2000). Real-time stereo tracking for head pose and gaze estimation. In Proceedings of the International Conference on Automatic Face and Gesture Recognition (pp. 122–128). IEEE.
  • Norman, D. A. (2010). Natural user interfaces are not natural. Interactions, 17(3), 6–10.
  • Pappu, R., & Beardsley, P. A. (1998). A qualitative approach to classifying gaze direction. In Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition (pp. 160–165). IEEE.
  • Pino, A., Tzemis, E., Ioannou, N., & Kouroupetroglou, G. (2013). Using Kinect for 2D and 3D pointing tasks: Performance evaluation. In M. Kurosu (Ed.), Human–Computer Interaction. Interaction Modalities and Techniques (pp. 358–367). Springer Berlin Heidelberg.
  • Rantanen, V., Verho, J., Lekkala, J., Tuisku, O., Surakka, V., & Vanhala, T. (2012). The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. In S. N. Spencer (Ed.), Proceedings of the Symposium on Eye Tracking Research and Applications (pp. 345–348). New York: ACM.
  • Rauschenberger, M., Olschner, S., Cota, M. P., Schrepp, M., & Thomaschewski, J. (2012). Measurement of user experience: A Spanish language version of the user experience questionnaire (UEQ). In 7th Iberian Conference on Information Systems and Technologies (CISTI) (pp. 1–6). IEEE.
  • Reale, M. J., Canavan, S., Yin, L., Hu, K., & Hung, T. (2011). A multi-gesture interaction system using a 3-D iris disk model for gaze estimation and an active appearance model for 3-D hand pointing. IEEE Transactions on Multimedia, 13(3), 474–486.
  • Rossetti, Y., Tadary, B., & Prablanc, C. (1994). Optimal contributions of head and eye positions to spatial accuracy in man tested by visually directed pointing. Experimental Brain Research, 97(3), 487–496.
  • Schwaller, M., Brunner, S., & Lalanne, D. (2013). Two-handed mid-air gestural HCI: Point + command. In M. Kurosu (Ed.), Human–Computer Interaction. Interaction Modalities and Techniques (Lecture Notes in Computer Science, Vol. 8007) (pp. 388–397). Springer Berlin Heidelberg.
  • Sibert, L. E., & Jacob, R. J. (2000). Evaluation of eye gaze interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 281–288). New York: ACM.
  • Sibert, L. E., Templeman, J. N., & Jacob, R. J. (2001). Evaluation and analysis of eye gaze interaction (Report No. NRL/FR/5513-01-9990). Washington, DC: Naval Research Laboratory.
  • Song, H. T. (2012). Updating Fitts’ law to account for restricted display field of view conditions. International Journal of Human–Computer Interaction, 28(4), 269–279.
  • Soukoreff, R. W., & MacKenzie, I. S. (2004). Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal of Human–Computer Studies, 61(6), 751–789.
  • Stiefelhagen, R., & Zhu, J. (2002). Head orientation and gaze direction in meetings. In CHI ’02 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’02) (pp. 858–859). New York, NY: ACM.
  • Valenti, R., Sebe, N., & Gevers, T. (2012). Combining head pose and eye location information for gaze estimation. IEEE Transactions on Image Processing, 21(2), 802–815.
  • Valenti, R., Yucel, Z., & Gevers, T. (2009). Robustifying eye center localization by head pose cues. In IEEE Conference on Computer Vision and Pattern Recognition (pp. 612–618). IEEE.
  • Watanabe, K., Miyake, Y., Nakamichi, N., Yamada, T., & Ozeki, T. (2014). Remote touch pointing for smart TV interaction. In 3rd Global Conference on Consumer Electronics (GCCE) (pp. 232–235). IEEE.
  • Weiser, M., & Brown, J. S. (1995). Designing Calm Technology. Xerox PARC, California, USA. Retrieved July 1, 2015.
  • Wu, J., & Trivedi, M. M. (2008). A two-stage head pose estimation framework and evaluation. Pattern Recognition, 41(3), 1138–1158.
  • Xiong, X., Cai, Q., Liu, Z., & Zhang, Z. (2014). Eye gaze tracking using an RGBD camera: A comparison with an RGB solution. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (pp. 1113–1121). New York: ACM.
  • Xu, X., & McGorry, R. W. (2015). The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures. Applied Ergonomics, 49, 47–54.
  • Zaranek, A., Ramoul, B., Yu, H. F., Yao, Y., & Teather, R. J. (2014). Performance of modern gaming input devices in first-person shooter target acquisition. In CHI’14 Extended Abstracts on Human Factors in Computing Systems (pp. 1495–1500). New York: ACM.
  • Zhai, S., Morimoto, C., & Ihde, S. (1999). Manual and gaze input cascaded (MAGIC) pointing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 246–253). New York: ACM.
