Thinking beyond the box

PalmRC: leveraging the palm surface as an imaginary eyes-free television remote control

Pages 829-843 | Received 08 Nov 2012, Accepted 28 May 2013, Published online: 29 Oct 2013
