Research Article

WristDial: An Eyes-Free Integer-Value Input Method by Quantizing the Wrist Rotation


