Visual Impairments and Mobile Touchscreen Interaction: State-of-the-Art, Causes of Visual Impairment, and Design Guidelines

References

  • Abdolrahmani, A., Kuber, R., & Hurst, A. (2016). An empirical investigation of the situationally-induced impairments experienced by blind mobile device users. In Proceedings of the 13th Web for All Conference, W4A ’16 (pp. 21:1–21:8), New York, NY.
  • Anderson, F., Grossman, T., Wigdor, D., & Fitzmaurice, G. (2015). Supporting subtlety with deceptive devices and illusory interactions. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 1489–1498), New York, NY.
  • Annett, M., Gupta, A., & Bischof, W. F. (2014). Exploring and understanding unintended touch during direct pen interaction. ACM Transactions on Computer–Human Interaction, 21 (5), 1–39.
  • Anthony, L., Vatavu, R.-D., & Wobbrock, J. O. (2013). Understanding the consistency of users’ pen and finger stroke gesture articulation. In Proceedings of Graphics Interface 2013, GI ’13 (pp. 87–94), Toronto, Ontario.
  • Anthony, L., & Wobbrock, J. O. (2010). A lightweight multistroke recognizer for user interface prototypes. In Proceedings of Graphics Interface 2010, GI ’10 (pp. 245–252), Toronto, Ontario.
  • AppdLab. (2016). Magnify. Retrieved from https://goo.gl/SEMhlH
  • Appert, C., Chapuis, O., & Pietriga, E. (2010). High-precision magnification lenses. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’10 (pp. 273–282), New York, NY.
  • Apple. (2016). Voiceover for iOS (Let Our Voice be Your Guide). Retrieved from http://www.apple.com/accessibility/ios/voiceover/
  • Appollonio, I., Carabellese, C., Frattola, L., & Trabucchi, M. (1996). Effects of sensory aids on the quality of life and mortality of elderly people: A multivariate analysis. Age and Ageing, 25 (2), 89–96.
  • Arefin Shimon, S. S., Lutton, C., Xu, Z., Morrison-Smith, S., Boucher, C., & Ruiz, J. (2016). Exploring non-touchscreen gestures for smartwatches. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI ’16 (pp. 3822–3833), New York, NY.
  • Ash, D., Keegan, D., & Greenough, T. (1978). Factors in adjustment to blindness. Canadian Journal of Ophthalmology, 13 (1), 15–21.
  • Azenkot, S., Bennett, C. L., & Ladner, R. E. (2013). Digitaps: Eyes-free number entry on touchscreens with minimal audio feedback. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, UIST ’13 (pp. 85–90), New York, NY.
  • Azenkot, S., Prasain, S., Borning, A., Fortuna, E., Ladner, R. E., & Wobbrock, J. O. (2011). Enhancing independence and safety for blind and deaf-blind public transit riders. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11 (pp. 3247–3256), New York, NY.
  • Azenkot, S., Rector, K., Ladner, R., & Wobbrock, J. (2012a). Passchords: Secure multi-touch authentication for blind people. In Proceedings of the 14th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’12 (pp. 159–166), New York, NY.
  • Azenkot, S., Wobbrock, J. O., Prasain, S., & Ladner, R. E. (2012b). Input finger detection for nonvisual touch screen text entry in perkinput. In Proceedings of Graphics Interface 2012, GI ’12 (pp. 121–129). Toronto, Ontario.
  • Azenkot, S., & Zhai, S. (2012). Touch behavior with different postures on soft smartphone keyboards. In Proceedings of the 14th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’12 (pp. 251–260), New York, NY.
  • Azrak, C., Palazón-Bru, A., Baeza-Díaz, M., Folgado-de la Rosa, D. M., Hernández-Martínez, C., Martínez-Toldos, J., & Gil-Guillén, V. (2015). A predictive screening tool to detect diabetic retinopathy or macular edema in primary health care: Construction, validation and implementation on a mobile application. PeerJ, 3, e1404:1–10.
  • Bacim, F., Sinclair, M., & Benko, H. (2013). Understanding touch selection accuracy on flat and hemispherical deformable surfaces. In Proceedings of Graphics Interface 2013, GI ’13 (pp. 197–204), Toronto, Ontario.
  • Barnard, L., Yi, J. S., Jacko, J. A., & Sears, A. (2007). Capturing the effects of context on human performance in mobile computing systems. Personal and Ubiquitous Computing, 11 (2), 81–96.
  • Bau, O., & Mackay, W. E. (2008). Octopocus: A dynamic guide for learning gesture-based command sets. In Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, UIST ’08 (pp. 37–46). New York, NY.
  • Bigham, J., Jayant, C., Miller, A., White, B., & Yeh, T. (2010). VizWiz::LocateIt - enabling blind people to locate objects in their environment. In Proceedings of the CVPR Workshop: Computer Vision Applications for the Visually Impaired, San Francisco, CA.
  • Bischof, W., Krajnc, E., Dornhofer, M., & Ulm, M. (2012). Navcom-wlan communication between public transport vehicles and smart phones to support visually impaired and blind people. In Proceedings of the 13th International Conference on Computers Helping People with Special Needs, ICCHP ’12 (pp. 91–98), Linz, Austria.
  • Blagojevic, R., Chang, S.-H.-H., & Plimmer, B. (2010). The power of automatic feature selection: Rubine on steroids. In Proceedings of the Seventh Sketch-Based Interfaces and Modeling Symposium, SBIM ’10 (pp. 79–86), Aire-la-Ville, Switzerland.
  • Brady, E., Morris, M. R., Zhong, Y., White, S., & Bigham, J. P. (2013). Visual challenges in the everyday lives of blind people. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 2117–2126), New York, NY.
  • Bragi. (2016). Bragi - The dash. Retrieved from http://www.bragi.com/thedash/
  • Burch, D. S., & Pawluk, D. T. (2009). A cheap, portable haptic device for a method to relay 2-d texture-enriched graphical information to individuals who are visually impaired. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’09 (pp. 215–216), New York, NY.
  • Buzzi, M., Buzzi, M., Leporini, B., & Martusciello, L. (2011). Making visual maps accessible to the blind. In Proceedings of the 6th International Conference on Universal Access in Human–Computer Interaction: Users Diversity, UAHCI’11 (pp. 271–280), Berlin, Heidelberg.
  • Buzzi, M. C., Buzzi, M., Leporini, B., & Trujillo, A. (2015). Exploring visually impaired people’s gesture preferences for smartphones. In Proceedings of the 11th Biannual Conference on Italian SIGCHI Chapter, CHItaly 2015 (pp. 94–101), New York, NY.
  • Buzzi, M. C., Buzzi, M., Leporini, B., & Trujillo, A. (2016). Analyzing visually impaired people’s touch gestures on smartphones. Multimedia Tools and Applications, Advance online publication. doi:10.1007/s11042-016-3594-9.
  • Chang, Y., L’Yi, S., Koh, K., & Seo, J. (2015). Understanding users’ touch behavior on large mobile touch-screens and assisted targeting by tilting gesture. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 1499–1508), New York, NY.
  • Chen, D., Feng, W., Zhao, Q., Hu, M., & Wang, T. (2012). An infrastructure-free indoor navigation system for blind people. In Proceedings of the 5th International Conference on Intelligent Robotics and Applications, ICIRA ’12 (pp. 552–561), Berlin, Heidelberg.
  • Chen, X. A., Schwarz, J., Harrison, C., Mankoff, J., & Hudson, S. E. (2014). Air+touch: Interweaving touch & in-air gestures. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST ’14 (pp. 519–525), New York, NY.
  • Chhetri, A. P., Wen, F., Wang, Y., & Zhang, K. (2010). Shape discrimination test on handheld devices for patient self-test. In Proceedings of the 1st ACM International Health Informatics Symposium, IHI ’10 (pp. 502–506), New York, NY.
  • Chi, P.-Y. P., & Li, Y. (2015). Weave: Scripting cross-device wearable interaction. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 3923–3932). New York, NY.
  • Chi, P.-Y. P., Li, Y., & Hartmann, B. (2016). Enhancing cross-device interaction scripting with interactive illustrations. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI ’16 (pp. 5482–5493), New York, NY.
  • Choe, W., & Lee, M. W. (2015). Discomfort glare model for mobile display. In Proceedings of the IEEE 5th International Conference on Consumer Electronics (pp. 471–473), Berlin, Germany.
  • Csapó, A., Wersényi, G., Nagy, H., & Stockman, T. (2015). A survey of assistive technologies and applications for blind users on mobile platforms: A review and foundation for research. Journal on Multimodal User Interfaces, 9 (4), 275–286.
  • Delamare, W., Janssoone, T., Coutrix, C., & Nigay, L. (2016). Designing 3D gesture guidance: Visual feedback and feedforward design options. In Proceedings of the International Working Conference on Advanced Visual Interfaces, AVI ’16 (pp. 152–159), New York, NY.
  • Dezfuli, N., Khalilbeigi, M., Huber, J., Müller, F., & Mühlhäuser, M. (2012). PalmRC: Imaginary palm-based remote control for eyes-free television interaction. In Proceedings of the 10th European Conference on Interactive TV and Video, EuroiTV ’12 (pp. 27–34), New York, NY.
  • El-Glaly, Y., Quek, F., Smith-Jackson, T., & Dhillon, G. (2012). Audible rendering of text documents controlled by multi-touch interaction. In Proceedings of the 14th ACM International Conference on Multimodal Interaction, ICMI ’12 (pp. 401–408), New York, NY.
  • Elton, E., & Nicolle, C. A. (2010). The importance of context in inclusive design. In Proceedings of the International Conference on Contemporary Ergonomics and Human Factors (pp. 1–16), Keele, UK.
  • Encelle, B., Ollagnier-Beldame, M., Pouchot, S., & Prié, Y. (2011). Annotation-based video enrichment for blind people: A pilot study on the use of earcons and speech synthesis. In The Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’11 (pp. 123–130), New York, NY.
  • Eriksen, C. W., & Hoffman, J. E. (1972). Temporal and spatial characteristics of selective encoding from visual displays. Perception & Psychophysics, 12 (2), 201–204.
  • Eriksen, C. W., & St. James, J. D. (1986). Visual attention within and around the field of focal attention: A zoom lens model. Perception & Psychophysics, 40 (4), 225–240.
  • Exler, A., Braith, M., Schankin, A., & Beigl, M. (2016). Preliminary investigations about interruptibility of smartphone users at specific place types. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, UbiComp ’16 (pp. 1590–1595), New York, NY.
  • Frey, B., Southern, C., & Romero, M. (2011). Brailletouch: Mobile texting for the visually impaired. In Proceedings of the 6th International Conference on Universal Access in Human–Computer Interaction: Context Diversity - Volume Part III, UAHCI’11 (pp. 19–25), Berlin, Heidelberg.
  • Gajos, K., & Weld, D. S. (2004). Supple: Automatically generating user interfaces. In Proceedings of the 9th International Conference on Intelligent User Interfaces, IUI ’04 (pp. 93–100), New York, NY.
  • Gajos, K. Z., Weld, D. S., & Wobbrock, J. O. (2010). Automatically generating personalized user interfaces with supple. Artificial Intelligence, 174 (12–13), 910–950.
  • Gajos, K. Z., Wobbrock, J. O., & Weld, D. S. (2007). Automatically generating user interfaces adapted to users’ motor and vision capabilities. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, UIST ’07 (pp. 231–240). New York, NY.
  • Garcia, D. G., Paluri, M., & Wu, S. (2016). Under the hood: Building accessibility tools for the visually impaired on Facebook. Retrieved from https://code.facebook.com/posts/457605107772545/under-the-hood-building-accessibility-tools-for-the-visually-impaired-on-facebook/
  • GeorgiePhone. (2016). Smart android apps for blind people. Retrieved from http://www.georgiephone.com/
  • Giachritsis, C., Randall, G., & Roselier, S. (2012). Development of intuitive tactile navigational patterns. In Proceedings of the 2012 International Conference on Haptics: Perception, Devices, Mobility, and Communication - Volume Part I, EuroHaptics’12 (pp. 136–147), Berlin, Heidelberg.
  • Goel, M., Findlater, L., & Wobbrock, J. (2012a). Walktype: Using accelerometer data to accommodate situational impairments in mobile touch screen text entry. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’12 (pp. 2687–2696), New York, NY.
  • Goel, M., Jansen, A., Mandel, T., Patel, S. N., & Wobbrock, J. O. (2013). Contexttype: Using hand posture information to improve mobile touch screen text entry. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 2795–2798). New York, NY.
  • Goel, M., Wobbrock, J., & Patel, S. (2012b). Gripsense: Using built-in sensors to detect hand posture and pressure on commodity mobile phones. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST ’12 (pp. 545–554). New York, NY.
  • Goh, T., & Kim, S. W. (2014). Eyes-free text entry interface based on contact area for people with visual impairment. In Proceedings of the Adjunct Publication of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST’14 Adjunct (pp. 69–70), New York, NY.
  • Gollner, U., Bieling, T., & Joost, G. (2012). Mobile lorm glove: Introducing a communication device for deaf-blind people. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, TEI ’12 (pp. 127–130), New York, NY.
  • Google. (2016a). Accessibility | Android Developers. Retrieved from https://developer.android.com/design/patterns/accessibility.html
  • Google. (2016b). Google talkBack. Retrieved from https://goo.gl/t43nNu
  • Grossman, T., & Balakrishnan, R. (2005). The bubble cursor: Enhancing target acquisition by dynamic resizing of the cursor’s activation area. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’05 (pp. 281–290), New York, NY.
  • Guerreiro, T., Lagoá, P., Nicolau, H., Gonçalves, D., & Jorge, J. A. (2008). From tapping to touching: Making touch screens accessible to blind users. IEEE Multimedia, 15 (4), 48–50.
  • Guerreiro, T., Oliveira, J., Benedito, J., Nicolau, H., Jorge, J., & Gonçalves, D. (2011). Blind people and mobile keypads: Accounting for individual differences. In Proceedings of the 13th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT’11 (pp. 65–82), Lisbon, Portugal.
  • Gupta, A., Irudayaraj, A., Chandran, V., Palaniappan, G., Truong, K. N., & Balakrishnan, R. (2016). Haptic learning of semaphoric finger gestures. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, UIST ’16 (pp. 219–226), New York, NY.
  • Gustafson, S., Holz, C., & Baudisch, P. (2011). Imaginary phone: Learning imaginary interfaces by transferring spatial memory from a familiar device. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11 (pp. 283–292), New York, NY.
  • Hakobyan, L., Lumsden, J., & O’Sullivan, D. (2014). Participatory research with older adults with AMD: Co-designing a smart diet diary app. In Proceedings of the 28th International BCS Human–Computer Interaction Conference on HCI 2014 - Sand, Sea and Sky - Holiday HCI, BCS-HCI ’14 (pp. 32–41), Southport, UK.
  • Hakobyan, L., Lumsden, J., O’Sullivan, D., & Bartlett, H. (2013). Designing a mobile diet diary application with and for older adults with AMD: A case study. In Proceedings of the 27th International BCS Human–Computer Interaction Conference, BCS-HCI ’13 (pp. 17:1–17:10). Swinton, UK.
  • HaptiMap. HaptiMap: Haptic, audio and visual interfaces for maps and location based services. Retrieved from http://www.haptimap.org/
  • Hersh, M., & Johnson, M. A. (Eds.). (2008). Assistive technology for visually impaired and blind people. London, UK: Springer.
  • Hinckley, K., Heo, S., Pahud, M., Holz, C., Benko, H., Sellen, A., … Buxton, W. (2016). Pre-touch sensing for mobile interaction. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI ’16 (pp. 2869–2881), New York, NY.
  • Holz, C., & Baudisch, P. (2011). Understanding touch. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11 (pp. 2501–2510), New York, NY.
  • Hwang, A., & Peli, E. (2014). An augmented-reality edge enhancement application for Google glass. Optometry & Vision Science, 91 (8), 1021–1030.
  • Jacko, J. A., & Sears, A. (1998). Designing interfaces for an overlooked user group: Considering the visual profiles of partially sighted users. In Proceedings of the Third International ACM Conference on Assistive Technologies, Assets ’98 (pp. 75–77), New York, NY.
  • Jackson, S., & Gleeson, K. (2013). Living and coping with strabismus as an adult. European Medical Journal Ophthalmology, 1, 15–22.
  • Jafri, R., Ali, S. A., Arabnia, H. R., & Fatima, S. (2014). Computer vision-based object recognition for the visually impaired in an indoors environment: A survey. The Visual Computer, 30 (11), 1197–1222.
  • Kane, S. K., Bigham, J. P., & Wobbrock, J. O. (2008a). Slide rule: Making mobile touch screens accessible to blind people using multi-touch interaction techniques. In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’08 (pp. 73–80), New York, NY.
  • Kane, S. K., Frey, B., & Wobbrock, J. O. (2013a). Access lens: A gesture-based screen reader for real-world documents. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 347–350), New York, NY.
  • Kane, S. K., Jayant, C., Wobbrock, J. O., & Ladner, R. E. (2009). Freedom to roam: A study of mobile device adoption and accessibility for people with visual and motor disabilities. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’09 (pp. 115–122), New York, NY.
  • Kane, S. K., Morris, M. R., Perkins, A. Z., Wigdor, D., Ladner, R. E., & Wobbrock, J. O. (2011a). Access overlays: Improving non-visual access to large touch screens for blind users. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11 (pp. 273–282), New York, NY.
  • Kane, S. K., Morris, M. R., & Wobbrock, J. O. (2013b). Touchplates: Low-cost tactile overlays for visually impaired touch screen users. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’13 (pp. 22:1–22:8), New York, NY.
  • Kane, S. K., Wobbrock, J. O., & Ladner, R. E. (2011b). Usable gestures for blind people: Understanding preference and performance. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11 (pp. 413–422), New York, NY.
  • Kane, S. K., Wobbrock, J. O., & Smith, I. E. (2008b). Getting off the treadmill: Evaluating walking user interfaces for mobile devices in public spaces. In Proceedings of the 10th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’08 (pp. 109–118), New York, NY.
  • Karikoski, J., & Soikkeli, T. (2013). Contextual usage patterns in smartphone communication services. Personal and Ubiquitous Computing, 17 (3), 491–502.
  • Karim, S., Andjomshoaa, A., & Tjoa, A. M. (2006). Exploiting sensecam for helping the blind in business negotiations. In Proceedings of the 10th International Conference on Computers Helping People with Special Needs, ICCHP ’06 (pp. 1147–1154), Berlin, Heidelberg.
  • Kelley, E. F., Lindfors, M., & Penczek, J. (2006). Display daylight ambient contrast measurement methods and daylight readability. Journal of the Society for Information Display, 14, 1019–1030.
  • Kempen, G. I., & Zijlstra, G. R. (2014). Clinically relevant symptoms of anxiety and depression in low-vision community-living older adults. The American Journal of Geriatric Psychiatry, 22 (3), 309–313.
  • Khademi, M., Mousavi Hondori, H., & Videira Lopes, C. (2012). Optical illusion in augmented reality. In Proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology, VRST ’12 (pp. 195–196), New York, NY.
  • Kim, D.-J., & Lim, Y.-K. (2011). Handscope: Enabling blind people to experience statistical graphics on websites through haptics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11 (pp. 2039–2042), New York, NY.
  • Kim, H. N., Smith-Jackson, T. L., & Nam, C. S. (2013). Elicitation of haptic user interface needs of people with low vision. International Journal of Human–Computer Interaction, 29 (7), 488–500.
  • Koike, H., Matoba, Y., & Takahashi, Y. (2013). Aquatop display: Interactive water surface for viewing and manipulating information in a bathroom. In Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces, ITS ’13 (pp. 155–164), New York, NY.
  • Krajnc, E., Feiner, J., & Schmidt, S. (2010). User centered interaction design for mobile applications focused on visually impaired and blind people. In Proceedings of the 6th International Conference on HCI in Work and Learning, Life and Leisure: Workgroup Human–Computer Interaction and Usability Engineering, USAB ’10 (pp. 195–202), Klagenfurt, Austria.
  • Kuber, R., Hastings, A., Tretter, M., & Fitzpatrick, D. (2012). Determining the accessibility of mobile screen readers for blind users. In Proceedings of the IASTED Conference on Human–Computer Interaction (pp. 182–189), Baltimore, MD.
  • Kurtenbach, G., & Buxton, W. (1994). User learning and performance with marking menus. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’94 (pp. 258–264), New York, NY.
  • Kurtenbach, G., Moran, T., & Buxton, W. (1994). Contextual animation of gestural commands. Computer Graphics Forum, 13 (5), 305–314.
  • Landau, S., & Wells, L. (2003). Merging tactile sensory input and audio data by means of the talking tactile tablet. In Proceedings of EuroHaptics ’03 (pp. 414–418), Dublin, Ireland.
  • Larson, S. (2016a). How tech companies are making their apps more accessible to the disabled. Retrieved from http://theweek.com/articles/628552/how-tech-companies-are-making-apps-more-accessible-disabled
  • Larson, S. (2016b). Twitter finally adds photo captioning for the visually impaired. Retrieved from http://www.dailydot.com/debug/twitter-alt-text-captioning/
  • Lee, S., & Zhai, S. (2009). The performance of touch screen soft buttons. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’09 (pp. 309–318), New York, NY.
  • Leonard, V. K., Jacko, J. A., & Pizzimenti, J. J. (2005). An exploratory investigation of handheld computer interaction for older adults with visual impairments. In Proceedings of the 7th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’05 (pp. 12–19), New York, NY.
  • Leporini, B., Buzzi, M. C., & Buzzi, M. (2012). Interacting with mobile devices via voiceover: Usability and accessibility issues. In Proceedings of the 24th Australian Computer–Human Interaction Conference, OzCHI ’12 (pp. 339–348), New York, NY.
  • Lévesque, V., Pasquero, J., Hayward, V., & Legault, M. (2005). Display of virtual braille dots by lateral skin deformation: Feasibility study. ACM Transactions on Applied Perception, 2 (2), 132–149.
  • Li, F. C. Y., Dearman, D., & Truong, K. N. (2010). Leveraging proprioception to make mobile phones more accessible to users with visual impairments. In Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’10 (pp. 187–194), New York, NY.
  • Lindeman, R. W., Page, R., Yanagida, Y., & Sibert, J. L. (2004). Towards full-body haptic feedback: The design and deployment of a spatialized vibrotactile feedback system. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST ’04 (pp. 146–149), New York, NY.
  • Mack, A., Pappas, Z., Silverman, M., & Gay, R. (2002). What we see: Inattention and the capture of attention by meaning. Consciousness and Cognition, 11 (4), 488–506.
  • Mack, A., & Rock, I. (1998). Inattentional Blindness. Cambridge, MA: MIT Press.
  • Manduchi, R., & Coughlan, J. (2012). (Computer) vision without sight. Communications of the ACM, 55 (1), 96–104.
  • Manohar, P., & Parthasarathy, A. (2009). An innovative braille system keyboard for the visually impaired. In Proceedings of the 11th International Conference on Computer Modelling and Simulation, UKSIM ’09 (pp. 559–562), Cambridge, UK.
  • Mariakakis, A., Goel, M., Aumi, M. T. I., Patel, S. N., & Wobbrock, J. O. (2015). Switchback: Using focus and saccade tracking to guide users’ attention for mobile task resumption. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 2953–2962), New York, NY.
  • Matero, J., & Colley, A. (2012). Identifying unintentional touches on handheld touch screen devices. In Proceedings of the Designing Interactive Systems Conference, DIS ’12 (pp. 506–509), New York, NY.
  • McGookin, D., Brewster, S., & Jiang, W. (2008). Investigating touchscreen accessibility for people with visual impairments. In Proceedings of the 5th Nordic Conference on Human–Computer Interaction: Building Bridges, NordiCHI ’08 (pp. 298–307), New York, NY.
  • Meier, A., Matthies, D. J. C., Urban, B., & Wettach, R. (2015). Exploring vibrotactile feedback on the body and foot for the purpose of pedestrian navigation. In Proceedings of the 2nd International Workshop on Sensor-based Activity Recognition and Interaction, WOAR ’15 (pp. 11:1–11:11), New York, NY.
  • Meijer, P. (2016). The vOICe for Android. Retrieved from https://goo.gl/DDLg03
  • Miller, D., Parecki, A., & Douglas, S. A. (2007). Finger dance: A sound game for blind people. In Proceedings of the 9th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’07 (pp. 253–254), New York, NY.
  • Minatani, K., & Watanabe, T. (2012). A non-visual interface for tasks requiring rapid recognition and response: An rc helicopter control system for blind people. In Proceedings of the 13th International Conference on Computers Helping People with Special Needs (pp. 630–635), Berlin, Heidelberg.
  • Morris, M. R., Wobbrock, J. O., & Wilson, A. D. (2010). Understanding users’ preferences for surface gestures. In Proceedings of Graphics Interface 2010, GI ’10 (pp. 261–268), Toronto, Ontario.
  • Moschos, M. M. (2014). Physiology and psychology of vision and its disorders: A review. Medical Hypothesis, Discovery and Innovation in Ophthalmology, 3 (3), 83–90.
  • Mott, M. E., & Wobbrock, J. O. (2014). Beating the bubble: Using kinematic triggering in the bubble lens for acquiring small, dense targets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’14 (pp. 733–742), New York, NY.
  • Nacenta, M. A., Kamber, Y., Qiang, Y., & Kristensson, P. O. (2013). Memorability of pre-designed and user-defined gesture sets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 1099–1108), New York, NY.
  • Ng, A., & Brewster, S. (2013). The relationship between encumbrance and walking speed on mobile interactions. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’13 (pp. 1359–1364), New York, NY.
  • Ng, A., Brewster, S. A., & Williamson, J. (2013). The impact of encumbrance on mobile interactions. In Proceedings of the 14th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT ’13 (pp. 92–109), Cape Town, South Africa.
  • Nicolle, C., & Elton, E. (2016). Understanding everyday activities in context. Retrieved from http://designingwithpeople.rca.ac.uk/understanding-everyday-activities-in-context
  • Oh, U., Branham, S., Findlater, L., & Kane, S. K. (2015). Audio-based feedback techniques for teaching touchscreen gestures. ACM Transactions on Accessible Computing (TACCESS), 7 (3), 1–29.
  • Oh, U., & Findlater, L. (2015). A performance comparison of on-hand versus on-phone nonvisual input by blind and sighted users. ACM Transactions on Accessible Computing (TACCESS), 7 (4), 1–20.
  • Oliveira, J., Guerreiro, T., Nicolau, H., Jorge, J., & Gonçalves, D. (2011). Brailletype: Unleashing braille over touch screen mobile phones. In Proceedings of the 13th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT’11 (pp. 100–107), Berlin, Heidelberg.
  • Pamplona, V. F., Oliveira, M. M., Aliaga, D. G., & Raskar, R. (2012). Tailored displays to compensate for visual aberrations. ACM Transactions on Graphics, 31 (4), 1–12.
  • Pareddy, S., Agarwal, A., & Swaminathan, M. (2016). Knowwhat: Mid field sensemaking for the visually impaired. In Proceedings of the 2016 Symposium on Spatial User Interaction, SUI ’16 (pp. 191–191), New York, NY.
  • Park, K., Goh, T., & So, H.-J. (2014). Toward accessible mobile application design: Developing mobile application accessibility guidelines for people with visual impairment. In Proceedings of HCI Korea, HCIK ’15 (pp. 31–38), Seoul, South Korea.
  • Park, W., & Han, S. H. (2014). An analytical approach to creating multitouch gesture vocabularies in mobile devices: A case study for mobile web browsing gestures. International Journal of Human–Computer Interaction, 30 (2), 126–141.
  • Pawluk, D. T., Adams, R. J., & Kitada, R. (2015). Designing haptic assistive technology for individuals who are blind or visually impaired. IEEE Transactions on Haptics, 8 (3).
  • Pielot, M., Poppinga, B., Heuten, W., & Boll, S. (2012). Tacticycle: Supporting exploratory bicycle trips. In Proceedings of the 14th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’12 (pp. 369–378), New York, NY.
  • Pier, M. D., & Goldberg, I. R. (2005). Using water as interface media in vr applications. In Proceedings of the 2005 Latin American Conference on Human–Computer Interaction, CLIHC ’05 (pp. 162–169), New York, NY.
  • Poláček, O., Grill, T., & Tscheligi, M. (2012). Towards a navigation system for blind people: A Wizard of Oz study. ACM SIGACCESS Accessibility and Computing, 104, 12–29.
  • Prasanna, P., Jain, S., Bhagat, N., & Madabhushi, A. (2013). Decision support system for detection of diabetic retinopathy using smartphones. In Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare, PervasiveHealth ’13 (pp. 176–179), ICST, Brussels, Belgium.
  • Pressl, B., & Wieser, M. (2006). A computer-based navigation system tailored to the needs of blind people. In Proceedings of the 10th International Conference on Computers Helping People with Special Needs, ICCHP’06 (pp. 1280–1286), Linz, Austria.
  • Project-RAY. (2016). Smart phones for blind, cells for visually impaired. Retrieved from http://project-ray.com
  • Qian, H., Kuber, R., & Sears, A. (2013). Developing tactile icons to support mobile users with situationally-induced impairments and disabilities. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’13 (pp. 47:1–47:2). New York, NY.
  • Quek, F., & Oliveira, F. (2013). Enabling the blind to see gestures. ACM Transactions on Computer–Human Interaction, 20 (1), 1–32.
  • Rädle, R., Jetter, H.-C., Schreiner, M., Lu, Z., Reiterer, H., & Rogers, Y. (2015). Spatially-aware or spatially-agnostic? Elicitation and evaluation of user-defined cross-device interactions. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 3913–3922). New York, NY.
  • Rekik, Y., Grisoni, L., & Roussel, N. (2013). Towards many gestures to one command: A user study for tabletops. In Proceedings of the 14th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT ’13 (pp. 246–263), Cape Town, South Africa.
  • Rekik, Y., Vatavu, R.-D., & Grisoni, L. (2014). Understanding users’ perceived difficulty of multi-touch gesture articulation. In Proceedings of the 16th International Conference on Multimodal Interaction, ICMI ’14 (pp. 232–239), New York, NY.
  • Rico, J., & Brewster, S. (2009). Gestures all around us: User differences in social acceptability perceptions of gesture based interfaces. In Proceedings of the 11th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’09 (pp. 64:1–64:2). New York, NY.
  • Rico, J., & Brewster, S. (2010). Usable gestures for mobile interfaces: Evaluating social acceptability. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’10 (pp. 887–896). New York, NY.
  • Rock, I. (1997). Indirect Perception. Cambridge, MA: MIT Press.
  • Rock, I., Linnett, C. M., & Grant, P. (1992). Perception without attention: Results of a new method. Cognitive Psychology, 24 (4), 502–534.
  • Rodriguez, R., Garretón, J. Y., & Pattini, A. (2016). Glare and cognitive performance in screen work in the presence of sunlight. Lighting Research and Technology, 48 (2), 221–238.
  • Rotard, M., Taras, C., & Ertl, T. (2008). Tactile web browsing for blind people. Multimedia Tools and Applications, 37 (1), 53–69.
  • Rovner, B., Casten, R., Hegel, M., Leiby, B., & Tasman, W. (2007). Preventing depression in age-related macular degeneration. Archives of General Psychiatry, 64 (8), 886–892.
  • Rubine, D. (1991). Specifying gestures by example. In Proceedings of the 18th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH ’91 (pp. 329–337), New York, NY.
  • Sagata, Y., Watanabe, M., & Asano, Y. (2007). Voiceblog for blind and weak-eyed people. In Proceedings of the 4th International Conference on Universal Access in Human–Computer Interaction: Applications and Services, UAHCI ’07 (pp. 961–969).
  • Sánchez, J., & Aguayo, F. (2007). Mobile messenger for the blind. In Proceedings of the 9th Conference on User Interfaces for All, ERCIM’06 (pp. 369–385), Berlin, Heidelberg.
  • Schauerte, B., Martinez, M., Constantinescu, A., & Stiefelhagen, R. (2012). An assistive vision system for the blind that helps find lost things. In Proceedings of the 13th International Conference on Computers Helping People with Special Needs, ICCHP’12 (pp. 566–572), Berlin, Heidelberg.
  • Schildbach, B., & Rukzio, E. (2010). Investigating selection and reading performance on a mobile phone while walking. In Proceedings of the 12th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’10 (pp. 93–102). New York, NY.
  • Schmidt, A., Aidoo, K. A., Takaluoma, A., Tuomela, U., Laerhoven, K. V., & Velde, W. V. D. (1999). Advanced interaction in context. In Proceedings of the 1st International Symposium on Handheld and Ubiquitous Computing, HUC ’99 (pp. 89–101), London, UK.
  • Schönauer, C., Mossel, A., Zaiti, I.-A., & Vatavu, R.-D. (2015). Touch, movement & vibration: User perception of vibrotactile feedback for touch and mid-air gestures. In Proceedings of the 15th IFIP TC 13 International Conference on Human–Computer Interaction, INTERACT ’15 (pp. 165–172), Bamberg, Germany.
  • Schuchhardt, M., Jha, S., Ayoub, R., Kishinevsky, M., & Memik, G. (2015). Optimizing mobile display brightness by leveraging human visual perception. In Proceedings of the 2015 International Conference on Compilers, Architecture and Synthesis for Embedded Systems, CASES ’15 (pp. 11–20), Piscataway, NJ.
  • Sears, A., Lin, M., Jacko, J., & Xiao, Y. (2003). When computers fade: Pervasive computing and situationally-induced impairments and disabilities. In Proceedings of HCI International ’03 (pp. 1298–1302), Crete, Greece.
  • Senra, H., Barbosa, F., Ferreira, P., Vieira, C. R., Perrin, P. B., Rogers, H., … Leal, I. (2015). Psychologic adjustment to irreversible vision loss in adults: A systematic review. Ophthalmology, 122 (4), 851–861.
  • Shaik, A. S., Hossain, G., & Yeasin, M. (2010). Design, development and performance evaluation of reconfigured mobile android phone for people who are blind or visually impaired. In Proceedings of the 28th ACM International Conference on Design of Communication, SIGDOC ’10 (pp. 159–166), New York, NY.
  • Shen, F. (2016). Talking location. Retrieved from https://itunes.apple.com/us/app/talking-location/id887104200?mt=8
  • Shinohara, K., & Tenenberg, J. (2007). Observing sara: A case study of a blind person’s interactions with technology. In Proceedings of the 9th International ACM SIGACCESS Conference on Computers and Accessibility, Assets ’07 (pp. 171–178), New York, NY.
  • Siqueira, J., Soares, F. A. A. d. M. N., Ferreira, D. J., Silva, C. R. G., Berretta, d. O., Ferreira, C. B. R., Félix, I. M., Soares, A. d. S., Costa, R. M. d., & Luna, M. M. (2016). Braille text entry on smartphones: A systematic review of the literature. In Proceedings of the 2016 IEEE 40th Annual Computer Software and Applications Conference (COMPSAC) (pp. 521–526), Atlanta, GA.
  • SmartTools. (2016). Smart magnifier. Retrieved from https://goo.gl/9XFBTS
  • Song, J., Sörös, G., Pece, F., Fanello, S. R., Izadi, S., Keskin, C., & Hilliges, O. (2014). In-air gestures around unmodified mobile devices. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST ’14 (pp. 319–329), New York, NY.
  • Statista. (2016). Smartphone users worldwide 2014–2019 | Statistic. Retrieved from http://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/
  • Su, C.-H., Chan, L., Weng, C.-T., Liang, R.-H., Cheng, K.-Y., & Chen, B.-Y. (2013). Naildisplay: Bringing an always available visual display to fingertips. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 1461–1464), New York, NY.
  • Szpiro, S., Hashash, S., Zhao, Y., & Azenkot, S. (2016a). How people with low vision access computing devices: Understanding challenges and opportunities. In Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS ’16, Reno, NV.
  • Szpiro, S., Zhao, Y., & Azenkot, S. (2016b). Finding a store, searching for a product: A study of daily challenges of low vision people. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp ’16 (pp. 61–72), New York, NY.
  • Szymczak, D., Magnusson, C., & Rassmus-Gröhn, K. (2012). Guiding tourists through haptic interaction: Vibration feedback in the lund time machine. In Proceedings of the 2012 International Conference on Haptics: Perception, Devices, Mobility, and Communication - Volume Part II, EuroHaptics’12 (pp. 157–162), Berlin, Heidelberg.
  • Tanuwidjaja, E., Huynh, D., Koa, K., Nguyen, C., Shao, C., Torbett, P., … Weibel, N. (2014). Chroma: A wearable augmented-reality solution for color blindness. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp ’14 (pp. 799–810), New York, NY.
  • Tian, Y., & Yuan, S. (2010). Clothes matching for blind and color blind people. In Proceedings of the 12th International Conference on Computers Helping People with Special Needs, ICCHP’10 (pp. 324–331), Vienna, Austria.
  • Tinwala, H., & MacKenzie, I. S. (2010). Eyes-free text entry with error correction on touchscreen mobile devices. In Proceedings of the 6th Nordic Conference on Human–Computer Interaction: Extending Boundaries, NordiCHI ’10 (pp. 511–520), New York, NY.
  • Tomlinson, B. J., Schuett, J. H., Shortridge, W., Chandran, J., & Walker, B. N. (2016). Talkin’ about the weather: Incorporating talkback functionality and sonifications for accessible app design. In Proceedings of the 18th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’16 (pp. 377–386), New York, NY.
  • Tu, H., Ren, X., Tian, F., & Wang, F. (2014). Evaluation of flick and ring scrolling on touch-based smartphones. International Journal of Human–Computer Interaction, 30 (8), 643–653.
  • Tu, H., Ren, X., & Zhai, S. (2015). Differences and similarities between finger and pen stroke gestures on stationary and mobile devices. ACM Transactions on Computer–Human Interaction, 22 (5), 1–39.
  • Turunen, M., Valkama, P., Rajaniemi, J.-P., Raisamo, R., Heimonen, T., Rantala, J., … Raisamo, R. (2010). Accessible multimodal media center application for blind and partially sighted people. Computers in Entertainment, 8 (3), 1–30.
  • Vatavu, R.-D., Anthony, L., & Wobbrock, J. O. (2012). Gestures as point clouds: A $P recognizer for user interface prototypes. In Proceedings of the 14th ACM International Conference on Multimodal Interaction, ICMI ’12 (pp. 273–280), New York, NY.
  • Vatavu, R.-D., Anthony, L., & Wobbrock, J. O. (2013). Relative accuracy measures for stroke gestures. In Proceedings of the 15th ACM on International Conference on Multimodal Interaction, ICMI ’13 (pp. 279–286), New York, NY.
  • Vatavu, R.-D., Anthony, L., & Wobbrock, J. O. (2014). Gesture heatmaps: Understanding gesture performance with colorful visualizations. In Proceedings of the 16th International Conference on Multimodal Interaction, ICMI ’14 (pp. 172–179), New York, NY.
  • Vatavu, R.-D., Cramariuc, G., & Schipor, D. M. (2015). Touch interaction for children aged 3 to 6 years: Experimental findings and relationship to motor skills. International Journal of Human–Computer Studies, 74, 54–76.
  • Vatavu, R.-D., & Mancas, M. (2014). Visual attention measures for multi-screen TV. In Proceedings of the 2014 ACM International Conference on Interactive Experiences for TV and Online Video, TVX ’14 (pp. 111–118), New York, NY.
  • Vatavu, R.-D., & Mancas, M. (2015). Evaluating visual attention for multi-screen television: Measures, toolkit, and experimental findings. Personal and Ubiquitous Computing, 19 (5–6), 781–801.
  • Vatavu, R.-D., Mossel, A., & Schönauer, C. (2016). Digital vibrons: Understanding users’ perceptions of interacting with invisible, zero-weight matter. In Proceedings of the 18th International Conference on Human–Computer Interaction with Mobile Devices and Services, MobileHCI ’16 (pp. 217–226), New York, NY.
  • Vatavu, R.-D., & Wobbrock, J. O. (2015). Formalizing agreement analysis for elicitation studies: New measures, significance test, and toolkit. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (pp. 1325–1334), New York, NY.
  • Vatavu, R.-D., & Wobbrock, J. O. (2016). Between-subjects elicitation studies: Formalization and tool support. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI ’16 (pp. 3390–3402), New York, NY.
  • Vertanen, K. (2016). Counting fingers: Eyes-free text entry without touch location. Presented at CHI’16 Workshop on Inviscid Text Entry and Beyond, San Jose, CA.
  • Vogel, D., & Balakrishnan, R. (2010). Occlusion-aware interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’10 (pp. 263–272), New York, NY.
  • Vogel, D., & Casiez, G. (2012). Hand occlusion on a multi-touch tabletop. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’12 (pp. 2307–2316), New York, NY.
  • Vogel, D., Cudmore, M., Casiez, G., Balakrishnan, R., & Keliher, L. (2009). Hand occlusion with tablet-sized direct pen input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’09 (pp. 557–566), New York, NY.
  • Voykinska, V., Azenkot, S., Wu, S., & Leshed, G. (2016). How blind people interact with visual content on social networking services. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, CSCW ’16 (pp. 1584–1595), New York, NY.
  • Wall, S. A., & Brewster, S. A. (2006). Tac-tiles: Multimodal pie charts for visually impaired users. In Proceedings of the 4th Nordic Conference on Human–Computer Interaction: Changing Roles, NordiCHI ’06 (pp. 9–18), New York, NY.
  • Wang, Z., Li, N., & Li, B. (2012). Fast and independent access to map directions for people who are blind. Interacting with Computers, 24 (2), 91–106.
  • WHO. (2012). Global data on visual impairments 2010. Retrieved from http://www.who.int/blindness/GLOBALDATAFINALforweb.pdf
  • WHO. (2014). Visual impairment and blindness (Fact Sheet No. 282). Retrieved from http://www.who.int/mediacentre/factsheets/fs282/en/
  • WHO. (2016). International statistical classification of diseases and related health problems 10th revision (ICD-10) - WHO version for 2016. Retrieved from http://apps.who.int/classifications/icd10/browse/2016/en
  • Wiese, J., Saponas, T. S., & Brush, A. B. (2013). Phoneprioception: Enabling mobile phones to infer where they are kept. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13 (pp. 2157–2166), New York, NY.
  • Wimmer, R., & Echtler, F. (2013). Exploring the benefits of fingernail displays. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’13 (pp. 937–942), New York, NY.
  • Wobbrock, J. O. (2006). The future of mobile device research in HCI. In Proceedings of the CHI ’06 Workshop What is the Next Generation of Human–Computer Interaction? (pp. 131–134), Quebec, Canada.
  • Wobbrock, J. O., Aung, H. H., Rothrock, B., & Myers, B. A. (2005). Maximizing the guessability of symbolic input. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’05 (pp. 1869–1872), New York, NY.
  • Wobbrock, J. O., Kane, S. K., Gajos, K. Z., Harada, S., & Froehlich, J. (2011). Ability-based design: Concept, principles and examples. ACM Transactions on Accessible Computing, 3 (3), 1–27.
  • Wobbrock, J. O., Morris, M. R., & Wilson, A. D. (2009). User-defined gestures for surface computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’09 (pp. 1083–1092), New York, NY.
  • Wobbrock, J. O., Wilson, A. D., & Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, UIST ’07 (pp. 159–168), New York, NY.
  • Wolf, K., & Henze, N. (2014). Comparing pointing techniques for grasping hands on tablets. In Proceedings of the 16th International Conference on Human–Computer Interaction with Mobile Devices & Services, MobileHCI ’14 (pp. 53–62), New York, NY.
  • Wu, S., & Adamic, L. A. (2014). Visually impaired users on an online social network. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’14 (pp. 3133–3142), New York, NY.
  • Xu, C., Israr, A., Poupyrev, I., Bau, O., & Harrison, C. (2011). Tactile display for the visually impaired using teslatouch. In CHI ’11 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’11 (pp. 317–322), New York, NY.
  • Yang, Y., & Hwang, S. (2007). Specialized design of web search engine for the blind people. In Proceedings of the 4th International Conference on Universal Access in Human–Computer Interaction: Applications and Services, UAHCI ’07 (pp. 997–1005), Berlin, Heidelberg.
  • Ye, H., Malu, M., Oh, U., & Findlater, L. (2014). Current and future mobile and wearable device use by people with visual impairments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’14 (pp. 3123–3132), New York, NY.
  • Yoon, D., Hinckley, K., Benko, H., Guimbretière, F., Irani, P., Pahud, M., & Gavriliu, M. (2015). Sensing tablet grasp + micro-mobility for active reading. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, UIST ’15 (pp. 477–487), New York, NY.
  • Zhang, J., Ong, S. K., & Nee, A. Y. C. (2008). Navigation systems for individuals with visual impairment: A survey. In Proceedings of the 2nd International Convention on Rehabilitation Engineering & Assistive Technology, iCREATe ’08 (pp. 159–162). Kaki Bukit TechPark II, Singapore.
  • Zhao, S., & Balakrishnan, R. (2004). Simple vs. compound mark hierarchical marking menus. In Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, UIST ’04 (pp. 33–42), New York, NY.
  • Zhao, Y., Szpiro, S., & Azenkot, S. (2015). Foresee: A customizable head-mounted vision enhancement system for people with low vision. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, ASSETS ’15 (pp. 239–249), New York, NY.
  • Zhao, Y., Szpiro, S., Knighten, J., & Azenkot, S. (2016). Cuesee: Exploring visual cues for people with low vision to facilitate a visual search task. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp ’16 (pp. 73–84), New York, NY.
