Review Article

Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review


References

  • N. A. Giudice, and G. E. Legge, “Blind navigation and the role of technology,” in The Engineering Handbook of Smart Technology for Aging, Disability, and Independence, 2008, pp. 479–500.
  • A. Riazi, F. Riazi, R. Yoosfi, and F. Bahmeei, “Outdoor difficulties experienced by a group of visually impaired Iranian people,” J. Curr. Ophthalmol., Vol. 28, no. 2, pp. 85–90, 2016. doi: https://doi.org/10.1016/j.joco.2016.04.002
  • R. Manduchi, S. Kurniawan, and H. Bagherinia, “Blind guidance using mobile computer vision: a usability study,” in ASSETS, 2010, pp. 241–242.
  • B.-S. Lin, C.-C. Lee, and P.-Y. Chiang, “Simple smartphone-based guiding system for visually impaired people,” Sensors, Vol. 17, no. 6, pp. 1371, 2017. doi: https://doi.org/10.3390/s17061371
  • Y. Zhao, E. Kupferstein, D. Tal, and S. Azenkot, “It looks beautiful but scary: How low vision people navigate stairs and other surface level changes,” in Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility, ACM, 2018, pp. 307–320.
  • H. Devlin. Echolocation could help blind people learn to navigate like bats, Feb 2018.
  • Acoustical Society of America. Exploring the potential of human echolocation, Jun 2017.
  • L. Thaler, and M. A. Goodale, “Echolocation in humans: an overview,” Wiley Interdisciplinary Reviews: Cognitive Science, Vol. 7, no. 6, pp. 382–393, 2016.
  • M. Srikulwong, and E. O’Neill, “Tactile representation of landmark types for pedestrian navigation: user survey and experimental evaluation,” in Workshop on using audio and Haptics for delivering spatial information via mobile devices at MobileHCI 2010, 2010, pp. 18–21.
  • M. C. Holbrook, and A. Koenig. History and theory of teaching children and youths with visual impairments. Foundations of Education, Vol. I, 2000.
  • R. G. Long, “Orientation and Mobility research: what is known and what needs to be known,” Peabody J. Educ., Vol. 67, no. 2, pp. 89–109, 1990. doi: https://doi.org/10.1080/01619569009538683
  • J. Sánchez, M. Espinoza, M. de Borba Campos, and L. B. Merabet, “Enhancing orientation and mobility skills in learners who are blind through video gaming,” in Proceedings of the 9th ACM Conference on Creativity & Cognition, ACM, 2013, pp. 353–356.
  • A. Bhowmick, and S. M. Hazarika, “An insight into assistive technology for the visually impaired and blind people: state-of-the-art and future trends,” J. Multimodal User Interfaces, Vol. 11, no. 2, pp. 149–172, 2017. doi: https://doi.org/10.1007/s12193-016-0235-6
  • M. J. Field. Assistive and mainstream technologies for people with disabilities.
  • S. Khenkar, H. Alsulaiman, S. Ismail, A. Fairaq, S. K. Jarraya, and H. Ben-Abdallah, “ENVISION: Assisted navigation of visually impaired smartphone users,” Procedia. Comput. Sci., Vol. 100, pp. 128–135, 2016. doi: https://doi.org/10.1016/j.procs.2016.09.132
  • Á. Csapó, G. Wersényi, H. Nagy, and T. Stockman, “A survey of assistive technologies and applications for blind users on mobile platforms: a review and foundation for research,” J. Multimodal User Interfaces, Vol. 9, no. 4, pp. 275–286, 2015. doi: https://doi.org/10.1007/s12193-015-0182-7
  • L. Ran, S. Helal, and S. Moore, “Drishti: an integrated indoor/outdoor blind navigation system and service,” in Proceedings of the Second IEEE Annual Conference on Pervasive Computing and Communications (PerCom 2004), IEEE, 2004, pp. 23–30.
  • R. Tapu, B. Mocanu, and E. Tapu, “A survey on wearable devices used to assist the visual impaired user navigation in outdoor environments,” in 2014 11th International Symposium on Electronics and Telecommunications (ISETC), Nov 2014, pp. 1–4.
  • C. S. Silva, and P. Wimalaratne, “State of art in indoor navigation and positioning of visually impaired and blind,” in 17th International Conference on Advances in ICT for Emerging Regions (ICTer 2017) - Proceedings, 2018, pp. 271–279.
  • Z. Fei, E. Yang, H. Hu, and H. Zhou, “Review of machine vision-based electronic travel aids,” in 2017 23rd International Conference on Automation and Computing (ICAC), IEEE, 2017, pp. 1–7.
  • A. Hojjat. Enhanced navigation systems in GPS denied environments for visually impaired people: A survey. arXiv preprint arXiv:1803.05987, 2018.
  • P. Chanana, R. Paul, M. Balakrishnan, and P. V. M. Rao, “Assistive technology solutions for aiding travel of pedestrians with visual impairment,” J. Rehabil. Assist. Technol. Eng., Vol. 4, pp. 2055668317725993, 2017.
  • R. Manduchi. “Mobile vision as assistive technology for the blind: An experimental study,” in Computers helping people with special needs, K. Miesenberger, A. Karshmer, P. Penaz, and W. Zagler, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012, pp. 9–16.
  • V.-N. Hoang, T.-H. Nguyen, T.-L. Le, T.-T. H. Tran, T.-P. Vuong, and N. Vuillerme, “Obstacle detection and warning for visually impaired people based on electrode matrix and mobile Kinect,” in 2015 2nd National Foundation for Science and Technology Development Conference on Information and Computer Science (NICS), IEEE, 2015, pp. 54–59.
  • H.-C. Huang, C.-T. Hsieh, and C.-H. Yeh, “An indoor obstacle detection system using depth information and region growth,” Sensors, Vol. 15, no. 10, pp. 27116–27141, Oct 2015. doi: https://doi.org/10.3390/s151027116
  • U. R. Roentgen, G. J. Gelderblom, M. Soede, and L. P. De Witte, “The impact of electronic mobility devices for persons who are visually impaired: A systematic review of effects and effectiveness,” J. Vis. Impair. Blind., Vol. 103, no. 11, pp. 743–753, 2009. doi: https://doi.org/10.1177/0145482X0910301104
  • V. Filipe, F. Fernandes, H. Fernandes, A. Sousa, H. Paredes, and J. Barroso, “Blind navigation support system based on Microsoft Kinect,” Procedia. Comput. Sci., Vol. 14, pp. 94–101, 2012. doi: https://doi.org/10.1016/j.procs.2012.10.011
  • Wireless Technology Advisor. Disadvantages of RFID: mostly minor or you can minimize them, Nov 13, 2009.
  • S. S. Chawathe, “Low-latency indoor localization using Bluetooth beacons,” in 2009 12th International IEEE Conference on Intelligent Transportation Systems, IEEE, 2009, pp. 1–7.
  • N. Fallah, I. Apostolopoulos, K. Bekris, and E. Folmer, “Indoor human navigation systems: A survey,” Interact. Comput., Vol. 25, no. 1, pp. 21–33, 2013.
  • A. J. Moreira, R. T. Valadas, and A. M. de Oliveira Duarte, “Reducing the effects of artificial light interference in wireless infrared transmission systems,” in IET Conference Proceedings, January 1996, pp. 5–5(1).
  • D. Dakopoulos, and N. G. Bourbakis, “Wearable obstacle avoidance electronic travel aids for blind: a survey,” IEEE Trans. Syst., Man, and Cybernetics, Part C (Applications and Reviews), Vol. 40, no. 1, pp. 25–35, 2009. doi: https://doi.org/10.1109/TSMCC.2009.2021255
  • Z. Cai, D. G. Richards, M. L. Lenhardt, and A. G. Madsen, “Response of human skull to bone-conducted sound in the audiometric-ultrasonic range,” Int. Tinnitus J., Vol. 8, no. 1, pp. 3–8, 2002.
  • A. Fadell, A. Hodge, S. Zadesky, A. Lindahl, and A. Guetta. Tactile feedback in an electronic device, February 12 2013. US Patent 8,373,549.
  • R. H. Lander, and S. Haberman. Tactile feedback controlled by various medium, November 16 1999. US Patent 5,984,880.
  • S. Brewster, F. Chohan, and L. Brown, “Tactile feedback for mobile interactions,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2007, pp. 159–162.
  • E. Hoggan, S. A. Brewster, and J. Johnston, “Investigating the effectiveness of tactile feedback for mobile touchscreens,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2008, pp. 1573–1582.
  • N. G. Bourbakis, and D. Kavraki, “An intelligent assistant for navigation of visually impaired people,” in Proceedings 2nd Annual IEEE International Symposium on Bioinformatics and Bioengineering (BIBE 2001), IEEE, 2001, pp. 230–235.
  • T. Schwarze, M. Lauer, M. Schwaab, M. Romanovas, S. Böhm, and T. Jürgensohn, “A camera-based mobility aid for visually impaired people,” KI-Künstliche Intelligenz, Vol. 30, no. 1, pp. 29–36, 2016. doi: https://doi.org/10.1007/s13218-015-0407-7
  • K. Chaccour, and G. Badr, “Novel indoor navigation system for visually impaired and blind people,” in 2015 International Conference on Applied Research in Computer Science and Engineering (ICAR), IEEE, 2015, pp. 1–5.
  • N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, and M. E. Munich, “The vSLAM algorithm for robust localization and mapping,” in Proceedings of the 2005 IEEE International Conference on Robotics and Automation, IEEE, 2005, pp. 24–29.
  • J. Bai, S. Lian, Z. Liu, K. Wang, and D. Liu, “Virtual-blind-road following-based wearable navigation device for blind people,” IEEE Trans. Consum. Electron., Vol. 64, no. 1, pp. 136–143, 2018. doi: https://doi.org/10.1109/TCE.2018.2812498
  • J. Bai, D. Liu, G. Su, and Z. Fu, “A cloud and vision-based navigation system used for blind people,” in Proceedings of the 2017 International Conference on Artificial Intelligence, Automation and Control Technologies, ACM, 2017, pp. 22.
  • B. Li, J. P. Munoz, X. Rong, Q. Chen, J. Xiao, Y. Tian, A. Arditi, and M. Yousuf, “Vision-Based mobile indoor assistive navigation Aid for blind people,” IEEE Trans. Mob. Comput., Vol. 18, no. 3, pp. 702–714, 2019. doi: https://doi.org/10.1109/TMC.2018.2842751
  • E. Marder-Eppstein, “Project tango,” in ACM SIGGRAPH 2016 Real-Time Live!, SIGGRAPH '16, New York, NY, USA, Association for Computing Machinery, 2016, pp. 25.
  • J. Xiao, S. L. Joseph, X. Zhang, B. Li, X. Li, and J. Zhang, “An assistive navigation framework for the visually impaired,” IEEE Trans. Human-Mach. Syst., Vol. 45, no. 5, pp. 635–640, 2015. doi: https://doi.org/10.1109/THMS.2014.2382570
  • Y. H. Lee, and G. Medioni, “RGB-D camera based wearable navigation system for the visually impaired,” Comput. Vis. Image. Underst., Vol. 149, pp. 3–20, 2016. doi: https://doi.org/10.1016/j.cviu.2016.03.019
  • S. L. Joseph, J. Xiao, X. Zhang, B. Chawda, K. Narang, N. Rajput, S. Mehta, and L. Venkata Subramaniam, “Being aware of the world: Toward using social media to support the blind with navigation,” IEEE Trans. Human-Mach. Syst., Vol. 45, no. 3, pp. 399–405, 2015. doi: https://doi.org/10.1109/THMS.2014.2382582
  • A. Bhowmick, S. Prakash, R. Bhagat, V. Prasad, and S. M. Hazarika, “IntelliNavi: navigation for blind based on Kinect and machine learning,” in International Workshop on Multi-disciplinary Trends in Artificial Intelligence, Springer, 2014, pp. 172–183.
  • M. Sain, and D. Necsulescu, “Portable monitoring and navigation control system for helping visually impaired people,” in Proceedings of the 4th International Conference of Control, Dynamic Systems, and Robotics (CDSR’17), 2017, pp. 1–9.
  • Z. Zhang, “Microsoft Kinect sensor and its effect,” IEEE Multimedia, Vol. 19, no. 2, pp. 4–10, 2012. doi: https://doi.org/10.1109/MMUL.2012.24
  • A. Ali, and M. A. Ali, “Blind navigation system for visually impaired using windowing-based mean on Microsoft Kinect camera,” in International Conference on Advances in Biomedical Engineering (ICABME), 2017.
  • C. Ton, A. Omar, V. Szedenko, V. H. Tran, A. Aftab, F. Perla, M. J. Bernstein, and Y. Yang, “Lidar assist spatial sensing for the visually impaired and performance analysis,” IEEE Trans. Neural Syst. Rehabil. Eng., Vol. 26, no. 9, pp. 1727–1734, 2018. doi: https://doi.org/10.1109/TNSRE.2018.2859800
  • R. O’Keeffe, S. Gnecchi, S. Buckley, C. O’Murchu, A. Mathewson, S. Lesecq, and J. Foucault, “Long range lidar characterisation for obstacle detection for use by the visually impaired and blind,” in 2018 IEEE 68th Electronic Components and Technology Conference (ECTC), IEEE, 2018, pp. 533–538.
  • M. Castillo-Cara, E. Huaranga-Junco, G. Mondragón-Ruiz, A. Salazar, L. Orozco-Barbosa, and E. A. Antúnez, “Ray: smart indoor/outdoor routes for the blind using Bluetooth 4.0 BLE,” in ANT/SEIT, 2016, pp. 690–694.
  • T. Ishihara, J. Vongkulbhisal, K. M. Kitani, and C. Asakawa, “Beacon-guided structure from motion for smartphone-based navigation,” in 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), IEEE, 2017, pp. 769–777.
  • V. Nair, M. Budhai, G. Olmschenk, W. H. Seiple, and Z. Zhu, “ASSIST: personalized indoor navigation via multimodal sensors and high-level semantic information,” in Proceedings of the European Conference on Computer Vision (ECCV) Workshops, 2018.
  • V. Nair, C. Tsangouri, B. Xiao, G. Olmschenk, Z. Zhu, and W. Seiple, “A hybrid indoor positioning system for the blind and visually impaired using Bluetooth and Google Tango,” J. Technol. Persons Disabil., Vol. 6, pp. 62–82, 2018.
  • S. A. Cheraghi, V. Namboodiri, and L. Walker, “GuideBeacon: beacon-based indoor wayfinding for the blind, visually impaired, and disoriented,” in 2017 IEEE International Conference on Pervasive Computing and Communications (PerCom), IEEE, 2017, pp. 121–130.
  • B. Vamsi Krishna, and K. Aparna, “IoT-based indoor navigation wearable system for blind people,” in Artificial Intelligence and Evolutionary Computations in Engineering Systems, Springer, 2018, pp. 413–421.
  • N. Sathya Mala, S. Sushmi Thushara, and S. Subbiah, “Navigation gadget for visually impaired based on IoT,” in 2017 2nd International Conference on Computing and Communications Technologies (ICCCT), IEEE, 2017, pp. 334–338.
  • S. B. Kallara, M. Raj, R. Raju, N. J. Mathew, V. R. Padmaprabha, and D. S. Divya, “Indriya—a smart guidance system for the visually impaired,” in 2017 International Conference on Inventive Computing and Informatics (ICICI), IEEE, 2017, pp. 26–29.
  • D. Vera, D. Marcillo, and A. Pereira, “Blind guide: Anytime, anywhere solution for guiding blind people,” in World Conference on Information Systems and Technologies, Springer, 2017, pp. 353–363.
  • S. Gupta, I. Sharma, A. Tiwari, and G. Chitranshi, “Advanced guide cane for the visually impaired people,” in 2015 1st International Conference on Next Generation Computing Technologies (NGCT), IEEE, 2015, pp. 452–455.
  • A. Sen, K. Sen, and J. Das, “Ultrasonic blind stick for completely blind people to avoid any kind of obstacles,” in 2018 IEEE SENSORS, IEEE, 2018, pp. 1–4.
  • K. Patil, Q. Jawadwala, and F. C. Shu, “Design and construction of electronic aid for visually impaired people,” IEEE Trans. Human-Mach. Syst., Vol. 48, no. 2, pp. 172–182, 2018. doi: https://doi.org/10.1109/THMS.2018.2799588
  • J. Borenstein, and I. Ulrich, “The GuideCane: applying mobile robot technologies to assist the visually impaired,” IEEE Trans. Syst. Man Cybern. A: Syst. Humans, Vol. 31, no. 2, pp. 131–136, 2001.
  • A. A. Nada, M. A. Fakhr, and A. F. Seddik, “Assistive infrared sensor based smart stick for blind people,” in 2015 Science and Information Conference (SAI), IEEE, 2015, pp. 1149–1154.
  • R. Jafri, R. L. Campos, S. A. Ali, and H. R. Arabnia, “Visual and infrared sensor data-based obstacle detection for the visually impaired using the Google Project Tango tablet development kit and the Unity engine,” IEEE. Access., Vol. 6, pp. 443–454, 2017. doi: https://doi.org/10.1109/ACCESS.2017.2766579
  • P. Marzec, and A. Kos, “Low energy precise navigation system for the blind with infrared sensors,” in 2019 MIXDES - 26th International Conference “Mixed Design of Integrated Circuits and Systems”, IEEE, 2019, pp. 394–397.
  • J. Ducasse, A. M. Brock, and C. Jouffrais, “Accessible interactive maps for visually impaired users,” in Mobility of Visually Impaired People, Springer, 2018, pp. 537–584.
  • J. Albouys-Perrois, J. Laviole, C. Briant, and A. M. Brock, “Towards a multisensory augmented reality map for blind and low vision people: A participatory design approach,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018, pp. 1–14.
  • T. Götzelmann, and K. Winkler, “SmartTactMaps: a smartphone-based approach to support blind persons in exploring tactile maps,” in Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments, 2015, pp. 1–8.
  • Q. Liu, R. Li, H. Hu, and D. Gu, “Building semantic maps for blind people to navigate at home,” in 2016 8th Computer Science and Electronic Engineering (CEEC), IEEE, 2016, pp. 12–17.
  • T. Götzelmann, “LucentMaps: 3D printed audiovisual tactile maps for blind and visually impaired people,” in Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility, 2016, pp. 81–90.
  • C. Gleason, A. Guo, G. Laput, K. Kitani, and J. P. Bigham, “VizMap: Accessible visual information through crowdsourced map reconstruction,” in Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility, 2016, pp. 273–274.
  • S. Caraiman, A. Morar, M. Owczarek, A. Burlacu, D. Rzeszotarski, N. Botezatu, P. Herghelegiu, F. Moldoveanu, P. Strumillo, and A. Moldoveanu, “Computer vision for the visually impaired: The sound of vision system,” in Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops (ICCVW 2017), 2018, pp. 1480–1489.
  • G. Balakrishnan, G. Sainarayanan, R. Nagarajan, and S. Yaacob, “Wearable real-time stereo vision for the visually impaired,” Eng. Lett., Vol. 14, no. 2, pp. 6–14, 2007.
  • D. Sato, U. Oh, K. Naito, H. Takagi, K. Kitani, and C. Asakawa, “NavCog3,” in Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility - ASSETS '17, 2017, pp. 270–279.
  • A. Ganz, J. M. Schafer, Y. Tao, C. Wilson, and M. Robertson, “PERCEPT-II: smartphone based indoor navigation system for the blind,” in 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug 2014, pp. 3662–3665.
  • S. Ren, K. He, R. Girshick, and J. Sun, “Faster R-CNN: Towards real-time object detection with region proposal networks,” in Advances in Neural Information Processing Systems, 2015, pp. 91–99.
  • J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 779–788.
  • T. Mataró, F. Masulli, S. Rovetta, A. Cabri, C. Traverso, E. Capris, and S. Torretta, “An assistive mobile system supporting blind and visual impaired people when are outdoor,” in RTSI 2017 - IEEE 3rd International Forum on Research and Technologies for Society and Industry, Conference Proceedings, 2017, pp. 1–6.
  • J. Lock, G. Cielniak, and N. Bellotto. A portable navigation system with an adaptive multimodal interface for the blind. AAAI Spring Symposium Technical Report SS-17-01, pp. 395–400, 2017.
  • W. Heuten, N. Henze, S. Boll, and M. Pielot, “Tactile wayfinder: a non-visual support system for wayfinding,” in Proceedings of the 5th Nordic Conference on Human-computer Interaction: Building Bridges, ACM, 2008, pp. 172–181.
  • R. Manduchi, and J. Coughlan, “(Computer) vision without sight,” Commun. ACM, Vol. 55, no. 1, pp. 96, 2012. doi: https://doi.org/10.1145/2063176.2063200
  • C. Jayant, H. Ji, S. White, and J. P. Bigham, “Supporting blind photography,” in The Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, 2011, pp. 203–210.
  • M. Vázquez, and A. Steinfeld, “Helping visually impaired users properly aim a camera,” in Proceedings of the 14th International ACM SIGACCESS Conference on Computers and Accessibility, 2012, pp. 95–102.
  • L. Maddalena, and A. Petrosino, “Moving object detection for real-time applications,” in 14th International Conference on Image Analysis and Processing (ICIAP 2007), IEEE, 2007, pp. 542–547.
  • G. Regal, E. Mattheiss, M. Busch, and M. Tscheligi, “Insights into internet privacy for visually impaired and blind people,” in International Conference on Computers Helping People with Special Needs, Springer, 2016, pp. 231–238.
  • F. E. Sandnes, “What do low-vision users really want from smart glasses? faces, text and perhaps no glasses at all,” in International Conference on Computers Helping People with Special Needs, Springer, 2016, pp. 187–194.
  • M. Gori, G. Cappagli, A. Tonelli, G. Baud-Bovy, and S. Finocchietti, “Devices for visually impaired people: high technological devices with low user acceptance and no adaptability for children,” Neurosci. Biobehav. Rev., Vol. 69, pp. 79–88, 2016. doi: https://doi.org/10.1016/j.neubiorev.2016.06.043
  • L. Liu, W. Ouyang, X. Wang, P. Fieguth, J. Chen, X. Liu, and M. Pietikäinen, “Deep learning for generic object detection: A survey,” Int. J. Comput. Vision, Vol. 128, no. 2, pp. 261–318, 2020. doi: https://doi.org/10.1007/s11263-019-01247-4
  • Z.-Q. Zhao, P. Zheng, S. Xu, and X. Wu, “Object detection with deep learning: A review,” IEEE Trans. Neural Netw. Learn. Syst., Vol. 30, no. 11, pp. 3212–3232, 2019. doi: https://doi.org/10.1109/TNNLS.2018.2876865
  • Á. Csapó, G. Wersényi, and M. Jeon, “A survey on hardware and software solutions for multimodal wearable assistive devices targeting the visually impaired,” Acta Polytech. Hung., Vol. 13, no. 5, pp. 39, 2016.
  • S. Real, and A. Araujo, “Navigation systems for the blind and visually impaired: Past work, challenges, and open problems,” Sensors, Vol. 19, no. 15, pp. 3404, 2019. doi: https://doi.org/10.3390/s19153404
  • W. Jeamwatthanachai, M. Wald, and G. Wills, “Indoor navigation by blind people: Behaviors and challenges in unfamiliar spaces and buildings,” Brit. J. Visual Impair., Vol. 37, no. 2, pp. 140–153, 2019. doi: https://doi.org/10.1177/0264619619833723
  • M. Bousbia-Salah, and M. Fezari, “A navigation tool for blind people,” in Innovations and Advanced Techniques in Computer and Information Sciences and Engineering, Springer, 2007, pp. 333–337.
  • A. Abdolrahmani, W. Easley, M. Williams, S. Branham, and A. Hurst, “Embracing errors: Examining how context of use impacts blind individuals’ acceptance of navigation aid errors,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2017, pp. 4158–4169.
  • M. A. Williams, E. Buehler, A. Hurst, and S. K. Kane, “What not to wearable: using participatory workshops to explore wearable device form factors for blind users,” in Proceedings of the 12th Web for All Conference, 2015, pp. 1–4.
  • P. Angin, B. K. Bhargava, et al., “Real-time mobile-cloud computing for context-aware blind navigation,” Int. J. Next-Gener. Comput., Vol. 2, no. 2, pp. 405–414, 2011.