Abstract
Nonlinear forms of diagrammatic presentation, such as node-arc graphs, are a powerful and elegant means of visual information presentation. Although providing nonvisual access is now routine for many forms of linear information, it becomes more difficult as the structure of the information becomes increasingly nonlinear. An understanding of the ways in which graphs benefit sighted people, drawn from experiments and the literature, together with the difficulties encountered when exploring graphs nonvisually, informs a solution for nonvisual access to graphs. This article proposes that differing types of annotation offer a powerful and flexible technique for transferring the benefits of graph-based diagrams, for reducing disorientation while moving around the graph, and for tackling some of the inherent disadvantages of using sound. Different forms of annotation that may address these problems are explored, classified, and evaluated, including notes designed to summarise and to aid node differentiation. Graph annotation may be performed automatically; evaluation shows that the resulting graph requires less mental effort to explore and supports more effective and more efficient task completion.
Notes
1We use the UK terminology, where a person is considered blind if his or her visual acuity (as measured, e.g., using a Snellen letter chart) is less than 0.05, and is considered to have low vision (i.e., be partially sighted) if his or her visual acuity is greater than 0.05 but less than 0.3 (International Council of Ophthalmology, 2002). Visually impaired is a term that covers people who are either blind or partially sighted.
2Note that these terms are used differently by Lynch (1960) than when applied to graphs.
Background. This article is based on the Ph.D. thesis of the first author.
Support. This research was undertaken while funded by an Engineering and Physical Sciences Research Council studentship.
HCI Editorial Record. First manuscript received May 15, 2009. Revision received November 5, 2010. Accepted by Rob Jacob. Final manuscript received August 12, 2011. — Editor