Because graphical user interfaces (GUIs) are largely inaccessible to computer users who are visually impaired, researchers have proposed approaches for adapting GUIs to auditory interfaces. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews current research in the field, classifies the methods and approaches that have been taken, and discusses the extent to which researchers have resolved these issues.
References
1.
Alty, J. L., & Rigas, D. I. (1998). Communicating graphical information to blind users using music: The role of context. CHI '98: Conference on Human Factors in Computing Systems (pp. 574–581). New York: ACM Press.
2.
Asakawa, C., & Itoh, T. (1998). User interface of a home page reader. Proceedings of the Third International ACM Conference on Assistive Technologies (pp. 149–156). New York: ACM Press.
3.
Asakawa, C., Takagi, H., Ino, S., & Ifukube, T. (2002). Auditory and tactile interfaces for representing visual effects on the web. Proceedings of the Fifth International ACM Conference on Assistive Technologies (pp. 65–72). New York: ACM Press.
4.
Beddoes, M. P. (1968). An inexpensive reading instrument with a sound output for the blind. IEEE Transactions on Biomedical Engineering, 15, 70–79.
5.
Blattner, M., Sumikawa, D., & Greenberg, R. (1989). Earcons and icons: Their structure and common design principles. Human-Computer Interaction, 4, 11–44.
6.
Bly, S. (1982). Presenting information in sound. Proceedings of the 1982 Conference on Human Factors in Computing Systems (pp. 371–375). New York: ACM Press.
7.
Brewster, S. A. (2002). Visualisation tools for blind people using multiple modalities. Disability and Rehabilitation, 24, 613–621.
8.
Brewster, S. A., Raty, V., & Kortekangas, A. (1995). Representing complex hierarchies with earcons (Working Paper ERCIM-05/95R037, European Research Consortium for Informatics and Mathematics Research Reports) [Online]. Available: ftp://ftp.inria.fr/associations/ERCIM/research_reports/pdf/0595R037.pdf
9.
Brewster, S. A., Wright, P. C., & Edwards, A. D. N. (1993). An evaluation of earcons for use in auditory human-computer interfaces. Proceedings of the INTERCHI '93 Conference on Human Factors in Computing Systems (pp. 222–227). New York: ACM Press.
10.
Buxton, W. (1989). Introduction to this special issue on non-speech audio. Human-Computer Interaction, 4, 1–9.
11.
D'Albe, F. (1920). The Optophone: An instrument for reading by ear. Nature, 105, 295–296.
12.
Donker, H., Klante, P., & Gorny, P. (2002). The design of auditory user interfaces for blind users. Proceedings of the Second Nordic Conference on Human-Computer Interaction (pp. 149–155). New York: ACM Press.
13.
Edwards, A. D. N. (1989). Modelling blind users' interactions with an auditory computer interface. International Journal of Man-Machine Studies, 30, 575–589.
14.
Edwards, W. K., Mynatt, E., & Stockton, K. (1994). Providing access to graphical user interfaces—not graphical screens. Proceedings of the First Annual ACM Conference on Assistive Technologies (pp. 47–51). New York: ACM Press.
15.
FNB. (2004). TeDUB: Technical drawings understanding for the blind [Online]. Available: http://www.tedub.net
16.
Gaver, W. W. (1986). Auditory icons: Using sound in computer interfaces. Human-Computer Interaction, 2, 167–177.
17.
Gaver, W. W. (1993). Synthesizing auditory icons. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 228–235). New York: ACM Press.
18.
Gerber, E. (2003). The benefits of and barriers to computer use for individuals who are visually impaired. Journal of Visual Impairment & Blindness, 97, 536–550.
19.
Itoh, K., & Yonezawa, Y. (1990). Support system for handwriting characters and drawing figures for the blind using feedback of sound imaging signals. Journal of Microcomputer Applications, 13, 177–183.
20.
Kennel, A. R. (1996). Audiograf: A diagram reader for the blind. Proceedings of the Second Annual ACM Conference on Assistive Technologies (pp. 51–56). New York: ACM Press.
21.
Ludwig, L. F., Pincever, N., & Cohen, M. (1990). Extending the notion of a window system to audio. IEEE Computer, 23, 66–72.
22.
Lunney, D., & Morrison, R. C. (1981). High technology laboratory aids for visually handicapped chemistry students. Journal of Chemical Education, 58, 228–231.
23.
Marx, A. N. (1994). Using metaphor effectively in user interface design. Conference Companion on Human Factors in Computing Systems (pp. 1–2). New York: ACM Press.
24.
Meijer, P. B. L. (1992). An experimental system for auditory image representations. IEEE Transactions on Biomedical Engineering, 39, 112–121.
25.
Mynatt, E. (1992). Auditory presentation of graphical user interfaces. Proceedings of the 1992 International Conference on Auditory Display (pp. 1–18). Santa Fe, NM: Addison-Wesley.
26.
Mynatt, E. (1997). Transforming graphical interfaces into auditory interfaces for blind users. Human-Computer Interaction, 12, 7–45.
27.
Mynatt, E., & Edwards, W. K. (1992a). Mapping GUIs to auditory interfaces. Proceedings of the Fifth Annual Symposium on User Interface Software and Technology (pp. 61–70). New York: ACM Press.
28.
Mynatt, E., & Edwards, W. K. (1992b). The Mercator environment: A nonvisual interface to X Windows and UNIX workstations. Proceedings of the ACM Symposium on User Interface Software and Technology (pp. 92–105). New York: ACM Press.
29.
Mynatt, E., & Edwards, W. K. (1995). Metaphors for nonvisual computing. In Edwards, A., & Long, J. (Eds.), Extraordinary human-computer interaction: Interfaces for users with disabilities (pp. 201–220). New York: Cambridge University Press.
30.
Mynatt, E., & Weber, G. (1994). Nonvisual presentation of graphical user interfaces: Contrasting two approaches. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Celebrating Interdependence (pp. 166–172). New York: ACM Press.
Perrett, S., & Noble, W. (1997). The effect of head rotations on vertical plane sound localization. Journal of the Acoustical Society of America, 102, 2325–2332.
33.
Petrie, H., Schlieder, C., Blenkhorn, P., Evans, G., King, A., O'Neill, A.-M., Ioannidis, G., Gallager, B., Crombie, D., Mager, R., & Alafaci, M. (2002). TeDUB: A system for presenting and exploring technical drawings for blind people. In Miesenberger, K., Klaus, J., & Zagler, W. (Eds.), Computers helping people with special needs (pp. 537–539). New York: Springer.
34.
Ramloll, R., Brewster, S., Yu, W., & Riedel, B. (2001). Using non-speech sounds to improve access to 2-D tabular numerical information for visually impaired users. Proceedings of the First Workshop on Human Computer Interaction with Mobile Devices, 515–530.
35.
Roth, P., Petrucci, L., Pun, T., & Assimacopoulos, A. (1999). Auditory browser for blind and visually impaired users. CHI '99 Extended Abstracts on Human Factors in Computing Systems (pp. 218–219). New York: ACM Press.
36.
Savidis, A., Stephanidis, C., Korte, A., Crispien, K., & Fellbaum, K. (1996). A generic direct-manipulation 3-D auditory environment for hierarchical navigation in non-visual interaction. Proceedings of the Second Annual ACM Conference on Assistive Technologies (pp. 117–123). New York: ACM Press.
37.
Shneiderman, B. (2003). Designing the user interface (3rd ed.). Reading, MA: Addison-Wesley.
38.
Tobias, J. (2003). Information technology and universal design: An agenda for accessible technology. Journal of Visual Impairment & Blindness, 97, 592–601.
39.
van Dam, A. (1997). Post-WIMP user interfaces. Communications of the ACM, 40, 63–67.