Abstract
In recent years, a growing number of research projects have explored the use of auditory representations across a broadening range of application scenarios. These projects have shown that auditory elements can effectively complement other modalities not only in the traditional desktop computing environment but also in virtual and augmented reality, on mobile platforms, and in other novel computing environments. The successful use of auditory representations in this expanding set of scenarios has in turn prompted researchers to revisit the more basic auditory representations and extend them in various directions. The goal of this article is to survey both classical auditory representations (e.g., auditory icons and earcons) and those auditory representations that have been created as extensions of earlier approaches, including speech-based sounds (e.g., spearcons and the spindex), emotionally grounded sounds (e.g., auditory emoticons and spemoticons), and various other sound types used for sonification in practical scenarios. The article concludes by outlining the latest trends in auditory interface design and providing examples of these trends.
Supplemental Material
Supplemental movie, appendix, image, and software files for "Overview of auditory representations in human-machine interfaces" are available for download.