Eyes-free environmental awareness for navigation

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

We consider the challenge of delivering location-based information through rich audio representations of the environment, and the opportunities such an approach offers to support navigation tasks. This challenge is addressed by In-Situ Audio Services (ISAS), a system intended primarily for use by the blind and visually impaired communities. It employs spatialized audio rendering to convey the relevant content, which may include information about the immediate surroundings, such as restaurants, cultural sites, public transportation locations, and other points of interest. Information is aggregated mostly from online data resources, converted using text-to-speech technology, and “displayed”, either as speech or as more abstract audio icons, through a location-aware mobile device or smartphone. This approach not only suits the specific constraints of the target population, but is equally useful for general mobile users whose visual attention is otherwise occupied with navigation. We designed and conducted an experiment to evaluate two techniques for delivering spatialized audio content to users via interactive auditory maps: the shockwave mode and the radar mode. Although neither mode proved significantly better than the other, participants navigated the maps competently using these rendering strategies and reacted positively to the system, demonstrating that spatial audio can be an effective technique for conveying location-based information. The results of this experiment and its implications for our project are described here.
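To make the audio rendering concrete, the sketch below (Python) shows one plausible way a system like ISAS could place a point of interest in a stereo sound field: compute the great-circle bearing from the user's position to the point of interest, take its azimuth relative to the user's heading, and map that onto equal-power left/right gains. This is a minimal illustration under assumptions of our own, not the ISAS implementation; true spatialized rendering would use binaural (HRTF) filtering, for which simple panning is only a crude stand-in, and the function names and coordinates are hypothetical.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user (lat1, lon1) to a
    point of interest (lat2, lon2), in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def pan_gains(poi_azimuth_deg, user_heading_deg):
    """Map the point of interest's azimuth, relative to the user's
    heading, onto equal-power left/right gains."""
    # Wrap the relative azimuth into [-180, 180) degrees, then radians.
    rel = math.radians((poi_azimuth_deg - user_heading_deg + 180.0) % 360.0 - 180.0)
    # Fold sources behind the listener into the frontal plane so the
    # pan angle stays within [-90, +90] degrees.
    pan = math.asin(max(-1.0, min(1.0, math.sin(rel))))
    theta = (pan + math.pi / 2.0) / 2.0      # 0 (hard left) .. pi/2 (hard right)
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# Example: a restaurant roughly due east of a user facing north should be
# rendered almost entirely in the right channel (coordinates are hypothetical).
az = bearing_deg(45.5048, -73.5772, 45.5049, -73.5740)
left, right = pan_gains(az, user_heading_deg=0.0)
print(f"azimuth {az:.1f} deg -> gains L={left:.2f}, R={right:.2f}")

In a deployed system these gains would be recomputed continuously as the GPS fix and compass heading change, and would drive a binaural renderer rather than a plain stereo panner.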


Author information

Correspondence to Dalia El-Shimy.

About this article

Cite this article

El-Shimy, D., Grond, F., Olmos, A. et al. Eyes-free environmental awareness for navigation. J Multimodal User Interfaces 5, 131–141 (2012). https://doi.org/10.1007/s12193-011-0065-5


Keywords

Navigation