
Psychoacoustic auditory display for navigation: an auditory assistance system for spatial orientation tasks

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

A psychoacoustic auditory display for navigation is presented. Interactive sonification guides users to an invisible target location in two-dimensional space. Orthogonal spatial dimensions are mapped to auditory qualities that are likewise perceptually orthogonal. The psychoacoustic auditory display could serve as an alternative or complement to conventional assistance systems for vehicle or airplane control, or for minimally invasive surgery. The approach is evaluated in an experiment that compares the performance of 18 participants approaching (i) a visually presented target and (ii) an invisible target guided by sound alone. Results demonstrate that users are able to integrate the sonified information to find the right angle and distance, or to segregate the two spatial axes and interpret them one at a time. Auditory navigation takes significantly longer than visual navigation, but path lengths do not differ significantly.
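To make the mapping idea more concrete, the sketch below illustrates a generic parameter-mapping sonification for two-dimensional guidance. It is not the psychoacoustic mapping evaluated in the paper; as an assumed example, it maps the horizontal offset to tone frequency and the vertical offset to an amplitude-modulation (beating) rate, so that each axis can be attended to separately while both remain audible in a single signal. All function names, parameters, and ranges are illustrative assumptions.

```python
# Hedged sketch: a generic parameter-mapping sonification for 2D guidance.
# This is NOT the authors' exact psychoacoustic mapping; it only illustrates
# mapping two orthogonal spatial axes to two largely independent auditory
# qualities: pitch for the x-offset, amplitude-modulation rate for the y-offset.

import numpy as np

SAMPLE_RATE = 44100  # Hz


def sonify_offset(dx: float, dy: float, duration: float = 0.5) -> np.ndarray:
    """Return a mono audio buffer encoding the 2D offset (dx, dy) in [-1, 1]^2.

    dx -> tone frequency (200 Hz at dx = -1 up to 800 Hz at dx = +1)
    dy -> amplitude-modulation rate (0 Hz on target, up to 16 Hz at |dy| = 1)
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)

    # Horizontal offset mapped to pitch on a logarithmic (perceptually even) scale.
    freq = 400.0 * (2.0 ** dx)  # 200..800 Hz for dx in [-1, 1]
    carrier = np.sin(2.0 * np.pi * freq * t)

    # Vertical offset magnitude mapped to modulation rate; the sign of dy would
    # need a further cue (e.g. modulation waveform), omitted here for brevity.
    mod_rate = 16.0 * abs(dy)  # Hz
    if mod_rate > 0.0:
        modulator = 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_rate * t))
    else:
        modulator = np.ones_like(t)

    return 0.3 * carrier * modulator  # scaled to avoid clipping


# Example: a target up and to the left yields a low, strongly fluctuating tone;
# the fluctuation vanishes and the pitch settles at 400 Hz as the user closes in.
buffer = sonify_offset(dx=-0.5, dy=0.8)
```

In such a scheme the signal becomes a steady, unmodulated tone at the reference frequency exactly when the listener reaches the target, giving a clear "on target" percept without any visual feedback.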


Notes

  1. Visit https://tinyurl.com/ycwmdh8r.


Author information

Corresponding author

Correspondence to Tim Ziemer.



About this article


Cite this article

Ziemer, T., Schultheis, H. Psychoacoustic auditory display for navigation: an auditory assistance system for spatial orientation tasks. J Multimodal User Interfaces 13, 205–218 (2019). https://doi.org/10.1007/s12193-018-0282-2


Keywords

Navigation