DOI: 10.1145/3561212.3561216

Matching auditory and visual room size, distance, and source orientation in virtual reality

Published: 10 October 2022

ABSTRACT

Matching visual and auditory cues in virtual reality (VR) is important for plausibility and for creating the impression of presence in a scene. This paper presents an experiment in VR in which participants match the acoustic and visual room size, distance, and orientation of a directive sound source in a simulated concert hall. The simulation is fully interactive and allows participants to move with six degrees of freedom. For all three parameters, the experiment was run in both directions: adjusting the acoustic parameters to given visual settings and adjusting the visual parameters to given acoustic settings. The results show that the adjustment generally works in both directions. However, for distance, the auditory adjustment works better and does not show the compression typically reported for auditory distance perception. For room size, the results agree with just-noticeable differences in reverberation time known from real-world experiments.
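The two cues being matched map onto well-known room-acoustic quantities: room size is tied to reverberation time, and source distance to the direct-to-reverberant energy ratio. As a rough illustration only, not the paper's actual simulation, here is a minimal Python sketch using Sabine's formula for RT60 and the standard 1/r law for the direct path; the hall volume, surface area, absorption coefficient, and critical distance are hypothetical placeholder values.

```python
from math import log10

def sabine_rt60(volume_m3: float, surface_m2: float, absorption_coeff: float) -> float:
    """Sabine's formula: T60 = 0.161 * V / (alpha * S), where alpha * S is
    the equivalent absorption area in m^2."""
    return 0.161 * volume_m3 / (absorption_coeff * surface_m2)

def direct_to_reverberant_db(distance_m: float, critical_distance_m: float) -> float:
    """Direct level falls with 1/r while the diffuse reverberant level stays
    roughly constant, so D/R is 0 dB at the critical distance and drops
    6 dB per doubling of distance beyond it."""
    return 20.0 * log10(critical_distance_m / distance_m)

# Scale a hypothetical hall up and down and print the resulting cues.
for scale in (0.5, 1.0, 2.0):
    volume = 12000.0 * scale**3   # hall volume in m^3 (placeholder)
    surface = 4000.0 * scale**2   # total surface area in m^2 (placeholder)
    rt60 = sabine_rt60(volume, surface, absorption_coeff=0.3)
    dr = direct_to_reverberant_db(distance_m=10.0 * scale, critical_distance_m=5.0)
    print(f"scale {scale:g}: RT60 = {rt60:.2f} s, D/R = {dr:+.1f} dB")
```

Scaling all room dimensions by a factor k multiplies the volume by k^3 but the surface area only by k^2, so Sabine's formula predicts RT60 to grow linearly with k, which is why a larger room sounds more reverberant at the same absorption coefficient.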


Published in

AM '22: Proceedings of the 17th International Audio Mostly Conference
September 2022, 245 pages
ISBN: 9781450397018
DOI: 10.1145/3561212

      Copyright © 2022 ACM


Publisher
Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • short-paper
      • Research
      • Refereed limited

      Acceptance Rates

Overall acceptance rate: 177 of 275 submissions, 64%
