
Towards high-fidelity multi-sensory virtual environments

  • Original Article
  • The Visual Computer

Abstract

Virtual environments are playing an increasingly important role in training people for real-world situations, especially through the use of serious games. A key concern is therefore the level of realism a virtual environment requires so that what the user perceives in the virtual world accurately matches what they would expect in the real one. Failing to achieve the right level of realism runs the very real risk that the user will adopt a different reaction strategy in the virtual world from the one desired in reality.

High-fidelity, physically-based rendering has the potential to deliver an image of the same perceptual quality as if the viewer were “there” in the real-world scene being portrayed. However, our perception of an environment is not only what we see; it may be significantly influenced by other sensory inputs, including sound, smell, feel, and even taste. Computing and delivering all of these sensory stimuli at interactive rates is a computationally complex problem: achieving true physical accuracy for each sense individually, for any complex scene in real time, is simply beyond the ability of current standard desktop computers. This paper discusses how human perception, and in particular cross-modal effects in multi-sensory perception, can be exploited to selectively deliver high-fidelity virtual environments. Selective delivery enables those parts of a scene to which the user is attending to be computed in high quality, while the remainder of the scene is delivered in lower quality, at a significantly reduced computational cost, without the user being aware of the difference.
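As a purely illustrative example (not taken from the paper), the short Python sketch below shows one way selective rendering can be expressed: each pixel receives a sample budget proportional to a saliency value, so most of the computational effort is spent where the user is assumed to be attending. The saliency() and shade_sample() functions are hypothetical stand-ins; in a real system the saliency map would come from a visual attention model combined with cross-modal cues (such as the location of a sound or smell source), and each sample would be a full physically-based radiance estimate.

import random

# Image resolution and per-pixel sample budgets (illustrative values).
WIDTH, HEIGHT = 64, 48
MIN_SAMPLES, MAX_SAMPLES = 4, 64

def saliency(x, y):
    # Toy saliency map: highest at the image centre, falling off radially.
    # A real system would use a perceptual model plus cross-modal cues.
    dx = (x - WIDTH / 2) / (WIDTH / 2)
    dy = (y - HEIGHT / 2) / (HEIGHT / 2)
    return max(0.0, 1.0 - (dx * dx + dy * dy) ** 0.5)

def shade_sample(x, y):
    # Stand-in for one high-fidelity radiance sample (e.g. one traced path).
    return random.random()

def render():
    image = [[0.0] * WIDTH for _ in range(HEIGHT)]
    total_samples = 0
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # Spend more samples where the viewer is likely to attend,
            # and only a minimal budget everywhere else.
            n = MIN_SAMPLES + int(saliency(x, y) * (MAX_SAMPLES - MIN_SAMPLES))
            image[y][x] = sum(shade_sample(x, y) for _ in range(n)) / n
            total_samples += n
    return image, total_samples

if __name__ == "__main__":
    _, used = render()
    full_cost = WIDTH * HEIGHT * MAX_SAMPLES
    print("samples used: %d of %d (%.1f%% of uniform high quality)"
          % (used, full_cost, 100.0 * used / full_cost))

Under these assumed budgets the frame is computed at a small fraction of the cost of uniformly high quality, which is the kind of saving selective delivery aims to exploit without the user noticing the quality difference.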



Author information


Corresponding author

Correspondence to Alan Chalmers.


Cite this article

Chalmers, A., Debattista, K. & Ramic-Brkic, B. Towards high-fidelity multi-sensory virtual environments. Vis Comput 25, 1101–1108 (2009). https://doi.org/10.1007/s00371-009-0389-2
