Subtle Visual Attention Guidance in VR

Chapter in Real VR – Immersive Digital Reality

Abstract

The research field of visual attention guidance in virtual reality (VR) explores ways to help viewers find their way through immersive environments. A specialized area within this field, subtle guidance, pursues this goal with as little distraction as possible, in order to avoid misrepresenting the actual scene content and degrading immersion and presence in VR. This chapter provides an introduction to the general topic and to commonly used terminology, and briefly presents some exemplary approaches.
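
To make the general idea more concrete, the sketch below outlines one common ingredient of subtle guidance approaches: a gaze-contingent stimulus that is shown only in the viewer's visual periphery and is attenuated as soon as the gaze turns toward it, so the cue ideally never reaches conscious awareness. This is an illustrative sketch only, not the specific method presented in this chapter; the function name, eccentricity threshold, fade band, and modulation amplitude are hypothetical choices.

    import numpy as np

    # Illustrative sketch (not the chapter's own algorithm): a gaze-contingent
    # modulation that is applied only in the viewer's visual periphery and faded
    # out as the gaze approaches the target region. All names and thresholds are
    # hypothetical choices for illustration.

    def stimulus_intensity(gaze_dir, target_dir,
                           min_ecc_deg=10.0,     # only stimulate beyond ~10 deg eccentricity
                           fade_deg=5.0,         # ramp width as gaze approaches the target
                           max_intensity=0.15):  # subtle modulation amplitude (e.g. luminance)
        """Return the modulation amplitude for a peripheral guiding stimulus.

        gaze_dir, target_dir: 3D view-space direction vectors, e.g. from an HMD eye tracker.
        """
        gaze = np.asarray(gaze_dir, dtype=float)
        target = np.asarray(target_dir, dtype=float)
        gaze /= np.linalg.norm(gaze)
        target /= np.linalg.norm(target)

        # Angular distance between current gaze and the intended target, in degrees.
        ecc = np.degrees(np.arccos(np.clip(np.dot(gaze, target), -1.0, 1.0)))

        if ecc <= min_ecc_deg:
            return 0.0  # gaze is (nearly) on target: switch the cue off entirely
        # Ramp the amplitude up smoothly with eccentricity to avoid an abrupt onset.
        ramp = np.clip((ecc - min_ecc_deg) / fade_deg, 0.0, 1.0)
        return max_intensity * ramp

    if __name__ == "__main__":
        gaze = [0.0, 0.0, 1.0]        # viewer looking straight ahead
        target = [0.35, 0.0, 0.94]    # target roughly 20 deg to the right
        print(stimulus_intensity(gaze, target))  # cue active, at full (but still subtle) amplitude

In a real renderer, this amplitude would drive a small, temporally modulated change of the target region (for example a brightness or flicker modulation), updated every frame from the current eye-tracking sample.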

Acknowledgements

The authors gratefully acknowledge funding by the German Science Foundation (DFG MA2555/15-1 “Immersive Digital Reality” and DFG INST 188/409-1 FUGG “ICG Dome”).

Author information

Corresponding author

Correspondence to Steve Grogorick.

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Grogorick, S., Magnor, M. (2020). Subtle Visual Attention Guidance in VR. In: Magnor, M., Sorkine-Hornung, A. (eds.) Real VR – Immersive Digital Reality. Lecture Notes in Computer Science, vol. 11900. Springer, Cham. https://doi.org/10.1007/978-3-030-41816-8_11

  • DOI: https://doi.org/10.1007/978-3-030-41816-8_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41815-1

  • Online ISBN: 978-3-030-41816-8
