
Gaze Visualization for Immersive Video

  • Conference paper

Part of the book series: Mathematics and Visualization (MATHVISUAL)

Abstract

In contrast to traditional video, immersive video allows viewers to interactively control their field of view in a 360° panoramic scene. However, established methods for the comparative evaluation of gaze data for video require that all participants observe the same viewing area. We therefore propose new specialized visualizations and a novel visual analytics framework for the combined analysis of head movement and gaze data. A novel View Similarity visualization highlights viewing areas branching and joining over time, while three additional visualizations provide global and spatial context. These new visualizations, along with established gaze evaluation techniques, allow analysts to investigate the storytelling of immersive videos. We demonstrate the usefulness of our approach using head movement and gaze data recorded for both amateur panoramic videos and professionally composited immersive videos.
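The View Similarity visualization rests on the idea that viewers whose viewing areas overlap at a given moment form a group, and that group membership changes (branching and joining) as the video plays. A minimal sketch of that idea, assuming head orientation is given as yaw/pitch angles and using an illustrative angular threshold and a greedy grouping strategy (not the authors' actual algorithm), might look like this:

```python
import numpy as np

def view_direction(yaw_deg, pitch_deg):
    """Convert a yaw/pitch head orientation (degrees) to a 3D unit view vector."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    return np.array([np.cos(pitch) * np.cos(yaw),
                     np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch)])

def angular_distance(d1, d2):
    """Angle in degrees between two unit view vectors."""
    return np.degrees(np.arccos(np.clip(np.dot(d1, d2), -1.0, 1.0)))

def similarity_groups(orientations, threshold_deg=30.0):
    """Greedily group viewers whose view direction lies within threshold_deg
    of a group's first member, for a single time step.
    orientations: list of (yaw_deg, pitch_deg) tuples, one per viewer."""
    dirs = [view_direction(y, p) for y, p in orientations]
    groups = []
    for i, d in enumerate(dirs):
        for g in groups:
            if angular_distance(dirs[g[0]], d) <= threshold_deg:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

# Two viewers looking roughly forward, one looking backward:
# the forward pair forms one group, the third viewer its own.
print(similarity_groups([(0.0, 0.0), (10.0, 0.0), (180.0, 0.0)]))
```

Running this per frame and tracking how viewers move between groups over time yields the branch/join events that a timeline visualization could then display.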



Acknowledgements

The authors thank Laura Saenger, Flávio Bezerra and Eduard Tucholke for permission to use the short film “UM MENINO”. The authors gratefully acknowledge funding by the German Research Foundation (DFG) for project DFG MA2555/6-2 within the strategic research initiative on Scalable Visual Analytics, and funding from the European Union’s Seventh Framework Programme FP7/2007-2013 under grant agreement no. 256941, Reality CG.

Author information


Corresponding author

Correspondence to Thomas Löwe.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Löwe, T., Stengel, M., Förster, EC., Grogorick, S., Magnor, M. (2017). Gaze Visualization for Immersive Video. In: Burch, M., Chuang, L., Fisher, B., Schmidt, A., Weiskopf, D. (eds) Eye Tracking and Visualization. ETVIS 2015. Mathematics and Visualization. Springer, Cham. https://doi.org/10.1007/978-3-319-47024-5_4
