
Gaze-Dependent Tone Mapping

  • Conference paper
Image Analysis and Recognition (ICIAR 2013)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7950)


Abstract

In this paper we model the process of temporal adaptation of the human visual system to varying luminance conditions. An eye tracker captures the location of an observer's gaze in a high dynamic range image displayed on the screen. We apply a novel eye tracker data filtering technique to avoid the flickering caused by incorrect gaze estimation. A temporary adaptation luminance is then determined in the area surrounding the gaze point, and its value is used to compress the high dynamic range image for display on a low dynamic range screen. The applied tone mapping technique uses a global compression curve whose location is shifted along the luminance axis according to the value of the adaptation luminance. This technique models the natural adaptation process occurring in the human eye, also taking into account the time-dependent visual adaptation to dark and bright backgrounds.
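To make the pipeline described in the abstract concrete, the sketch below illustrates one possible realization of its three stages: a log-average (geometric mean) luminance computed in a square window around the gaze point, an exponential temporal filter modelling adaptation to dark and bright backgrounds, and a Naka-Rushton-style sigmoid acting as the global compression curve shifted by the adaptation luminance. The function names, the window radius, and the time constant `tau` are illustrative assumptions; this is not the exact formulation used in the paper.

```python
import numpy as np

def adaptation_luminance(lum, gaze_xy, radius=64):
    """Log-average luminance in a square window around the gaze point (illustrative)."""
    x, y = gaze_xy
    h, w = lum.shape
    x0, x1 = max(0, x - radius), min(w, x + radius)
    y0, y1 = max(0, y - radius), min(h, y + radius)
    patch = lum[y0:y1, x0:x1]
    return float(np.exp(np.log(patch + 1e-6).mean()))

def temporal_adaptation(l_prev, l_target, dt, tau=0.5):
    """Exponential drift of the adaptation state toward the current target luminance."""
    return l_prev + (l_target - l_prev) * (1.0 - np.exp(-dt / tau))

def tone_map(lum, l_adapt):
    """Global sigmoid compression; the curve is shifted along the luminance axis by l_adapt."""
    return lum / (lum + l_adapt)

if __name__ == "__main__":
    # Synthetic HDR luminance map and a fixed gaze point, standing in for
    # the displayed HDR image and the filtered eye tracker output.
    rng = np.random.default_rng(0)
    hdr_lum = rng.lognormal(mean=0.0, sigma=2.0, size=(480, 640))
    gaze = (320, 240)

    l_state = adaptation_luminance(hdr_lum, gaze)
    for _ in range(10):  # a few frames at 60 Hz while the gaze stays put
        l_state = temporal_adaptation(l_state, adaptation_luminance(hdr_lum, gaze), dt=1 / 60)
    ldr = tone_map(hdr_lum, l_state)  # values in [0, 1), ready for gamma correction and display
```

In a real-time setting the same loop would run once per frame, with the gaze position updated from the fixation-filtered eye tracker stream before the adaptation luminance is recomputed.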




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mantiuk, R., Markowski, M. (2013). Gaze-Dependent Tone Mapping. In: Kamel, M., Campilho, A. (eds) Image Analysis and Recognition. ICIAR 2013. Lecture Notes in Computer Science, vol 7950. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39094-4_48


  • DOI: https://doi.org/10.1007/978-3-642-39094-4_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-39093-7

  • Online ISBN: 978-3-642-39094-4

  • eBook Packages: Computer Science (R0)
