
A novel method for fusion of differently exposed images based on spatial distribution of intensity for ubiquitous multimedia


Abstract

Exposure fusion is an efficient way to produce a high-quality image for common Low Dynamic Range (LDR) output devices from multiple differently exposed LDR images of the same scene, and it has significant potential for application in ubiquitous multimedia. Generating a fused image with high local contrast from only a few exposures is still a challenging task. This paper proposes a novel method for fusing two differently exposed images based on the spatial distribution of intensity, which consists of two steps. First, weights are computed from the background context of the average image to produce an initial fused image. Then, the initial fused image is enhanced by removing the background context and efficiently re-fusing the result. In this way, the proposed method improves local contrast in dark regions and preserves color in bright regions. Experimental results and comparisons with existing exposure fusion methods demonstrate that the proposed method performs better and is well suited to GPU implementation.
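To make the two-step idea concrete, the following is a minimal illustrative sketch in Python (NumPy/SciPy). It is not the authors' implementation: the "background context" is approximated here by a Gaussian-blurred luminance of the average image, the refinement step is a simple base/detail boost followed by a colour-preserving rescaling, and the function name, the sigma value, and the 1.5 detail gain are assumptions made only for illustration.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fuse_two_exposures(under, over, sigma=15.0):
        """Illustrative two-image exposure fusion sketch (not the paper's exact method).

        `under` and `over` are float32 RGB arrays in [0, 1] with the same shape.
        """
        avg = 0.5 * (under + over)
        # Luminance of the average image (Rec. 601 weights).
        lum = 0.299 * avg[..., 0] + 0.587 * avg[..., 1] + 0.114 * avg[..., 2]
        # Low-frequency "background" estimate of the average image (assumed stand-in
        # for the paper's background context).
        background = gaussian_filter(lum, sigma=sigma)

        # Step 1: dark background regions favour the long exposure (`over`),
        # bright background regions favour the short exposure (`under`).
        w_over = np.clip(1.0 - background, 0.0, 1.0)[..., None]
        w_under = 1.0 - w_over
        initial = w_under * under + w_over * over

        # Step 2 (rough stand-in): separate the background from the initial result,
        # boost the remaining local detail, and re-fuse it with the base layer.
        init_lum = 0.299 * initial[..., 0] + 0.587 * initial[..., 1] + 0.114 * initial[..., 2]
        base = gaussian_filter(init_lum, sigma=sigma)
        detail = init_lum - base
        enhanced_lum = np.clip(base + 1.5 * detail, 1e-6, 1.0)
        # Rescale RGB by the luminance gain so colour stays roughly unchanged.
        gain = (enhanced_lum / np.maximum(init_lum, 1e-6))[..., None]
        return np.clip(initial * gain, 0.0, 1.0)

Usage would be along the lines of fused = fuse_two_exposures(under, over), where the two inputs are the short and long exposures loaded as float32 RGB arrays in [0, 1].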





Acknowledgements

The authors would like to thank Professor Geyong Min for valuable advice. This work has been partially supported by the National Natural Science Foundation of China (Grant No. 61075010), and by the Fundamental Research Funds for the Central Universities (HUST: CXY12Q030 and CXY12Q031).

Author information


Corresponding author

Correspondence to Enmin Song.



Cite this article

Yu, M., Song, E., Jin, R. et al. A novel method for fusion of differently exposed images based on spatial distribution of intensity for ubiquitous multimedia. Multimed Tools Appl 74, 2745–2761 (2015). https://doi.org/10.1007/s11042-013-1660-0

