Abstract
This paper presents a novel and fast EMD-based (empirical mode decomposition-based) image fusion approach via morphological filters. First, we develop a multi-channel bidimensional EMD method based on morphological filters to conduct image fusion. It uses morphological expansion and erosion filters to compute the upper and lower envelopes of a multi-channel image during the sifting process, and decomposes the input source images into several intrinsic mode functions (IMFs) at different scales plus a residue. This significantly improves the computational efficiency of EMD for multi-channel images while maintaining the decomposition quality. Second, we adopt a patch-based fusion strategy with overlapping partitions to fuse the IMFs and the residue, instead of the pixel-based fusion commonly used in EMD-based image fusion: an energy-based maximum-selection rule fuses the IMFs, and the feature information extracted by the IMFs guides the merging of the residue. This strategy extracts the salient information of the source images well and also reduces the spatial artifacts introduced by the noisy character of pixel-wise decision maps. Extensive comparative experiments on several commonly used multi-focus and multi-modal image data sets show that the proposed fusion method obtains much better results than existing EMD-based image fusion approaches. Furthermore, it is highly competitive with state-of-the-art image fusion methods in visual quality, objective metrics, and runtime. The code of the proposed method can be downloaded from: https://github.com/neepuhjp/MFMBEMD-ImageFusion.
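The two ideas sketched in the abstract, morphological envelope estimation inside the sifting loop and a patch-wise energy-maximum fusion rule, can be illustrated with a minimal Python sketch. This is not the authors' implementation (see their repository for that); the window sizes, sifting counts, and the uniform smoothing of the envelopes are illustrative assumptions, with grey-scale dilation/erosion standing in for the paper's morphological expansion and erosion filters.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, uniform_filter


def sift_imf(img, size=5, n_sift=3):
    """Extract one IMF: upper/lower envelopes come from grey-scale
    dilation/erosion; their smoothed mean is repeatedly subtracted."""
    h = img.astype(float)
    for _ in range(n_sift):
        upper = uniform_filter(grey_dilation(h, size=size), size)
        lower = uniform_filter(grey_erosion(h, size=size), size)
        h = h - 0.5 * (upper + lower)  # remove the local mean surface
    return h


def morph_bemd(img, n_imfs=3, size=3):
    """Decompose an image into IMFs (fine to coarse) plus a residue."""
    residue = img.astype(float)
    imfs = []
    for k in range(n_imfs):
        imf = sift_imf(residue, size=size * (2 ** k))  # grow the window per level
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue  # by construction, sum(imfs) + residue == img


def fuse_imfs(imf_a, imf_b, patch=8, stride=4):
    """Energy-based maximum selection over overlapping patches:
    each patch is taken from the source whose IMF patch has more
    energy; overlaps are averaged to suppress blocking artifacts."""
    H, W = imf_a.shape
    out = np.zeros((H, W))
    wgt = np.zeros((H, W))
    for i in range(0, H - patch + 1, stride):
        for j in range(0, W - patch + 1, stride):
            pa = imf_a[i:i + patch, j:j + patch]
            pb = imf_b[i:i + patch, j:j + patch]
            sel = pa if (pa ** 2).sum() >= (pb ** 2).sum() else pb
            out[i:i + patch, j:j + patch] += sel
            wgt[i:i + patch, j:j + patch] += 1
    return out / np.maximum(wgt, 1)
```

Fusing two sources then reduces to decomposing both, applying `fuse_imfs` level by level, merging the residues (the paper guides this step by the IMF features), and summing the fused levels back into one image.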












Acknowledgements
We would like to thank the anonymous reviewers for their helpful comments. This work is supported in part by the National Science Foundation of the USA (IIS-1812606, IIS-1715985); the National Natural Science Foundation of China (Nos. 61672149, 61532002, 61602344, 61802279); and the Open Project Program of the State Key Lab of CAD & CG (No. A2105), Zhejiang University.
Ethics declarations
Conflict of interest
The authors declare no conflict of interest in this paper.
Cite this article
Xie, Q., Hu, J., Wang, X. et al. Novel and fast EMD-based image fusion via morphological filter. Vis Comput 39, 4249–4265 (2023). https://doi.org/10.1007/s00371-022-02588-x