MAGAN: Multiattention Generative Adversarial Network for Infrared and Visible Image Fusion
IEEE Journals & Magazine | IEEE Xplore

Abstract:

Deep learning has been widely used in infrared and visible image fusion owing to its strong feature extraction and generalization capabilities. However, it is difficult to directly extract specific image features from images of different modalities. Therefore, according to the characteristics of infrared and visible images, this article proposes a multiattention generative adversarial network (MAGAN) for infrared and visible image fusion, which is composed of a multiattention generator and two multiattention discriminators. The multiattention generator gradually realizes the extraction and fusion of image features by constructing two modules: a triple-path feature prefusion module (TFPM) and a feature emphasis fusion module (FEFM). The two multiattention discriminators are constructed to ensure that the fused images retain the salient targets and the texture information from the source images. In MAGAN, an intensity attention and a texture attention are designed to extract the specific features of the source images to retain more intensity and texture information in the fused image. In addition, a saliency target intensity loss is defined to ensure that the fused images obtain more accurate salient information from infrared images. Experimental results on two public datasets show that the proposed MAGAN outperforms some state-of-the-art models in terms of visual effects and quantitative metrics.
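The abstract does not give the exact formulation of the saliency target intensity loss. A common form of such a loss constrains the fused image to match the infrared image's intensity inside salient regions; the sketch below illustrates that idea with NumPy. The function name, the binary-mask construction, and the mean-over-mask normalization are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def saliency_intensity_loss(fused, ir, saliency_mask):
    """Hypothetical saliency-weighted intensity loss: penalize squared
    deviation of the fused image from the infrared image, but only
    inside regions the saliency mask marks as salient targets."""
    weighted = saliency_mask * (fused - ir) ** 2
    # Normalize by the salient area so the loss is scale-independent.
    return weighted.sum() / max(saliency_mask.sum(), 1e-8)

# Toy example: a 4x4 infrared image with a bright 2x2 salient target.
ir = np.zeros((4, 4))
ir[1:3, 1:3] = 1.0
fused = np.full((4, 4), 0.5)          # fused output is uniformly gray
mask = (ir > 0.5).astype(float)       # assumed simple thresholded mask
loss = saliency_intensity_loss(fused, ir, mask)  # 0.25 in this toy case
```

In a full training setup this term would be added to the adversarial and texture losses; the mask would typically come from a saliency detector applied to the infrared image rather than a fixed threshold.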
Article Sequence Number: 5016614
Date of Publication: 02 June 2023



