A Deep Multiresolution Representation Framework for Pansharpening | IEEE Journals & Magazine | IEEE Xplore

A Deep Multiresolution Representation Framework for Pansharpening



Abstract:

Pansharpening aims at merging the spectral information from a low-resolution multispectral (LRMS) image with the spatial details from a panchromatic (PAN) image to produce a high-resolution multispectral (HRMS) image. However, existing techniques tend to concentrate on utilizing spectral and spatial information at a single resolution to reconstruct HRMS images, which prevents them from fully exploiting the semantic information available at different resolution levels. To address these issues, we propose a deep multiresolution representation framework for pansharpening, termed DMR-Pan. Guided by the idea of maintaining both high-resolution (HR) and low-resolution (LR) representations, we propose an effective strategy for extracting multiresolution semantics, in which a PAN branch and an LRMS branch operate in parallel to retain HR spatial details and spectral information while also extracting multilevel semantics from different resolutions. Through cross-modality and cross-resolution guidance mechanisms, the extracted multiresolution semantics are aggregated with minimal information loss. Finally, a novel query fusion mechanism is introduced to capture the latent interdependency between the two modalities with cross-modality channel-group attention (CCGA), thereby maximizing complementary semantics and significantly improving the fusion ability of the framework. Rigorous experiments conducted on multiple datasets show that DMR-Pan surpasses comparable techniques in both qualitative and quantitative assessments.
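The abstract does not specify how the cross-modality channel-group attention (CCGA) is computed, so the following is only a minimal illustrative sketch of the general idea: channels of each modality's feature map are split into groups, pooled group descriptors from one modality attend over the groups of the other, and the re-weighted groups are merged back into the multispectral stream. The function name `channel_group_attention`, the pooling, and the additive fusion are all assumptions for illustration, not the authors' actual design.

```python
import numpy as np

def channel_group_attention(ms_feat, pan_feat, num_groups=4):
    """Toy cross-modality channel-group attention (illustrative only).

    ms_feat, pan_feat: arrays of shape (C, H, W), with C divisible by num_groups.
    Returns a fused feature map of shape (C, H, W).
    """
    C, H, W = ms_feat.shape
    g = C // num_groups
    # Global-average-pooled descriptor per channel group, one per modality.
    ms_desc = ms_feat.reshape(num_groups, g, H * W).mean(axis=(1, 2))    # (G,)
    pan_desc = pan_feat.reshape(num_groups, g, H * W).mean(axis=(1, 2))  # (G,)
    # Cross-modality affinity: each MS group queries every PAN group.
    logits = np.outer(ms_desc, pan_desc)                                 # (G, G)
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)                              # row-wise softmax
    # Re-weight the PAN channel groups and add them to the MS stream.
    pan_groups = pan_feat.reshape(num_groups, g, H, W)
    mixed = np.einsum('ij,jghw->ighw', attn, pan_groups).reshape(C, H, W)
    return ms_feat + mixed
```

In this toy version, the softmax makes each multispectral group draw a convex combination of panchromatic groups, so spatial detail flows into the spectral stream without discarding the original MS channels; the paper's actual CCGA and query fusion mechanism may differ substantially.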
Article Sequence Number: 5517216
Date of Publication: 29 April 2024
