Manifold Learning and Deep Generative Networks for Heterogeneous Change Detection From Hyperspectral and Synthetic Aperture Radar Images

Abstract:

Unsupervised change detection (CD) stands as a critical tool for damage assessment after a natural disaster. We emphasize heterogeneous CD methods, which address the case of highly heterogeneous images at the two observation dates, providing greater flexibility than traditional homogeneous methods. This adaptability is vital for swift responses in the aftermath of natural disasters. In this framework, we address the challenging case of detecting changes between hyperspectral and synthetic aperture radar images. This case has intrinsic difficulties, namely, the difference in the nature of the physical quantities measured, compounded by the great difference in dimensionality of the two imaging domains. To address these challenges, a novel method is proposed based on the integration of a manifold learning technique and deep learning networks trained to perform an image-to-image translation task. The method works in a fully unsupervised manner, enabling fast deployment in real-world scenarios. From an application-oriented perspective, we focus on flooded-area mapping using the PRISMA and COSMO-SkyMed missions. The experimental validation on two datasets, a semisimulated one and a real one associated with flooding, suggests that the proposed method allows for accurate detection of flooded areas and other ground changes.
Published in: IEEE Geoscience and Remote Sensing Letters ( Volume: 22)
Article Sequence Number: 5500105
Date of Publication: 12 November 2024
