
Correntropy-Based Spatial-Spectral Robust Sparsity-Regularized Hyperspectral Unmixing


Abstract:

Hyperspectral unmixing (HU) is a crucial technique for exploiting remotely sensed hyperspectral data. It aims at estimating a set of spectral signatures, called endmembers, and their corresponding proportions, called abundances. The performance of HU is often seriously degraded by the various kinds of noise present in hyperspectral images (HSIs). Most existing robust HU methods assume that noise or outliers appear in only one form, e.g., band noise or pixel noise. In real-world applications, however, HSIs are unavoidably corrupted by noisy bands and noisy pixels simultaneously, which requires robust HU in both the spatial and spectral dimensions. Meanwhile, the sparsity of abundances is an inherent property of HSIs, and different regions of an HSI may exhibit different sparsity levels. This article proposes a correntropy-based spatial-spectral robust sparsity-regularized unmixing model that simultaneously achieves 2-D robustness and an adaptively weighted sparsity constraint on the abundances. The update rules of the proposed model are efficient to implement and are derived via a half-quadratic technique. Experimental results on both synthetic and real hyperspectral data demonstrate the superiority of the proposed method over state-of-the-art methods.
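To make the idea concrete, the sketch below illustrates the general recipe the abstract describes: correntropy (a Gaussian-kernel robust loss) handled by half-quadratic reweighting, with per-band and per-pixel weights for 2-D robustness and an adaptively reweighted L1 term on the abundances. This is a minimal illustration, not the authors' algorithm or released code; the function name `hq_unmix`, the bandwidth heuristic, the projected-gradient update, and all parameter values are assumptions made for this example.

```python
# Illustrative sketch (NOT the paper's method) of correntropy-style robust
# unmixing with half-quadratic reweighting and a reweighted-L1 sparsity term.
import numpy as np

def hq_unmix(Y, M, n_iter=50, lam=0.1, sigma=None, eps=1e-6, lr=1e-2):
    """Estimate abundances A (P x N) for Y ~ M A, with Y (L x N), M (L x P)."""
    L, N = Y.shape
    P = M.shape[1]
    A = np.full((P, N), 1.0 / P)            # uniform initial abundances
    for _ in range(n_iter):
        R = Y - M @ A                        # residuals (bands x pixels)
        s2 = np.mean(R**2) + eps if sigma is None else sigma**2
        # Half-quadratic weights from the Gaussian (correntropy) kernel:
        # one weight per band (spectral) and one per pixel (spatial).
        w_band = np.exp(-np.mean(R**2, axis=1) / (2 * s2))   # shape (L,)
        w_pix  = np.exp(-np.mean(R**2, axis=0) / (2 * s2))   # shape (N,)
        W = np.outer(w_band, w_pix)          # combined 2-D robustness weights
        # Adaptive sparsity weights (reweighted L1): small abundances are
        # penalized more strongly than large ones.
        S = lam / (np.abs(A) + eps)
        # Projected-gradient step on the weighted least-squares objective.
        grad = -M.T @ (W * R) + S
        A = np.clip(A - lr * grad, 0.0, None)                 # nonnegativity
        A /= np.maximum(A.sum(axis=0, keepdims=True), eps)    # sum-to-one
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = np.abs(rng.normal(size=(100, 4)))              # 100 bands, 4 endmembers
    A_true = rng.dirichlet(np.ones(4), size=500).T     # 500 pixels
    Y = M @ A_true + 0.01 * rng.normal(size=(100, 500))
    A_est = hq_unmix(Y, M)
    print("mean abs abundance error:", np.abs(A_est - A_true).mean())
```

In this kind of scheme, heavily corrupted bands and pixels receive exponentially small weights and thus contribute little to the fit, which is the intuition behind the 2-D robustness claimed in the abstract; the actual update rules in the paper are derived in closed form via the half-quadratic technique rather than by the simple gradient step used here.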
Published in: IEEE Transactions on Geoscience and Remote Sensing (Volume: 59, Issue: 2, February 2021)
Page(s): 1453 - 1471
Date of Publication: 16 June 2020
