Abstract:
Hyperspectral image (HSI) classification is an essential task in remote sensing, but its performance is greatly limited by the scarcity of labeled samples. Generative adversarial network (GAN)-based methods can synthesize virtual samples to augment the training set; however, with limited labeled data, GANs capture class-discriminative features poorly during sample generation. Meanwhile, few relation network (RN)-based few-shot learning (FSL) methods have considered data augmentation to enhance performance. To address this challenge, we propose FSHyperRGAN, a few-shot HSI classification method based on a relational GAN, which uses a GAN to augment the training samples for the RN while leveraging relational feature extraction to guide the generation of class-specific samples. FSHyperRGAN comprises four modules: a data processing module that converts the HSI data into 1-D and 3-D features, an adversarial generation (AG) module that synthesizes virtual samples conditioned on class labels, a data embedding and reconstruction (DER) module that encodes latent spaces for accurate sample reconstruction while preserving category characteristics, and a relation computation (RC) module that computes relation scores across generated, reconstructed, and original samples. In addition, a relational feature matching scheme is applied so that virtual samples can guide classification. Two frameworks are designed, 1D-FSHyperRGAN and 3D-FSHyperRGAN, for 1-D spectral and 3-D spatial-spectral classification, respectively. Experiments on widely used HSI datasets show that the proposed method outperforms several state-of-the-art methods, achieving overall accuracies of 92.64%, 86.82%, 83.64%, and 84.57% on the KSC, PaviaU, Houston, and WHU-Hi-HongHu datasets, respectively.
Published in: IEEE Transactions on Geoscience and Remote Sensing (Volume: 62)
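
As a rough illustration of the relation-score idea behind the RC module, the sketch below follows a conventional relation-network formulation for 1-D spectral features. It is a minimal assumption-laden example, not the authors' implementation: the module names (SpectralEmbedding, RelationModule), layer sizes, and the 200-band input are hypothetical, and GAN-generated or reconstructed samples would simply enter as additional support samples alongside the original labeled ones.

```python
# Minimal sketch of relation-score computation for 1-D spectra (assumed
# relation-network formulation; names, sizes, and shapes are illustrative).
import torch
import torch.nn as nn

class SpectralEmbedding(nn.Module):
    """Embeds a 1-D spectrum (one HSI pixel) into a feature vector."""
    def __init__(self, n_bands: int = 200, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bands, 128), nn.ReLU(),
            nn.Linear(128, dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class RelationModule(nn.Module):
    """Scores how related two embeddings are (0 = unrelated, 1 = same class)."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, a, b):
        return self.net(torch.cat([a, b], dim=-1)).squeeze(-1)

embed, relate = SpectralEmbedding(), RelationModule()
# Support set: 5 classes x 3 shots x 200 bands (real plus virtual samples).
support = torch.randn(5, 3, 200)
query = torch.randn(1, 200)               # one unlabeled pixel
prototypes = embed(support).mean(dim=1)   # (5, 64) per-class prototypes
scores = relate(prototypes, embed(query).expand(5, -1))
predicted_class = scores.argmax().item()  # class with the highest relation score
```

In this reading, the query pixel is assigned to the class whose (possibly augmentation-enlarged) support prototype yields the highest relation score; the 3-D spatial-spectral variant would swap the linear embedding for a 3-D convolutional one over patch cubes.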