
Aggregative and Contrastive Dual-View Graph Attention Network for Hyperspectral Image Classification


Abstract:

Graph convolutional networks (GCNs) have recently gained prominence in hyperspectral image (HSI) classification tasks owing to their superior performance on non-Euclidean data. However, GCN-based methods rely heavily on complete graph structural information, which can cause information to be aggregated and transmitted across nodes of differing classes, thereby compromising classification performance. Furthermore, the scarcity of labeled pixels in HSIs often limits the representational capability of such methods. To address these issues, we propose an aggregative and contrastive dual-view graph attention network (ACoD-GAT) for HSI classification. Specifically, we present a progressive aggregation module, comprising a pixel clustering submodule and a node aggregation submodule, to exploit semantic information at various levels. In addition, we integrate multiscale manipulation with a diffusion matrix to construct the dual view, further extracting semantic information from both local and global perspectives. Moreover, we design an unsupervised contrastive loss function and a supervised contrastive loss function to facilitate contrastive learning on the dual view, improving the representational capability of ACoD-GAT with very few labeled samples. Extensive experimental results on four benchmark datasets demonstrate the superiority of the proposed ACoD-GAT over other state-of-the-art methods.
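The abstract does not give the exact form of the paper's contrastive losses, but the unsupervised dual-view objective it describes is commonly realized as an InfoNCE-style loss that pairs each node's embedding in the local view with the same node's embedding in the global view. The following is a minimal NumPy sketch under that assumption; the function name, temperature value, and pairing scheme are illustrative, not taken from the paper.

```python
import numpy as np

def l2_normalize(x, axis=1, eps=1e-12):
    """Normalize rows to unit length so dot products become cosine similarities."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def dual_view_contrastive_loss(z_local, z_global, temperature=0.5):
    """InfoNCE-style loss: node i in the local view is the positive for
    node i in the global view; all other cross-view nodes are negatives.
    (Illustrative sketch; the paper's actual loss may differ.)"""
    z1 = l2_normalize(z_local)
    z2 = l2_normalize(z_global)
    n = z1.shape[0]
    logits = z1 @ z2.T / temperature               # (n, n) cross-view similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Average negative log-probability of the matched (diagonal) pairs.
    return -log_prob[np.arange(n), np.arange(n)].mean()
```

When the two views embed each node identically, the diagonal similarities are maximal and the loss is low; misaligned views yield a higher loss, which is the signal that drives the representations of the two views together.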
Article Sequence Number: 5530817
Date of Publication: 15 August 2024

