Attraction and Repulsion: Unsupervised Domain Adaptive Graph Contrastive Learning Network


Abstract:

Graph convolutional networks (GCNs) are important techniques for analytics tasks related to graph data. To date, most GCNs are designed for a single graph domain and are incapable of transferring knowledge from or to other domains (graphs), due to limitations in graph representation learning and domain adaptation across graph domains. This paper proposes a novel Graph Contrastive Learning Network (GCLN) for unsupervised domain adaptive graph learning. The key innovation is to enforce attraction and repulsion forces both within each single graph domain and across the two graph domains. Within each graph, an attraction force encourages local patch node features to be similar to the global representation of the entire graph, whereas a repulsion force repels node features so that the network can be separated from its permutations (i.e., domain-specific graph contrastive learning). Across the two graph domains, an attraction force encourages node features from the two domains to be largely consistent, whereas a repulsion force keeps the features discriminative enough to differentiate the two graph domains (i.e., cross-domain graph contrastive learning). Within- and cross-domain graph contrastive learning are carried out by optimizing an objective function that combines the source and target classifier losses, a domain-specific contrastive loss, and a cross-domain contrastive loss. As a result, feature learning from each graph is facilitated by knowledge transferred between the graphs. Experiments on real-world datasets demonstrate that GCLN outperforms state-of-the-art graph neural network algorithms.
Page(s): 1079 - 1091
Date of Publication: 21 March 2022
Electronic ISSN: 2471-285X
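
The abstract states that training optimizes a single objective combining four terms. As a rough illustration only, below is a minimal PyTorch-style sketch of how such a combined objective could be assembled; the function name, the additive weighting, and the coefficients lambda_within and lambda_cross are assumptions for illustration, not details taken from the paper.

# Illustrative sketch only: the exact GCLN formulation is not given in the
# abstract; the loss names and lambda weights below are assumptions.
import torch


def gcln_objective(src_cls_loss: torch.Tensor,
                   tgt_cls_loss: torch.Tensor,
                   within_domain_contrastive_loss: torch.Tensor,
                   cross_domain_contrastive_loss: torch.Tensor,
                   lambda_within: float = 1.0,
                   lambda_cross: float = 1.0) -> torch.Tensor:
    """Combine the four terms named in the abstract: source and target
    classifier losses, the domain-specific (within-graph) contrastive loss,
    and the cross-domain contrastive loss."""
    return (src_cls_loss
            + tgt_cls_loss
            + lambda_within * within_domain_contrastive_loss
            + lambda_cross * cross_domain_contrastive_loss)


# Example usage with placeholder scalar losses:
if __name__ == "__main__":
    total = gcln_objective(torch.tensor(0.8), torch.tensor(1.1),
                           torch.tensor(0.4), torch.tensor(0.3))
    print(total.item())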
