
Label-Aware Hierarchical Contrastive Domain Adaptation for Cross-Network Node Classification

  • Conference paper in Advanced Data Mining and Applications (ADMA 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14178)


Abstract

Cross-network node classification (CNNC), which aims to transfer knowledge from a label-rich source network to accurately classify nodes in a different but related unlabeled target network, has recently attracted a great deal of attention. To tackle the problem of network shift, existing CNNC algorithms combine graph neural networks (GNNs) with domain adaptation (DA). However, GNNs are vulnerable to network structure noise, and traditional DA methods mainly match the marginal distributions and cannot guarantee alignment of the class-conditional distributions of different networks. To remedy these deficiencies, we propose a novel label-aware hierarchical contrastive domain adaptation (LHCDA) model for CNNC. On the one hand, we use a multi-head graph attention network (GAT) to learn noise-resistant node embeddings. On the other hand, a label-aware hierarchical contrastive domain adaptation module is designed to align the class-conditional distributions across networks at both the node-node and node-class levels. Since target labels are unavailable, we use K-means clustering to generate pseudo-labels and employ prediction confidence to reduce their noise. Extensive experimental results on six CNNC tasks demonstrate that the proposed LHCDA model is superior to previous state-of-the-art CNNC methods.
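The abstract outlines node-node level alignment in which cross-network node pairs sharing a (pseudo-)label act as positives, with low-confidence target pseudo-labels filtered out. As a rough illustration only (the paper's exact loss is not reproduced here), the following NumPy sketch assumes an InfoNCE-style formulation; the function name `label_aware_nn_loss`, the temperature `tau`, and the confidence threshold `thresh` are all invented for this example.

```python
import numpy as np

def label_aware_nn_loss(z_src, y_src, z_tgt, y_tgt, conf,
                        tau=0.5, thresh=0.8):
    """Node-node level label-aware contrastive loss (illustrative sketch).

    Cross-network pairs sharing a (pseudo-)label are treated as positives;
    target nodes whose prediction confidence falls below `thresh` are
    discarded to reduce pseudo-label noise.
    """
    keep = conf >= thresh                      # confidence filter on targets
    z_tgt, y_tgt = z_tgt[keep], y_tgt[keep]
    # L2-normalize so the dot product below is a cosine similarity
    z_src = z_src / np.linalg.norm(z_src, axis=1, keepdims=True)
    z_tgt = z_tgt / np.linalg.norm(z_tgt, axis=1, keepdims=True)
    sim = np.exp(z_src @ z_tgt.T / tau)        # (n_src, n_tgt) similarities
    pos = y_src[:, None] == y_tgt[None, :]     # same-class cross-network pairs
    pos_sum = (sim * pos).sum(axis=1)
    all_sum = sim.sum(axis=1)
    valid = pos_sum > 0                        # keep anchors with >=1 positive
    return float(-np.log(pos_sum[valid] / all_sum[valid]).mean())

# Toy check: class-aligned embeddings should yield a lower loss than
# embeddings whose target pseudo-labels are shuffled.
z_s = np.array([[1.0, 0.0], [0.0, 1.0]])
z_t = np.array([[1.0, 0.0], [0.0, 1.0]])
conf = np.array([0.9, 0.9])
aligned = label_aware_nn_loss(z_s, np.array([0, 1]), z_t, np.array([0, 1]), conf)
shuffled = label_aware_nn_loss(z_s, np.array([0, 1]), z_t, np.array([1, 0]), conf)
```

Minimizing such a loss pulls same-class nodes from the two networks together and pushes different-class nodes apart, which is one plausible reading of aligning the class-conditional distributions across networks.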



Acknowledgments

This research was supported in part by Hainan Provincial Natural Science Foundation of China (No. 322RC570), National Natural Science Foundation of China (No. 62102124), and the Research Start-up Fund of Hainan University (No. KYQD(ZR)-22016).

Author information

Correspondence to Xiao Shen.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Xue, P., Shao, M., Zhou, X., Shen, X. (2023). Label-Aware Hierarchical Contrastive Domain Adaptation for Cross-Network Node Classification. In: Yang, X., et al. Advanced Data Mining and Applications. ADMA 2023. Lecture Notes in Computer Science, vol 14178. Springer, Cham. https://doi.org/10.1007/978-3-031-46671-7_13


  • DOI: https://doi.org/10.1007/978-3-031-46671-7_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46670-0

  • Online ISBN: 978-3-031-46671-7

  • eBook Packages: Computer Science, Computer Science (R0)
