A double-layer attentive graph convolution networks based on transfer learning for dynamic graph classification

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

In practical scenarios, many graphs evolve dynamically over time, and classifying newly arriving nodes that have neither labels nor historical information is challenging. To address this challenge, we design a double-layer attentive graph convolutional network (DLA-GCN) based on transfer learning, which comprises three deep learning components: a double-layer graph convolutional network (DLGCN), a node multi-parameter learning (NMPL) algorithm, and a domain-adversarial transfer learning (DATL) method. For dynamic spatial correlation, DLGCN jointly exploits a pre-defined adjacency matrix and an adaptive adjacency matrix to capture local and global feature aggregation, and an inter-graph attention mechanism automatically merges the two spatial views into a unified representation for each node. To reduce complexity and improve accuracy, the NMPL component uses matrix decomposition to learn node-specific patterns. For dynamic temporal correlation, DATL learns and transfers similar features as historical information for new nodes by jointly optimizing three loss functions: a source classifier loss, a domain classifier loss, and a target classifier loss. Experimental results on two real-world graph classification datasets show that the proposed approach improves accuracy by 18% and 10%, respectively, compared with state-of-the-art baselines.
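
To make the abstract's description concrete, the following minimal PyTorch sketch illustrates the three ideas it outlines: a double-layer spatial block that aggregates over both a pre-defined and a learned adaptive adjacency and fuses the two views with inter-graph attention, a low-rank node-embedding factorization standing in for node-specific parameter learning (NMPL), and a domain-adversarial objective combining source, domain, and target classifier losses (DATL). All names (DoubleLayerGCN, DATLClassifier, GradReverse, datl_loss) and shapes are hypothetical stand-ins, not the authors' implementation, and source and target graphs are assumed to have the same number of nodes for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient-reversal layer commonly used in domain-adversarial training."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back to the encoder.
        return -ctx.lam * grad_output, None


class DoubleLayerGCN(nn.Module):
    """Aggregates over a pre-defined adjacency (local view) and a learned
    adaptive adjacency (global view), then fuses the two views per node
    with a simple inter-graph attention."""

    def __init__(self, num_nodes, in_dim, hid_dim, emb_dim=16):
        super().__init__()
        # Low-rank node embeddings: both the adaptive graph and the
        # node-specific behavior come from this matrix factorization,
        # which is one way to read the NMPL idea.
        self.node_emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.w_pre = nn.Linear(in_dim, hid_dim)
        self.w_adp = nn.Linear(in_dim, hid_dim)
        self.attn = nn.Linear(hid_dim, 1)

    def forward(self, x, a_pre):
        # Adaptive adjacency inferred from the node embeddings.
        a_adp = F.softmax(F.relu(self.node_emb @ self.node_emb.t()), dim=-1)
        h_pre = F.relu(a_pre @ self.w_pre(x))  # local aggregation
        h_adp = F.relu(a_adp @ self.w_adp(x))  # global aggregation
        # Inter-graph attention: per-node softmax weights over the two views.
        stacked = torch.stack([h_pre, h_adp], dim=1)   # [N, 2, H]
        alpha = F.softmax(self.attn(stacked), dim=1)   # [N, 2, 1]
        return (alpha * stacked).sum(dim=1)            # [N, H]


class DATLClassifier(nn.Module):
    """Node classifier plus a domain discriminator behind a gradient-reversal
    layer, so the encoder is pushed toward domain-invariant features."""

    def __init__(self, num_nodes, in_dim, hid_dim, num_classes):
        super().__init__()
        self.encoder = DoubleLayerGCN(num_nodes, in_dim, hid_dim)
        self.cls_head = nn.Linear(hid_dim, num_classes)
        self.dom_head = nn.Linear(hid_dim, 2)  # source vs. target domain

    def forward(self, x, a_pre, lam=1.0):
        h = self.encoder(x, a_pre)
        return self.cls_head(h), self.dom_head(GradReverse.apply(h, lam))


def datl_loss(src_logits, y_src, dom_logits, dom_labels, tgt_logits, y_tgt):
    """Joint objective: source classifier + domain classifier + target
    classifier losses, optimized as a whole. y_tgt may be pseudo-labels
    or the few target labels that are available."""
    return (F.cross_entropy(src_logits, y_src)
            + F.cross_entropy(dom_logits, dom_labels)
            + F.cross_entropy(tgt_logits, y_tgt))
```

One design point worth noting: routing the domain head through a gradient-reversal layer is the standard domain-adversarial trick, so minimizing the joint loss drives the encoder toward features the domain classifier cannot separate, which is how transferred source features can serve as surrogate historical information for unlabeled new nodes.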

Data availability

The data sets supporting the results of this article are included within the references.

References

  1. Perozzi B, Al-Rfou R, Skiena S (2014) Deepwalk: Online learning of social representations. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 701–710

  2. Grover A, Leskovec J (2016) node2vec: Scalable feature learning for networks. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 855–864

  3. Yu B, Yin H, Zhu Z (2017) Spatio-temporal graph convolutional networks: a deep learning framework for traffic forecasting. arXiv preprint arXiv:1709.04875

  4. Song C, Lin Y, Guo S, Wan H (2020) Spatial-temporal synchronous graph convolutional networks: A new framework for spatial-temporal network data forecasting. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 914–921

  5. Roy A, Roy KK, Ali AA, Amin MA, Rahman AM (2021) Unified spatio-temporal modeling for traffic forecasting using graph neural network. In: 2021 International Joint Conference on Neural Networks (IJCNN), pp. 1–8. IEEE

  6. Bai L, Yao L, Li C, Wang X, Wang C (2020) Adaptive graph convolutional recurrent network for traffic forecasting. Adv Neural Inf Process Syst 33:17804–17815

  7. Guo S, Lin Y, Feng N, Song C, Wan H (2019) Attention based spatial-temporal graph convolutional networks for traffic flow forecasting. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, pp. 922–929

  8. Li R, Wang S, Zhu F, Huang J (2018) Adaptive graph convolutional neural networks. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32

  9. Pareja A, Domeniconi G, Chen J, Ma T, Suzumura T, Kanezashi H, Kaler T, Schardl T, Leiserson C (2020) Evolvegcn: Evolving graph convolutional networks for dynamic graphs. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5363–5370

  10. Li Y, Yu R, Shahabi C, Liu Y (2017) Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv preprint arXiv:1707.01926

  11. Sankar A, Wu Y, Gou L, Zhang W, Yang H (2020) Dysat: Deep neural representation learning on dynamic graphs via self-attention networks. In: Proceedings of the 13th International Conference on Web Search and Data Mining, pp. 519–527

  12. Ruan J, Chen H, Wang Z, Chen S (2021) Ds-tagcn: A dual-stream topology attentive gcn for node classification in dynamic graphs. In: 2021 International Joint Conference on Neural Networks (IJCNN), pp. 1–7. IEEE

  13. Dai Q, Wu X-M, Xiao J, Shen X, Wang D (2022) Graph transfer learning via adversarial domain adaptation with graph convolution. IEEE Transactions on Knowledge and Data Engineering

  14. Yin X, Li F, Shen Y, Qi H, Yin B (2022) Nodetrans: A graph transfer learning approach for traffic prediction. arXiv preprint arXiv:2207.01301

  15. Hou W, Huang H, Peng Q, Yu R, Yu L, Wang L (2022) Spatial-hierarchical graph neural network with dynamic structure learning for histological image classification. In: International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 181–191. Springer

  16. Trivedi R, Dai H, Wang Y, Song L (2017) Know-evolve: Deep temporal reasoning for dynamic knowledge graphs. In: International Conference on Machine Learning, pp. 3462–3471. PMLR

  17. Micheli A, Tortorella D (2022) Discrete-time dynamic graph echo state networks. Neurocomputing 496:85–95

  18. Han K, Xiao A, Wu E, Guo J, Xu C, Wang Y (2021) Transformer in transformer. Adv Neural Inf Process Syst 34

  19. Qu H, Li L, Li Z, Zheng J, Tang X (2022) Robust discriminative projection with dynamic graph regularization for feature extraction and classification. Knowl-Based Syst 253:109563

  20. Huang X, Rao Y, Xie H, Wong T-L, Wang FL (2017) Cross-domain sentiment classification via topic-related tradaboost. In: Thirty-First AAAI Conference on Artificial Intelligence

  21. Fu D, He J (2021) Sdg: A simplified and dynamic graph neural network. In: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 2273–2277

  22. Wu M, Pan S, Zhou C, Chang X, Zhu X (2020) Unsupervised domain adaptive graph convolutional networks. In: Proceedings of The Web Conference 2020, pp. 1457–1467

  23. Devlin J, Chang M-W, Lee K, Toutanova K (2018) Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805

  24. Kim B-H, Ye JC, Kim J-J (2021) Learning dynamic graph representation of brain connectome with spatio-temporal attention. Adv Neural Inf Process Syst 34:4314–4327

  25. Liu Q, Dong Y, Zhang Y, Luo H (2022) A fast dynamic graph convolutional network and cnn parallel network for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing

  26. Chen J, Jiao L, Liu X, Li L, Liu F, Yang S (2021) Automatic graph learning convolutional networks for hyperspectral image classification. IEEE Trans Geosci Remote Sens 60:1–16

  27. Liao T, Chen J-C, Jeng S-K, Tai C (2022) Cross-domain knowledge transfer for skeleton-based action recognition based on graph convolutional gradient reversal layer. In: 2022 IEEE 5th International Conference on Multimedia Information Processing and Retrieval (MIPR), pp. 387–390. IEEE

  28. Kipf TN, Welling M (2016) Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907

  29. Khan W, Haroon M (2022) An efficient framework for anomaly detection in attributed social networks. Int J Inform Technol. https://doi.org/10.1007/s41870-022-01044-2

  30. Tu E, Wang Z, Yang J, Kasabov N (2022) Deep semi-supervised learning via dynamic anchor graph embedding in latent space. Neural Netw 146:350–360

  31. Galke L, Vagliano I, Franke B, Zielke T, Hoffmann M, Scherp A (2023) Lifelong learning on evolving graphs under the constraints of imbalanced classes and new classes. Neural Netw. https://doi.org/10.1016/j.neunet.2023.04.022

  32. Lombardo G, Poggi A, Tomaiuolo M (2022) Continual representation learning for node classification in power-law graphs. Futur Gener Comput Syst 128:420–428

  33. Yao K, Liang J, Liang J, Li M, Cao F (2022) Multi-view graph convolutional networks with attention mechanism. Artif Intell 307:103708

  34. Jin T, Dai H, Cao L, Zhang B, Huang F, Gao Y, Ji R (2022) Deepwalk-aware graph convolutional networks. Sci China Inform Sci 65(5):152104

  35. Huang J, Du L, Chen X, Fu Q, Han S, Zhang D (2023) Robust mid-pass filtering graph convolutional networks. In: Proceedings of the ACM Web Conference 2023, pp. 328–338

  36. Sharma K, Verma S, Medya S, Bhattacharya A, Ranu S (2023) Task and model agnostic adversarial attack on graph neural networks. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, pp. 15091–15099

  37. Zhang G, Hu Z, Wen G, Ma J, Zhu X (2023) Dynamic graph convolutional networks by semi-supervised contrastive learning. Pattern Recognit 139:109486

  38. Mo X, Wan B, Tang R, Ding J, Liu G (2023) Attention-based network embedding with higher-order weights and node attributes. CAAI Transactions on Intelligence Technology

  39. Abadi M, Agarwal A, Barham P, Brevdo E, Chen Z, Citro C, Corrado GS, Davis A, Dean J, Devin M, et al. (2016) Tensorflow: Large-scale machine learning on heterogeneous distributed systems. arXiv preprint arXiv:1603.04467

  40. Velickovic P, Cucurull G, Casanova A, Romero A, Lio P, Bengio Y (2017) Graph attention networks. Stat 1050:20

  41. Wu B, Liang X, Zheng X, Guo Y, Tang H (2022) Improving dynamic graph convolutional network with fine-grained attention mechanism. In: ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 3938–3942. IEEE

  42. Mallick T, Balaprakash P, Rask E, Macfarlane J (2021) Transfer learning with graph neural networks for short-term highway traffic forecasting. In: 2020 25th International Conference on Pattern Recognition (ICPR), pp. 10367–10374. IEEE

  43. Zhang C-Y, Yao Z-L, Yao H-Y, Huang F, Chen CP (2022) Dynamic representation learning via recurrent graph neural networks. IEEE Transactions on Systems, Man, and Cybernetics: Systems

  44. Pan J, Li H, Teng J, Zhao Q, Li M (2022) Dynamic network representation learning method based on improved gru network. Comput Inform 41(6):1491–1509

Acknowledgements

This work has been supported by the National Natural Science Foundation of China under Grant No. 61971057 and the MoE-CMCC “Artificial Intelligence” Project No. MCM20190701.

Funding

This work has been supported by the National Natural Science Foundation of China under Grant No. 61971057 and the MoE-CMCC “Artificial Intelligence” Project No. MCM20190701.

Author information

Corresponding author

Correspondence to Yong Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Yao, L., Guo, D., Wang, X. et al. A double-layer attentive graph convolution networks based on transfer learning for dynamic graph classification. Int. J. Mach. Learn. & Cyber. 15, 863–877 (2024). https://doi.org/10.1007/s13042-023-01944-0
