
A gated graph attention network based on dual graph convolution for node embedding


Abstract

Node classification relies on node embeddings: classification accuracy improves when the embeddings of nodes from different classes are well discriminated. With the rapid development of deep learning, researchers have proposed many graph neural network (GNN) models, such as GCN and GAT, which generally obtain node embeddings by aggregating neighborhood information. However, such methods emphasize only feature aggregation within neighborhoods and do not consider the class labels of nodes, which leads to the oversmoothing problem and weak differences between inter-class nodes. In this paper, we propose a gated graph attention network based on dual graph convolution for node embedding (GGAN-DGC). To strengthen the embedding difference between inter-class nodes, GGAN-DGC introduces a gated attention mechanism. This mechanism uses a supervised gated attention (GA) matrix to separate the GNN aggregation process according to node class, thereby heterogenizing homogeneous graphs. The GA matrix is obtained by a dual graph convolutional network (DGC), which enlarges the receptive field of the original graph. In addition, GGAN-DGC adopts a triplet loss as the global supervision function for node embedding, which strengthens the class correlation of node embeddings at the global level. Finally, nodes can be classified correctly based on the obtained embeddings. Experimental results on five datasets confirm that GGAN-DGC outperforms other representative methods in node classification, especially on datasets with strong heterophily. We also verify that GGAN-DGC performs better than other methods in graph classification experiments.
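Since the full text is not reproduced here, the following is only a minimal illustrative sketch, not the authors' implementation, of the two ideas named in the abstract: gating GAT-style attention coefficients with a class-aware matrix, and supervising node embeddings with a triplet loss. The layer name, the assumption that the gate is a dense (N, N) matrix of same-class probabilities (in the paper it is produced by the DGC), and the hyperparameters are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedAttentionLayer(nn.Module):
    """Single-head attention layer whose coefficients are modulated by a gate matrix.

    Illustrative sketch only; assumes adj already contains self-loops so every
    node has at least one neighbor.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Parameter(torch.empty(2 * out_dim))
        nn.init.xavier_uniform_(self.a.view(1, -1))

    def forward(self, x, adj, gate):
        # x: (N, in_dim) node features; adj: (N, N) {0,1} adjacency with self-loops;
        # gate: (N, N) values in [0, 1], e.g. predicted same-class probabilities.
        h = self.W(x)                                    # (N, out_dim)
        n = h.size(0)
        pair = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                          h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(pair @ self.a)                  # (N, N) raw attention scores
        e = e.masked_fill(adj == 0, float("-inf"))       # restrict to graph edges
        alpha = torch.softmax(e, dim=-1)                 # standard GAT coefficients
        alpha = alpha * gate                             # down-weight likely inter-class edges
        alpha = alpha / alpha.sum(dim=-1, keepdim=True).clamp_min(1e-12)
        return alpha @ h                                 # gated neighborhood aggregation


def triplet_node_loss(z, anchor, positive, negative, margin: float = 1.0):
    """Triplet loss pulling same-class embeddings together and pushing classes apart."""
    return F.triplet_margin_loss(z[anchor], z[positive], z[negative], margin=margin)


if __name__ == "__main__":
    # Toy usage: 4 nodes, 8-dim features, self-loops plus one edge.
    x = torch.randn(4, 8)
    adj = torch.eye(4)
    adj[0, 1] = adj[1, 0] = 1.0
    gate = torch.ones(4, 4)          # a neutral gate; the paper learns this via the DGC
    z = GatedAttentionLayer(8, 16)(x, adj, gate)
    loss = triplet_node_loss(z, torch.tensor([0]), torch.tensor([1]), torch.tensor([2]))

With the gate fixed to all ones, the layer reduces to a plain single-head GAT layer; the separation of inter-class nodes comes entirely from learning a gate that suppresses edges between nodes predicted to belong to different classes.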


Data Availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.


Acknowledgements

This work was supported in part by the Natural Science Foundation of Zhejiang Province (Grant No. LY22F020001) and the 3315 Plan Foundation of Ningbo (Grant No. 2019B-18-G).

Author information


Corresponding author

Correspondence to Yu Xin.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yu, R., Wang, L., Xin, Y. et al. A gated graph attention network based on dual graph convolution for node embedding. Appl Intell 53, 19962–19975 (2023). https://doi.org/10.1007/s10489-023-04568-7
