
Structural attention network for graph

Published in Applied Intelligence

Abstract

We present the structural attention network (SAN), a novel approach to learning node representations for graph modeling. SAN builds on graph attention networks (GATs) and introduces two improvements designed specifically for graph-structured data. First, a transition matrix is used to differentiate the structures between nodes. Second, the output features of the nodes are represented as the concatenation of multi-order features, differentiating structures across multiple orders. Because the model is built on a graph attention mechanism, it attends directly to the topology of the graph. Through experiments on citation networks and a protein-protein interaction dataset, we demonstrate the benefits of incorporating structural information into graph attention mechanisms.



Acknowledgments

The authors thank the reviewers for their helpful comments, and the authors of GATs for making the source code of their approach publicly available. Yifen Li received funding from the Changsha Vocational & Technical College's education research and innovation programme under grant agreement CZJG19QN01.

Author information


Corresponding author

Correspondence to Anzhong Zhou.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, A., Li, Y. Structural attention network for graph. Appl Intell 51, 6255–6264 (2021). https://doi.org/10.1007/s10489-021-02214-8
