
Hierarchical graph attention networks for semi-supervised node classification


Abstract

Recently, there has been a promising tendency to generalize convolutional neural networks (CNNs) to the graph domain. However, most existing methods cannot capture adequate global information because of their shallow structures. In this paper, we address this challenge by proposing a hierarchical graph attention network (HGAT) for semi-supervised node classification. HGAT learns node features through a hierarchical mechanism: by iteratively applying coarsening and refining operations at different hierarchical levels, it extracts richer information from the node features. Moreover, HGAT incorporates an attention mechanism in the input and prediction layers, assigning different weights to different nodes in a neighborhood, which helps to improve accuracy. Experimental results demonstrate that our method achieves state-of-the-art performance not only on the Cora, Citeseer, and Pubmed citation datasets but also on the simplified NELL knowledge graph dataset. A sensitivity analysis further verifies that HGAT captures global structural information by enlarging the receptive field and transferring node features effectively.
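To make the hierarchical idea concrete, the sketch below pairs a single-head, GAT-style attention layer with one coarsening/refining round. The assignment matrix S, the single coarse level, the layer sizes, and the fusion step are illustrative assumptions for this sketch, not the authors' exact implementation.

```python
# Minimal sketch of the hierarchical attention idea described in the abstract.
# The assignment matrix S (fine nodes -> coarse clusters), the single-head
# attention, and the one coarse level are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head GAT-style layer: attention weights over each node's neighbors."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.W(x)                                   # (N, out_dim)
        n = h.size(0)
        # Pairwise attention logits e_ij = a([h_i || h_j]) for all node pairs.
        h_i = h.unsqueeze(1).expand(n, n, -1)
        h_j = h.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1))
        # Mask non-edges, then normalize over each neighborhood.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ h)


class HierarchicalGAT(nn.Module):
    """Attention on the input graph, coarsen, attend on the coarse graph, refine."""

    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.fine_att = GraphAttentionLayer(in_dim, hid_dim)
        self.coarse_att = GraphAttentionLayer(hid_dim, hid_dim)
        self.classify = GraphAttentionLayer(hid_dim, n_classes)

    def forward(self, x, adj, S):
        # S is a (hypothetical) precomputed assignment matrix of shape (N, M)
        # mapping N fine nodes onto M coarse clusters (e.g. from graph coarsening).
        h = self.fine_att(x, adj)                       # attention on the original graph
        h_c = S.t() @ h                                 # coarsen: pool node features
        adj_c = (S.t() @ adj @ S > 0).float()           # coarse-level adjacency
        h_c = self.coarse_att(h_c, adj_c)               # larger receptive field
        h = h + S @ h_c                                 # refine: project back and fuse
        return self.classify(h, adj)                    # attention in the prediction layer


if __name__ == "__main__":
    N, M, F_in = 6, 3, 8
    x = torch.randn(N, F_in)
    adj = torch.eye(N)                                  # self-loops; add a few edges
    adj[0, 1] = adj[1, 0] = adj[2, 3] = adj[3, 2] = 1.0
    S = torch.zeros(N, M)
    S[torch.arange(N), torch.arange(N) % M] = 1.0       # toy cluster assignment
    logits = HierarchicalGAT(F_in, 16, n_classes=4)(x, adj, S)
    print(logits.shape)                                 # torch.Size([6, 4])
```

Passing coarse-level features through attention before refining is what enlarges the receptive field: each node aggregates information not only from its neighbors but also from the cluster-level neighborhood it belongs to.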



Acknowledgements

This work is supported by the National Key R&D Program of China (No. 2018YFB1701702), the National Natural Science Foundation of China (Nos. 51675477 and 51775489), and the Zhejiang Provincial Natural Science Foundation of China (No. LZ18E050001).

Author information

Corresponding author

Correspondence to Yixiong Feng.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Li, K., Feng, Y., Gao, Y. et al. Hierarchical graph attention networks for semi-supervised node classification. Appl Intell 50, 3441–3451 (2020). https://doi.org/10.1007/s10489-020-01729-w

