
Self-attention Based Multi-scale Graph Convolutional Networks

  • Conference paper
Neural Information Processing (ICONIP 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13623)


Abstract

Graph convolutional networks (GCNs) have recently demonstrated remarkable learning ability on various kinds of graph-structured data. However, GCNs generally have limited expressive power owing to their shallow structure. In this paper, to improve the expressive power of GCNs, we propose two multi-scale GCN frameworks that incorporate a self-attention mechanism and multi-scale information into the design of GCNs. The self-attention mechanism allows the model to adaptively capture the local structure of each node's neighborhood, leading to more accurate predictions. Extensive experiments on both node classification and graph classification demonstrate the effectiveness of our approaches over several state-of-the-art GCNs.
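As a rough illustration of the idea only (not the authors' implementation), the PyTorch sketch below propagates node features over several powers of the normalized adjacency matrix and fuses the resulting scales with a per-node self-attention weighting. All class, function, and parameter names are illustrative assumptions.

    # Minimal sketch, assuming PyTorch: multi-scale GCN propagation fused by
    # self-attention over scales. Not the paper's exact architecture.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
        """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} (Kipf-Welling style)."""
        adj = adj + torch.eye(adj.size(0), device=adj.device)
        deg = adj.sum(dim=1)
        d_inv_sqrt = deg.pow(-0.5)
        d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0
        return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


    class SelfAttentionMultiScaleGCN(nn.Module):
        """Hypothetical layer: aggregates 1..K-hop propagations with attention."""

        def __init__(self, in_dim: int, out_dim: int, num_scales: int = 3):
            super().__init__()
            self.num_scales = num_scales
            self.linear = nn.Linear(in_dim, out_dim)
            # Shared scorer: assigns each node a weight for every propagation scale.
            self.attn = nn.Linear(out_dim, 1)

        def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
            a_hat = normalize_adjacency(adj)
            h = self.linear(x)                       # (N, out_dim)
            scales = []
            for _ in range(self.num_scales):
                h = a_hat @ h                        # propagate one more hop
                scales.append(h)
            stacked = torch.stack(scales, dim=1)     # (N, num_scales, out_dim)
            scores = self.attn(stacked)              # (N, num_scales, 1)
            weights = F.softmax(scores, dim=1)       # attention over scales
            return (weights * stacked).sum(dim=1)    # (N, out_dim)


    if __name__ == "__main__":
        n, f = 5, 8
        x = torch.randn(n, f)
        adj = (torch.rand(n, n) > 0.5).float()
        adj = ((adj + adj.t()) > 0).float()          # symmetrize
        adj.fill_diagonal_(0)                        # self-loops added in normalization
        out = SelfAttentionMultiScaleGCN(f, 16)(x, adj)
        print(out.shape)                             # torch.Size([5, 16])

The design choice sketched here, learning attention weights over propagation depths rather than fixing them, is one plausible way to combine self-attention with multi-scale information; the paper should be consulted for the frameworks actually proposed.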

The work described in this paper was supported partially by the National Natural Science Foundation of China (11871167, 12271111), Guangdong Basic and Applied Basic Research Foundation (2022A1515011726), Special Support Plan for High-Level Talents of Guangdong Province (2019TQ05X571), Foundation of Guangdong Educational Committee (2019KZDZX1023), Project of Guangdong Province Innovative Team (2020WCXTD011).



Author information


Corresponding author

Correspondence to Jia Cai.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Xiong, Z., Cai, J. (2023). Self-attention Based Multi-scale Graph Convolutional Networks. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13623. Springer, Cham. https://doi.org/10.1007/978-3-031-30105-6_35


  • DOI: https://doi.org/10.1007/978-3-031-30105-6_35


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-30104-9

  • Online ISBN: 978-3-031-30105-6

  • eBook Packages: Computer Science, Computer Science (R0)
