
A new framework for graph neural network with local information diffusion

Published in Applied Intelligence.

Abstract

Graph neural networks have attracted much attention in recent years. In most graph neural networks, the representation of a node is obtained by aggregating its own information with that of its neighbor nodes. However, a common drawback of many existing graph neural networks is that their depth is limited, because node representations become excessively smooth after only a few layers. In this paper, a new graph neural network based on local information diffusion is proposed, in which node representations are extracted by a cascaded architecture of network layers. In each layer, the model first randomly samples some nodes as signal sources and extracts their representations with a convolutional neural network. The representations of the other nodes are then updated by local information diffusion from the signal sources rather than by traditional information aggregation. To further reduce the number of parameters, a parameter-sharing mechanism is introduced: several network layers are grouped into a block, and the layers within a block share the same parameters. In addition, a strategy is proposed to automatically estimate the hyper-parameters of the network. Experiments on several benchmark datasets show that our model can be built deeper than traditional models while also achieving better performance, indicating that it is effective and competitive.
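The layer described above — sample signal sources, transform them, then diffuse their information outward instead of aggregating — can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the 0.5 mixing coefficient, and the single weight matrix standing in for the paper's convolutional feature extractor are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def diffusion_layer(X, A, W, source_frac=0.3):
    """One illustrative diffusion layer (a sketch, not the paper's model).

    X : (n, d) node features
    A : (n, n) binary symmetric adjacency matrix
    W : (d, d) weight matrix (shared by all layers in a block)
    """
    n = X.shape[0]

    # 1) Randomly sample a subset of nodes as signal sources.
    n_src = max(1, int(source_frac * n))
    sources = rng.choice(n, size=n_src, replace=False)

    # 2) Transform the source representations
    #    (a linear map + ReLU stands in for the CNN step).
    H = X.copy()
    H[sources] = np.maximum(X[sources] @ W, 0.0)

    # 3) Diffuse from the sources: each non-source node mixes in
    #    the mean representation of its source neighbors.
    src_mask = np.zeros(n, dtype=bool)
    src_mask[sources] = True
    A_src = A[:, src_mask]                   # edges pointing at sources
    deg = A_src.sum(axis=1, keepdims=True)   # number of source neighbors
    mixed = (A_src @ H[src_mask]) / np.maximum(deg, 1)
    non_src = ~src_mask
    H[non_src] = 0.5 * H[non_src] + 0.5 * mixed[non_src]
    return H

# Tiny usage example on a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
W = rng.normal(size=(8, 8))
H = diffusion_layer(X, A, W)
print(H.shape)  # (4, 8)
```

Stacking several such layers while reusing the same `W` inside a block mirrors the parameter-sharing mechanism the abstract mentions; nodes with no source neighbor simply keep their current representation.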




Author information

Correspondence to Linkai Luo.


About this article


Cite this article

Peng, S., Luo, L. & Peng, H. A new framework for graph neural network with local information diffusion. Appl Intell 52, 10768–10778 (2022). https://doi.org/10.1007/s10489-021-03132-5
