
PairGNNs: enabling graph neural networks with pair-based view

  • Original Article
Neural Computing and Applications

Abstract

Graph neural networks (GNNs) are widely used on graph-structured data and have achieved strong results on downstream tasks such as node classification. Most existing GNN approaches adopt a node-based view and a homophily assumption during learning: they preserve the similarity between a node and its surrounding context by intensifying the low-frequency information between them. The main drawback of this node view is its inability to express compound relationships between nodes. Because adjacent nodes are not discriminated from one another, such models struggle with tasks, e.g., edge classification, that demand the high-frequency signals between connected nodes. To preserve the differences between connected nodes during aggregation, we propose a new view, PairGNNs, which replaces the node with a paired structure (two adjacent nodes) as the primary unit of learning and information aggregation; two paired nodes are neighbors if they share a node. To address the "neighbor explosion" that this design causes, we propose a neighbor sampling strategy that significantly reduces computational complexity by keeping the k paired nodes with minimum degree. We apply this view to three off-the-shelf GNNs: GCN, GAT, and GraphSAGE. Extensive experiments on nine datasets and three downstream tasks show that GNNs with a pair-based view significantly outperform their node-view peers and seven strong baselines on major downstream tasks, especially on datasets with complex structures.
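The pair-based construction described in the abstract can be illustrated with a minimal, self-contained sketch (not the authors' implementation): each edge becomes a paired node, two paired nodes are neighbors when they share an endpoint, and the neighbor explosion is tamed by keeping only the k lowest-degree pair neighbors. The function name `build_pair_graph` and the tie-breaking among equal-degree neighbors are illustrative assumptions.

```python
from collections import defaultdict

def build_pair_graph(edges, k):
    """Sketch of the pair-based view: edges become paired nodes,
    pairs sharing an endpoint are neighbors, and each pair keeps
    only its k minimum-degree pair neighbors (illustrative only)."""
    pairs = [tuple(sorted(e)) for e in edges]
    # Index paired nodes by the original nodes they contain.
    by_node = defaultdict(list)
    for i, (u, v) in enumerate(pairs):
        by_node[u].append(i)
        by_node[v].append(i)
    # Pair-level adjacency: two pairs are neighbors if they share a node.
    nbrs = defaultdict(set)
    for members in by_node.values():
        for i in members:
            for j in members:
                if i != j:
                    nbrs[i].add(j)
    degree = {i: len(nbrs[i]) for i in range(len(pairs))}
    # Sampling strategy: keep the k neighbors with the smallest degree.
    sampled = {
        i: sorted(nbrs[i], key=lambda j: degree[j])[:k]
        for i in range(len(pairs))
    }
    return pairs, sampled
```

For a triangle with a pendant edge, `build_pair_graph([(0, 1), (1, 2), (0, 2), (2, 3)], k=2)` yields four paired nodes, each retaining at most two lowest-degree pair neighbors for aggregation.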




Funding

This work was supported by the National Science Foundation of China (61772473, 62073345 and 62011530148).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Ning Gui.

Ethics declarations

Conflict of interest

The computing resources supporting this work were partially provided by High-Flyer AI. (Hangzhou High-Flyer AI Fundamental Research Co., Ltd.).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (pdf 321 KB)

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yu, C., Deng, G. & Gui, N. PairGNNs: enabling graph neural networks with pair-based view. Neural Comput & Applic 35, 3343–3355 (2023). https://doi.org/10.1007/s00521-022-07817-x
