Abstract
Graph convolutional networks (GCNs) have become the new state of the art for network representation learning. However, most existing methods are single-granular: they analyze the graph at only one granularity and therefore lose abundant multi-granular information. Advanced graph pooling techniques can benefit semi-supervised network representation learning, but capturing multi-granular information through graph pooling on graphs without additional input features remains a great challenge. We propose MGPOOL, a graph node embedding framework. First, inspired by triadic influence learning, we repeatedly coarsen the graph with a 3-clique algorithm: the three nodes of each triangle are merged into a supernode, and these supernodes serve as the key nodes for our graph pooling operations, which preserves local relationships. The resulting sequence of graphs captures consecutive 3-cliques from the finest graph to the coarsest, preserving global structural relationships. Second, we apply an unsupervised single-granular algorithm to the coarsest graph to obtain its node embeddings. Our graph pooling operations then combine these embeddings to generate another graph of the same size as the coarsest graph; this compensates for the fact that a single coarsening run yields only one result, and it expands each node's receptive field to avoid losing high-proximity information. Third, we take the embeddings, the coarsest graph, and the new coarsest graph as the uniform input of MGPOOL and restore the coarsest graph back to the original graph to obtain node embeddings for the original graph. Experimental results on four public datasets, Wiki, Cora, CiteSeer, and DBLP, demonstrate that our method achieves a better Macro-F1 value for node classification and better AUC and AP values for link prediction than the baseline methods.
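The repeated 3-clique coarsening described in the abstract can be sketched as follows. This is a minimal illustration on a plain adjacency-set representation, not the authors' implementation: the function names, the triangle-selection order, and the supernode numbering scheme are all our own assumptions.

```python
def find_triangle(adj):
    """Return one 3-clique (u, v, w) if present, else None."""
    for u in adj:
        for v in adj[u]:
            for w in adj[u]:
                if v < w and w in adj[v]:
                    return (u, v, w)
    return None

def contract(adj, nodes, supernode):
    """Merge the given nodes into a single supernode, keeping outside edges."""
    neighbors = set()
    for n in nodes:
        neighbors |= adj.pop(n)
    neighbors -= set(nodes)
    for m in neighbors:
        adj[m] -= set(nodes)
        adj[m].add(supernode)
    adj[supernode] = neighbors
    return adj

def coarsen(adj):
    """Repeatedly contract 3-cliques until none remain.

    Returns the coarsened graph and a mapping from each supernode
    back to the triangle it replaced, which a framework like MGPOOL
    would need in order to restore the original graph later.
    """
    mapping = {}
    next_id = max(adj) + 1
    tri = find_triangle(adj)
    while tri is not None:
        contract(adj, tri, next_id)
        mapping[next_id] = tri
        next_id += 1
        tri = find_triangle(adj)
    return adj, mapping

# A triangle {0, 1, 2} with a pendant node 3 collapses to two nodes.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
coarse, mapping = coarsen(adj)
```

Applying `coarsen` level by level yields the chain of graphs from finest to coarsest that the method operates on; the recorded mapping is what makes the final restoration step possible.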


Notes
https://github.com/GTmac/HARP/tree/master/example_networks/DBLP.
https://github.com/GTmac/HARP/tree/master/example_networks/citeseer.
Acknowledgements
The authors would like to thank the National Natural Science Foundation of China (Grants #61876001, #61602003, and #61673020), the Suzhou Scientific and Technological Research Projects (Grant #SZ2018GG04), the Provincial Universities Academic Program for Excellent Young Talents at Home and Abroad (Grant #gxgnfx2018052), the Key Research Projects of the Anhui Provincial Education Department (Grants #KJ2019A1001 and #KJ2019A0668), the Open Research Fund of the National Engineering Research Center (Grant #AE2019010), and the Key Research and Development Projects of Anhui Province (Grants #202004a05020043, #202004a06020045, and #202004b11020023) for their support.
Xin, Z., Chen, G., Chen, J. et al. MGPOOL: multi-granular graph pooling convolutional networks representation learning. Int. J. Mach. Learn. & Cyber. 13, 783–796 (2022). https://doi.org/10.1007/s13042-021-01328-2