
Graph Convolutional Network Based on Manifold Similarity Learning

Published in: Cognitive Computation

Abstract

Deep graph-based convolutional neural networks are widely applied to large-scale graph data representation and semi-supervised learning. However, a typical graph convolutional network (GCN) aggregates information from neighbor nodes using binary neighborhood similarity (the adjacency matrix): it treats all neighbors of a node equally and does not suppress the influence of dissimilar neighbors. In this paper, we investigate GCNs built on a similarity matrix rather than the adjacency matrix of the graph nodes. Gaussian heat-kernel similarity in Euclidean space is adopted first, yielding a model named EGCN. Then a biologically inspired manifold similarity is learned in a reproducing kernel Hilbert space (RKHS), based on which a manifold GCN (named MGCN) is proposed for graph data representation and semi-supervised learning with four different kernel types. The proposed method is evaluated in extensive experiments on four benchmark document citation network datasets. The objective function of manifold similarity learning converges quickly on the different datasets under the various kernel functions, and the method is competitive with state-of-the-art methods in graph node recognition accuracy. In particular, the recognition rates of MGCN (Gaussian kernel) and MGCN (polynomial kernel) exceed that of the typical GCN by about 3.8% on the Cora dataset, 3.5% on the Citeseer dataset, 1.3% on the Pubmed dataset and 4% on the Cora_ML dataset, respectively. Although the proposed MGCN is relatively simple and easy to implement, it discovers local manifold structure through manifold similarity learning and suppresses the influence of dissimilar neighbor nodes, which demonstrates its effectiveness.
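The core idea of the abstract — replacing the binary adjacency matrix with a similarity matrix inside the GCN propagation rule — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the single-layer setup, and the use of a Gaussian heat kernel restricted to graph edges (the EGCN variant described above) are illustrative assumptions.

```python
import numpy as np

def gaussian_similarity(X, adj, sigma=1.0):
    """Gaussian heat-kernel similarity in Euclidean space, masked by the
    adjacency matrix so that only existing edges receive a weight.
    Dissimilar neighbors get exponentially small weights instead of 1."""
    diff = X[:, None, :] - X[None, :, :]          # pairwise feature differences
    dist2 = (diff ** 2).sum(axis=-1)              # squared Euclidean distances
    S = np.exp(-dist2 / (2.0 * sigma ** 2)) * adj # keep only neighbor pairs
    return S + np.eye(len(X))                     # add self-loops, as in GCN

def gcn_layer(H, S, W):
    """One propagation step with symmetric normalization,
    H' = relu(D^{-1/2} S D^{-1/2} H W), where D is the degree matrix of S."""
    d_inv_sqrt = np.diag(1.0 / np.sqrt(S.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ S @ d_inv_sqrt @ H @ W, 0.0)

# Toy example: a 4-node path graph with 3-dimensional node features.
rng = np.random.RandomState(0)
X = rng.randn(4, 3)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
S = gaussian_similarity(X, adj)
H1 = gcn_layer(X, S, rng.randn(3, 2))             # hidden representation, shape (4, 2)
```

The standard GCN is recovered by passing `adj + I` in place of `S`; the manifold variant (MGCN) would instead learn the similarity matrix in an RKHS rather than fixing it with a Euclidean heat kernel.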



Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (No. 61976004) and in part by the Shenzhen Science & Research Project under Grant JCYJ20170817155939233.

Author information

Corresponding author

Correspondence to Si-Bao Chen.

Ethics declarations

Conflicts of Interest

Si-Bao Chen, Xiu-Zhi Tian, Chris H.Q. Ding, Bin Luo, Yi Liu, Hao Huang and Qiang Li declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Declaration of Helsinki 1975, as revised in 2008 (5). Additional informed consent was obtained from all patients for which identifying information is included in this article.

Human and Animal Rights

This article does not contain any studies with human participants or animals performed by any of the authors.


About this article


Cite this article

Chen, SB., Tian, XZ., Ding, C.H.Q. et al. Graph Convolutional Network Based on Manifold Similarity Learning. Cogn Comput 12, 1144–1153 (2020). https://doi.org/10.1007/s12559-020-09788-4
