Abstract
Graphs are a common and important data structure; networks such as the Internet and social networks are naturally represented as graphs. The introduction of the Graph Convolutional Network (GCN) brought graph research into the era of deep learning and achieved better results than traditional methods on a variety of tasks. For ordinary neural networks, adding layers often improves results. For GCNs, however, increasing depth causes a catastrophic drop in performance: node features gradually become indistinguishable, gradients vanish, and weights can no longer be updated. This phenomenon is called over-smoothing, and it makes training deep GCNs difficult; shallow GCNs tend to outperform deep ones. We therefore design a contrastive learning model in which the deep GCN learns to match the shallow GCN's features of the same node (positive samples) while pushing away the features of other nodes (negative samples), so that the deep GCN can attain the performance of the shallow GCN. Experiments show that our method effectively alleviates over-smoothing. Applying the model to other anti-over-smoothing methods also yields improved results.
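The cross-model objective described above can be sketched as an InfoNCE-style contrastive loss: for each node, the deep GCN's embedding is pulled toward the shallow GCN's embedding of the same node (positive pair) and pushed away from the shallow embeddings of other nodes (negative pairs). This is a minimal illustration under assumed details; the function name, the use of cosine similarity, and the temperature value are illustrative and not taken from the paper.

```python
import numpy as np

def contrastive_loss(deep_emb: np.ndarray,
                     shallow_emb: np.ndarray,
                     temperature: float = 0.5) -> float:
    """InfoNCE-style loss between row-aligned deep/shallow node embeddings.

    deep_emb, shallow_emb: (num_nodes, dim) arrays; row i of each array
    corresponds to the same node i.
    """
    # Cosine similarity between every (deep node, shallow node) pair.
    d = deep_emb / np.linalg.norm(deep_emb, axis=1, keepdims=True)
    s = shallow_emb / np.linalg.norm(shallow_emb, axis=1, keepdims=True)
    sim = d @ s.T / temperature          # (N, N) scaled similarity matrix

    # For node i, sim[i, i] is the positive; sim[i, j != i] are negatives.
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))    # -log softmax of positives

# Toy check: perfectly aligned embeddings incur a lower loss than mismatched ones.
rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
aligned = contrastive_loss(z, z)
mismatched = contrastive_loss(z, rng.normal(size=(8, 16)))
assert aligned < mismatched
```

In practice this loss would be added to the deep GCN's supervised training objective, with the shallow GCN's embeddings treated as targets, so the gradient shapes only the deep model's representations.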
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Jin, R., Zhan, Y., Zhang, R. (2022). Deep Graph Convolutional Networks Based on Contrastive Learning: Alleviating Over-smoothing Phenomenon. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science(), vol 13606. Springer, Cham. https://doi.org/10.1007/978-3-031-20503-3_12
Print ISBN: 978-3-031-20502-6
Online ISBN: 978-3-031-20503-3