Abstract
Graph neural networks (GNNs) have achieved remarkable success in a wide range of tasks by encoding features together with topology to create effective representations. However, the fundamental question of how graph topology influences the performance of learning models on downstream tasks remains poorly understood. In this paper, we propose a metric, TopoInf, which characterizes the influence of graph topology by measuring the compatibility between the topological information of graph data and downstream task objectives. We analyze decoupled GNNs on the contextual stochastic block model to demonstrate the effectiveness of the metric. Through extensive experiments, we show that TopoInf is an effective measure of topological influence on the corresponding tasks and can be further leveraged to enhance graph learning.
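As background for the decoupled GNNs mentioned in the abstract, the sketch below shows the standard decoupled pipeline (SGC-style): smooth node features with a fixed normalized adjacency for k hops, then fit any classifier on the result. This is a minimal illustration only, not the paper's TopoInf metric; the toy graph, features, and `k_hops` are assumptions made up for the example.

```python
import numpy as np

# Hypothetical toy graph: 4 nodes, 2-dimensional features (illustrative only).
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
n, k_hops = 4, 2

# Symmetrically normalized adjacency with self-loops,
# A_hat = D^{-1/2} (A + I) D^{-1/2}, as used by SGC-style decoupled GNNs.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A += np.eye(n)
d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
A_hat = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# Decoupled propagation: smooth features k times, *then* train any classifier
# (e.g. logistic regression) on X_smoothed -- propagation and learning decouple.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.1, 0.9], [0.0, 1.0]])
X_smoothed = np.linalg.matrix_power(A_hat, k_hops) @ X

print(X_smoothed.shape)  # (4, 2)
```

Because the propagation step has no trainable parameters, the topology's effect on the task is isolated in the map from X to X_smoothed, which is the setting in which a topology-compatibility metric like TopoInf is analyzed.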
Acknowledgment
This work was supported by the National Key Research and Development Plan No. 2022YFB3904204, NSF China under Grant Nos. 62202299, 62020106005, and 61960206002, and Shanghai Natural Science Foundation No. 22ZR1429100.
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Wu, K. et al. (2025). Characterizing the Influence of Topology on Graph Learning Tasks. In: Onizuka, M., et al. Database Systems for Advanced Applications. DASFAA 2024. Lecture Notes in Computer Science, vol 14851. Springer, Singapore. https://doi.org/10.1007/978-981-97-5779-4_3
DOI: https://doi.org/10.1007/978-981-97-5779-4_3
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5778-7
Online ISBN: 978-981-97-5779-4
eBook Packages: Computer Science (R0)