Abstract
Graph neural networks (GNNs) have recently exhibited strong expressive power in modeling graph-structured data and have proven effective for graph classification tasks. However, existing GNN models for predicting graph categories frequently neglect hierarchical graph information or fail to accurately capture graph substructures, resulting in significant performance degradation. In this paper, we propose Hierarchical Graph Representation Learning (HGRL), a multi-level framework that captures hierarchical local and global topological structures to enrich graph representations. Specifically, we employ a structural coarsening module that generates a series of coarsened graphs for each input graph, followed by a graph encoder that preserves local structural information. Furthermore, graph convolutional networks are stacked to capture high-order proximity in graphs, and we incorporate an attention mechanism for whole-graph embedding, enabling the framework to focus on critical nodes that contribute most to the learned low-dimensional graph representations used in subsequent graph classification. Experimental results on multiple benchmark datasets demonstrate that HGRL substantially improves classification accuracy and outperforms existing state-of-the-art graph classification approaches.
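The abstract describes two core operations: collapsing a graph into a coarser one via a structural coarsening module, and pooling node embeddings into a graph-level embedding with node-wise attention. The sketch below illustrates both in plain NumPy; it is an illustrative reading of those two ideas, not the paper's actual implementation, and the pairwise `matching` scheme and the single learned score vector `w` are simplifying assumptions.

```python
import numpy as np

def attention_readout(H, w):
    """Attention-weighted graph readout.

    H: (n, d) node embeddings; w: (d,) learned scoring vector.
    Each node gets a softmax-normalized importance weight, and the
    graph embedding is the weighted sum of node embeddings.
    """
    scores = H @ w                                  # (n,) raw node scores
    scores = scores - scores.max()                  # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()   # attention weights, sum to 1
    return alpha @ H                                # (d,) graph-level embedding

def coarsen(A, H, matching):
    """Collapse matched nodes into super-nodes (hypothetical scheme).

    A: (n, n) adjacency; H: (n, d) node features;
    matching[i] = index of the super-node that node i is merged into.
    Returns the coarsened adjacency and mean-pooled super-node features.
    """
    n, n_coarse = len(matching), max(matching) + 1
    P = np.zeros((n, n_coarse))                     # hard assignment matrix
    for i, c in enumerate(matching):
        P[i, c] = 1.0
    A_c = P.T @ A @ P                               # coarsened adjacency
    H_c = P.T @ H / P.sum(axis=0)[:, None]          # average features per cluster
    return A_c, H_c
```

Applying `coarsen` repeatedly yields the series of progressively smaller graphs the framework encodes at each level, and `attention_readout` turns any level's node embeddings into a fixed-size vector for the classifier.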
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Yu, B., Xu, X., Wen, C., Xie, Y., Zhang, C. (2022). Hierarchical Graph Representation Learning with Structural Attention for Graph Classification. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science(), vol 13605. Springer, Cham. https://doi.org/10.1007/978-3-031-20500-2_39
Print ISBN: 978-3-031-20499-9
Online ISBN: 978-3-031-20500-2