Hyperbolic Graph Neural Networks: A Tutorial on Methods and Applications

ABSTRACT
Graph Neural Networks (GNNs) generalize conventional neural networks to graph-structured data and have received considerable attention owing to their impressive performance. Despite these notable successes, the performance of Euclidean models is inherently bounded by the representation ability of Euclidean geometry, especially on datasets with a highly non-Euclidean latent structure. Recently, hyperbolic spaces have emerged as a promising alternative for processing graph data with tree-like structures or power-law distributions, and a surge of work on both methods and novel applications has followed. Unlike Euclidean space, whose volume grows polynomially with radius, hyperbolic space grows exponentially, making it better suited to modeling complex real-world data. It therefore gains natural advantages in embedding tree-like graphs with a hierarchical organization or a power-law degree distribution.
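The growth claim above can be made concrete with closed-form areas: a geodesic disk of radius r has area πr² in the Euclidean plane but 2π(cosh r − 1) in the hyperbolic plane of curvature −1, which grows like eʳ — the same rate at which the number of nodes in a balanced tree grows with depth. A minimal numerical sketch (the helper names and sample radii are our own illustration, not from the tutorial):

```python
import math

def euclidean_area(r):
    # Area of a disk of radius r in the Euclidean plane: polynomial in r.
    return math.pi * r ** 2

def hyperbolic_area(r):
    # Area of a geodesic disk of radius r in the hyperbolic plane
    # (constant curvature -1): grows exponentially, like e^r.
    return 2 * math.pi * (math.cosh(r) - 1)

for r in [1.0, 5.0, 10.0]:
    # For small r the two areas nearly agree; for large r the
    # hyperbolic area dwarfs the Euclidean one.
    print(f"r={r:4.1f}  euclidean={euclidean_area(r):12.2f}  "
          f"hyperbolic={hyperbolic_area(r):12.2f}")
```

This is why tree-like data embeds with low distortion in hyperbolic space: the volume available at distance r from a point keeps pace with the exponentially growing number of tree nodes at depth r.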
To support the burgeoning interest in Hyperbolic Graph Neural Networks (HGNNs), the primary goal of this tutorial is to give a systematic review of the methods, applications, and challenges in this fast-growing and vibrant area, with the express purpose of being accessible to all audiences. More specifically, we first give a brief introduction to graph neural networks as well as the preliminaries of Riemannian manifolds and hyperbolic geometry. We then comprehensively revisit the technical details of existing HGNNs, unifying them into a general framework and summarizing the variants of each component. In addition, we introduce applications deployed in a variety of fields. Finally, we discuss several challenges and present potential solutions to address them, including some initial attempts of our own, which may pave the way for the further flourishing of the research community.
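A common pattern into which many HGNNs can be unified is tangent-space aggregation: lift hyperbolic embeddings to the tangent space at the origin via the logarithmic map, perform Euclidean message passing there, and project back with the exponential map. A hedged sketch on the Poincaré ball of curvature −1; the function names, toy adjacency, and random features below are our own illustration and not code from the tutorial:

```python
import numpy as np

def log0(x, eps=1e-9):
    # Logarithmic map at the origin of the Poincare ball (curvature -1):
    # lifts points inside the unit ball to the tangent space at 0.
    n = np.linalg.norm(x, axis=-1, keepdims=True)
    return np.arctanh(np.clip(n, 0.0, 1.0 - eps)) * x / np.maximum(n, eps)

def exp0(v, eps=1e-9):
    # Exponential map at the origin: projects a tangent vector back to
    # the ball; tanh(|v|) < 1 keeps the result strictly inside it.
    n = np.linalg.norm(v, axis=-1, keepdims=True)
    return np.tanh(n) * v / np.maximum(n, eps)

def hgnn_layer(X, A, W):
    # One tangent-space HGNN layer:
    # log-map -> Euclidean aggregate/transform -> exp-map.
    H = log0(X)        # lift hyperbolic node embeddings to the tangent space
    H = A @ H @ W      # neighborhood aggregation + linear transform
    return exp0(H)     # map aggregated features back onto the ball

rng = np.random.default_rng(0)
X = exp0(rng.normal(scale=0.1, size=(4, 3)))  # 4 nodes on the Poincare ball
A = np.eye(4) * 0.5 + 0.125                   # toy row-normalized adjacency with self-loops
W = rng.normal(scale=0.1, size=(3, 3))
Y = hgnn_layer(X, A, W)
```

Individual HGNNs then differ in which component they vary: the choice of manifold model (Poincaré ball vs. Lorentz/hyperboloid), where the maps are taken, and how attention or nonlinearities are applied.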