Abstract
Graph neural networks (GNNs) have emerged as a powerful tool for mining and learning from graph-structured data, with applications spanning numerous domains. However, most research focuses on static graphs, neglecting the dynamic nature of real-world networks, where topologies and attributes evolve over time. By integrating sequence modeling modules into traditional GNN architectures, dynamic GNNs aim to bridge this gap, capturing the inherent temporal dependencies of dynamic graphs for a more faithful depiction of complex networks. This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models. We describe mainstream dynamic GNN models in detail and categorize them according to how temporal information is incorporated. We also discuss large-scale dynamic GNNs and pre-training techniques. Although dynamic GNNs have shown superior performance, challenges remain in scalability, in handling heterogeneous information, and in the lack of diverse graph datasets. The paper also discusses possible future directions, such as adaptive and memory-enhanced models, inductive learning, and theoretical analysis.
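The abstract's core architectural idea, combining a structural GNN module with a sequence model, can be illustrated with a minimal toy sketch. This is not any specific model from the survey: it applies a GCN-style layer to each graph snapshot and feeds the result into a GRU cell that carries node state across snapshots; all weights, dimensions, and the random snapshots are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gcn_layer(A, X, W):
    """One spatial message-passing step with symmetric normalization."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # degrees are >= 1
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.tanh(A_norm @ X @ W)

def gru_step(h_prev, x, params):
    """Minimal GRU cell: fuses the current snapshot embedding with history."""
    Wz, Wr, Wh = params
    hx = np.concatenate([h_prev, x], axis=1)
    z = sigmoid(hx @ Wz)                           # update gate
    r = sigmoid(hx @ Wr)                           # reset gate
    h_tilde = np.tanh(np.concatenate([r * h_prev, x], axis=1) @ Wh)
    return (1 - z) * h_prev + z * h_tilde

n, f, d = 5, 4, 8                                  # nodes, feature dim, hidden dim
Wg = rng.normal(size=(f, d)) * 0.1                 # GCN weight
params = tuple(rng.normal(size=(2 * d, d)) * 0.1 for _ in range(3))

# three random snapshots: fixed node set, changing edges and features
snapshots = [(rng.integers(0, 2, size=(n, n)), rng.normal(size=(n, f)))
             for _ in range(3)]

h = np.zeros((n, d))                               # per-node temporal state
for A, X in snapshots:
    A = np.triu(A, 1)
    A = A + A.T                                    # undirected, no self-loops
    h = gru_step(h, gcn_layer(A, X, Wg), params)   # spatial then temporal update

print(h.shape)                                     # final node embeddings: (5, 8)
```

Discrete-time models in the survey's taxonomy (e.g., snapshot-based approaches) follow this spatial-then-temporal pattern; continuous-time models replace the snapshot loop with event-driven updates.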
Acknowledgements
This research was supported in part by the National Science and Technology Major Project (2022ZD0114802), by the National Natural Science Foundation of China (Grant Nos. U2241212, 61932001), by the Beijing Natural Science Foundation (No. 4222028), by the Beijing Outstanding Young Scientist Program (No. BJJWZYJH012019100020098), by Alibaba Group through the Alibaba Innovative Research Program, and by the Huawei-Renmin University joint program on Information Retrieval. We also wish to acknowledge the support provided by the fund for building world-class universities (disciplines) of Renmin University of China, by the Engineering Research Center of Next-Generation Intelligent Search and Recommendation, Ministry of Education, the Intelligent Social Governance Interdisciplinary Platform, the Major Innovation & Planning Interdisciplinary Platform for the "Double-First Class" Initiative, the Public Policy and Decision-making Research Lab, and Public Computing Cloud, Renmin University of China. The work was partially done at the Beijing Key Laboratory of Big Data Management and Analysis Methods, the MOE Key Lab of Data Engineering and Knowledge Engineering, and Pazhou Laboratory (Huangpu), Guangzhou, Guangdong 510555, China.
Author information
Ethics declarations
Competing interests: The authors declare that they have no competing interests or financial conflicts to disclose.
Additional information
Yanping Zheng is a PhD candidate at the Gaoling School of Artificial Intelligence, Renmin University of China, China, advised by Professor Zhewei Wei. She received her master's degree in engineering from Beijing Technology and Business University, China in 2020. Her research focuses on graph learning algorithms. She is particularly interested in efficient algorithms for graph neural networks and dynamic graph representation learning.
Lu Yi is currently a PhD student at the Gaoling School of Artificial Intelligence, Renmin University of China, China, advised by Professor Zhewei Wei. She received her B.E. degree in Computer Science and Technology from the School of Computer Science, Beijing University of Posts and Telecommunications, China in June 2022. Her research interests lie in the field of graph-related machine learning and efficient graph algorithms.
Zhewei Wei is currently a professor at the Gaoling School of Artificial Intelligence, Renmin University of China, China. He obtained his PhD degree from the Department of Computer Science and Engineering, The Hong Kong University of Science and Technology (HKUST), China in 2012. He received his BSc degree from the School of Mathematical Sciences at Peking University, China in 2008. His research interests include graph algorithms, massive data algorithms, and streaming algorithms. He was the Proceedings Chair of SIGMOD/PODS 2020 and ICDT 2021, and an Area Chair of ICML 2022/2023, NeurIPS 2022/2023, ICLR 2023, and WWW 2023. He is also a PC member of various top conferences, such as VLDB, KDD, ICDE, ICML, and NeurIPS.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Zheng, Y., Yi, L. & Wei, Z. A survey of dynamic graph neural networks. Front. Comput. Sci. 19, 196323 (2025). https://doi.org/10.1007/s11704-024-3853-2