Lecture-style Tutorial: Towards Graph Foundation Models
Pages 1264–1267
Abstract
Emerging as fundamental building blocks for diverse artificial intelligence applications, foundation models have achieved notable success in natural language processing and many other domains. Concurrently, graph machine learning has gradually evolved from shallow methods to deep models in order to leverage the abundant graph-structured data that constitute an important pillar of the data ecosystem for artificial intelligence. Naturally, the emergence and homogenization capabilities of foundation models have piqued the interest of graph machine learning researchers, sparking discussions about a next-generation graph learning paradigm: one that is pre-trained on broad graph data and can be adapted to a wide range of downstream graph-based tasks. However, there is currently no clear definition or systematic analysis of this line of work.
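To make the "pre-train, then adapt" paradigm concrete, the sketch below pre-trains a toy graph encoder with a simple self-supervised objective and then attaches a task head for downstream adaptation. The toy GNN, the feature-noise augmentation, and all dimensions are illustrative assumptions chosen for exposition, not the specific techniques surveyed in this tutorial.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyGNN(nn.Module):
    """One round of mean-aggregation message passing over a dense adjacency matrix."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)  # avoid division by zero
        return F.relu(self.lin((adj / deg) @ x))         # average neighbors, then transform

def pretrain_step(encoder, x, adj, optimizer):
    """One self-supervised step: pull together embeddings of a graph and a perturbed view."""
    x_aug = x + 0.1 * torch.randn_like(x)                # feature-noise augmentation (assumed)
    z1, z2 = encoder(x, adj), encoder(x_aug, adj)
    loss = 1.0 - F.cosine_similarity(z1, z2, dim=-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Pre-train on a random toy graph standing in for "broad graph data".
n, d, h, num_classes = 8, 16, 32, 3
x = torch.randn(n, d)
adj = (torch.rand(n, n) < 0.3).float()
encoder = ToyGNN(d, h)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for _ in range(5):
    pretrain_step(encoder, x, adj, opt)

# Adapt: freeze the pre-trained encoder and attach a small task head, which
# would then be trained on a downstream node-classification task.
head = nn.Linear(h, num_classes)
with torch.no_grad():
    z = encoder(x, adj)
logits = head(z)  # shape (n, num_classes)
```

In practice, GFM pre-training draws on far richer self-supervised objectives over broad graph corpora; see [5] for a survey of graph self-supervised learning.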
In this tutorial, we introduce the concept of graph foundation models (GFMs) and provide a comprehensive exposition of their key characteristics and underlying technologies. We then thoroughly review existing works that lay the groundwork for GFMs, summarized into three primary categories according to whether they are rooted in graph neural networks, large language models, or a hybrid of the two. Beyond offering a comprehensive overview and in-depth analysis of the current landscape and progress towards graph foundation models, this tutorial also explores potential avenues for future research in this important and dynamic field. Finally, to help the audience gain a systematic understanding of the topics covered, we present further details in our recent preprint, "Towards Graph Foundation Models: A Survey and Beyond" [4], available at https://arxiv.org/pdf/2310.11829.pdf.
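As a taste of the LLM-rooted category, works such as [2] and [8] study empirically whether language models can reason over graphs verbalized into natural language. A minimal sketch of such graph verbalization follows; the prompt template is an illustrative assumption, and `query_llm` is a hypothetical stand-in for whatever LLM API one uses.

```python
def graph_to_prompt(edges, question):
    """Flatten an edge list into a textual graph description followed by a task question."""
    edge_text = ", ".join(f"({u}, {v})" for u, v in edges)
    return (f"You are given an undirected graph with edges: {edge_text}. "
            f"{question}")

edges = [(0, 1), (1, 2), (2, 3)]
prompt = graph_to_prompt(edges, "Is there a path from node 0 to node 3?")
print(prompt)
# The prompt would then be sent to a language model of choice, e.g.:
# answer = query_llm(prompt)   # hypothetical call, not a real API
```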
Supplemental Material
- Presentation video - Tutorial (MP4, 1665.16 MB)
- Supplemental video (MP4, 24.86 MB)
References
[1] Rishi Bommasani, Drew A. Hudson, Ehsan Adeli, Russ Altman, Simran Arora, Sydney von Arx, Michael S. Bernstein, Jeannette Bohg, Antoine Bosselut, Emma Brunskill, et al. 2021. On the opportunities and risks of foundation models. arXiv preprint arXiv:2108.07258 (2021).
[2] Jiayan Guo, Lun Du, and Hengyu Liu. 2023. GPT4Graph: Can Large Language Models Understand Graph Structured Data? An Empirical Evaluation and Benchmarking. arXiv preprint arXiv:2305.15066 (2023).
[3] Tianyang Lin, Yuxin Wang, Xiangyang Liu, and Xipeng Qiu. 2022. A survey of transformers. AI Open (2022).
[4] Jiawei Liu, Cheng Yang, Zhiyuan Lu, Junze Chen, Yibo Li, Mengmei Zhang, Ting Bai, Yuan Fang, Lichao Sun, Philip S. Yu, et al. 2023. Towards Graph Foundation Models: A Survey and Beyond. arXiv preprint arXiv:2310.11829 (2023).
[5] Yixin Liu, Ming Jin, Shirui Pan, Chuan Zhou, Yu Zheng, Feng Xia, and Philip S. Yu. 2022. Graph self-supervised learning: A survey. IEEE Transactions on Knowledge and Data Engineering, Vol. 35, 6 (2022), 5879--5900.
[6] Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, and Xuanjing Huang. 2020. Pre-trained models for natural language processing: A survey. Science China Technological Sciences, Vol. 63, 10 (2020), 1872--1897.
[7] Chuan Shi, Yitong Li, Jiawei Zhang, Yizhou Sun, and Philip S. Yu. 2016. A survey of heterogeneous information network analysis. IEEE Transactions on Knowledge and Data Engineering, Vol. 29, 1 (2016), 17--37.
[8] Heng Wang, Shangbin Feng, Tianxing He, Zhaoxuan Tan, Xiaochuang Han, and Yulia Tsvetkov. 2023. Can Language Models Solve Graph Problems in Natural Language? arXiv preprint arXiv:2305.10037 (2023).
[9] Yang Yuan. 2023. On the power of foundation models. In International Conference on Machine Learning. PMLR, 40519--40530.
[10] Ziwei Zhang, Haoyang Li, Zeyang Zhang, Yijian Qin, Xin Wang, and Wenwu Zhu. 2023. Large Graph Models: A Perspective. arXiv preprint arXiv:2308.14522 (2023).
Published In
WWW '24: Companion Proceedings of the ACM Web Conference 2024
May 2024, 1928 pages
ISBN: 9798400701726
DOI: 10.1145/3589335
General Chairs: Tat-Seng Chua, Chong-Wah Ngo
Program Chairs: Ravi Kumar, Hady W. Lauw, Roy Ka-Wei Lee
Copyright © 2024 ACM.
Publisher
Association for Computing Machinery, New York, NY, United States
Publication History
Published: 13 May 2024
Qualifiers
- Tutorial
Conference
WWW '24
Acceptance Rates
Overall Acceptance Rate: 1,899 of 8,196 submissions (23%)