Abstract
Temporal heterogeneous graphs can model many complex real-world systems, such as social networks and e-commerce platforms, which are naturally time-varying and heterogeneous. Because most existing graph representation learning methods cannot efficiently handle both characteristics, we propose a Transformer-like representation learning model, named THAN, that learns low-dimensional node embeddings simultaneously preserving the topological structure, heterogeneous semantics, and dynamic evolutionary patterns of temporal heterogeneous graphs. Specifically, THAN first samples heterogeneous neighbors under temporal constraints and projects node features into a common vector space; it then encodes time information and aggregates neighborhood influence with different weights via type-aware self-attention. Experiments on three real-world datasets demonstrate that THAN outperforms state-of-the-art methods on the temporal link prediction task.
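The aggregation step the abstract describes — a functional time encoding combined with type-specific projections and self-attention over temporal neighbors — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimension, the frequency schedule of the time encoding, and the node-type names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def time_encode(dt, dim):
    # Time2Vec-style functional encoding of the time gap dt:
    # cosines at geometrically spaced frequencies (assumed schedule)
    freqs = 1.0 / 10 ** np.linspace(0, 4, dim)
    return np.cos(dt * freqs)

def type_aware_attention(target, neighbors, types, dts, proj):
    """Aggregate neighbor embeddings with per-type projection
    matrices and scaled dot-product attention against the target."""
    # Each neighbor key: project [features || time encoding] with its
    # node-type-specific matrix, so types contribute in different weights.
    keys = np.stack([proj[t] @ np.concatenate([h, time_encode(dt, d)])
                     for h, t, dt in zip(neighbors, types, dts)])
    scores = keys @ target / np.sqrt(d)          # scaled dot products
    attn = np.exp(scores - scores.max())         # numerically stable softmax
    attn /= attn.sum()
    return attn @ keys                           # attention-weighted sum

# Hypothetical toy neighborhood: 3 temporal neighbors of 2 node types.
proj = {t: rng.normal(size=(d, 2 * d)) * 0.1 for t in ("user", "item")}
target = rng.normal(size=d)
neighbors = [rng.normal(size=d) for _ in range(3)]
out = type_aware_attention(target, neighbors,
                           ["user", "item", "user"], [1.0, 3.5, 0.2], proj)
print(out.shape)  # (8,)
```

In the actual model the projections and attention parameters are learned end-to-end; here they are random placeholders that only demonstrate the data flow from temporally constrained neighbors to a single aggregated embedding.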
This work was supported in part by the National Key Research and Development Program of China (2018YFB0704301-1), the National Natural Science Foundation of China (61972268), the Med-X Center for Informatics Funding Project (YGJC001).
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Li, L. et al. (2023). Transformer-Based Representation Learning on Temporal Heterogeneous Graphs. In: Li, B., Yue, L., Tao, C., Han, X., Calvanese, D., Amagasa, T. (eds) Web and Big Data. APWeb-WAIM 2022. Lecture Notes in Computer Science, vol 13422. Springer, Cham. https://doi.org/10.1007/978-3-031-25198-6_29
Print ISBN: 978-3-031-25197-9
Online ISBN: 978-3-031-25198-6