Research Article

DNformer: Temporal Link Prediction with Transfer Learning in Dynamic Networks

Published: 22 February 2023

Abstract

Temporal link prediction (TLP) is among the most important graph learning tasks, as it predicts dynamic, time-varying links within networks. The key problem in TLP is how to uncover the evolving tendency of links from the growing number of links over time. Solving this problem involves three major challenges: temporal nonlinear sparsity, weak serial correlation, and discontinuous structural dynamics. In this article, we propose a novel transfer learning model, called DNformer, to predict temporal link sequences in dynamic networks. The dynamic structural evolution is serialized into consecutive links, one by one over time, to mitigate temporal nonlinear sparsity. The model's self-attention captures the serial correlation between the input and output link sequences. Moreover, a structural encoding is designed to extract changing structures from the consecutive links and to learn the mapping between link sequences. This structural encoding consists of two parts: a node clustering encoding of each link and a link similarity encoding between links. These encodings enable the model to perceive the importance and correlation of links. Furthermore, we introduce a structural-similarity measure into the loss function to account for structural differences between link sequences. The experimental results demonstrate that our model outperforms state-of-the-art TLP methods such as Transformer, TGAT, and EvolveGCN, achieving the highest AUC on three and the highest precision on four of five representative dynamic networks.
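To make the method concrete, the sketch below shows one way the two structural encodings described above (a node clustering encoding per link and a link similarity encoding between consecutive links) could be combined with self-attention over a link sequence to score the next link. This is a minimal PyTorch illustration based only on the abstract, not the authors' implementation: the class and parameter names (LinkSequenceModel, d_model, the factorized output head) and the assumption that cluster ids and link similarities are precomputed are illustrative.

```python
# Minimal sketch of a DNformer-style link-sequence model (illustrative only;
# not the authors' code). Cluster ids and link similarities are assumed to be
# precomputed, e.g. by community detection and neighborhood overlap.
import torch
import torch.nn as nn

class LinkSequenceModel(nn.Module):
    def __init__(self, num_nodes, num_clusters, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Each link (u, v) is embedded through its two endpoint nodes.
        self.node_emb = nn.Embedding(num_nodes, d_model // 2)
        # Node clustering encoding: cluster ids of the link's endpoints.
        self.cluster_emb = nn.Embedding(num_clusters, d_model // 2)
        # Link similarity encoding: scalar similarity of a link to its
        # predecessor in the sequence, projected to the model dimension.
        self.sim_proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Score the next link, factorized here as two endpoint distributions.
        self.out = nn.Linear(d_model, 2 * num_nodes)

    def forward(self, links, clusters, link_sim):
        # links:    (batch, seq_len, 2) endpoint node ids of each link
        # clusters: (batch, seq_len, 2) cluster ids of the two endpoints
        # link_sim: (batch, seq_len, 1) similarity of each link to the previous one
        h = (self.node_emb(links).flatten(-2)          # link embedding
             + self.cluster_emb(clusters).flatten(-2)  # node clustering encoding
             + self.sim_proj(link_sim))                # link similarity encoding
        h = self.encoder(h)                            # self-attention over the sequence
        return self.out(h[:, -1])                      # scores for the next link

# Toy usage with random data. The structural-similarity term of the loss
# mentioned in the abstract would be added to a standard prediction loss here;
# it is omitted because its exact form is not given in the abstract.
model = LinkSequenceModel(num_nodes=100, num_clusters=8)
links = torch.randint(0, 100, (4, 16, 2))
clusters = torch.randint(0, 8, (4, 16, 2))
link_sim = torch.rand(4, 16, 1)
scores = model(links, clusters, link_sim)  # shape: (4, 200)
```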



Published in

ACM Transactions on Knowledge Discovery from Data, Volume 17, Issue 3 (April 2023), 379 pages
ISSN: 1556-4681
EISSN: 1556-472X
DOI: 10.1145/3583064


Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 23 November 2021
• Revised: 17 March 2022
• Accepted: 20 July 2022
• Online AM: 2 August 2022
• Published: 22 February 2023
