
Dynamic network embedding via multiple sequence learning

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Capturing the dynamic changes of a network can greatly improve the representation ability of its nodes, which has made dynamic network embedding a popular research topic. However, existing works focus on the correlation information and the position information of nodes, while the valuable timestamp information of edges is ignored. The timestamp information of edges reflects the evolution of a dynamic network, which is extremely important for evaluating dynamic node influence. To address these limitations, we propose a novel dynamic network embedding method with multiple sequence learning (DEMS). DEMS uses node sequence learning and edge sequence learning simultaneously to preserve more information about node dynamics in the network embedding. Specifically, node sequence learning preserves the node position information, and edge sequence learning preserves the edge timestamp information. A self-attention mechanism is used in both sequence learnings to preserve the correlation information. Experiments on seven real-world dynamic networks verify the superiority of DEMS over state-of-the-art methods in temporal link prediction tasks.
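The abstract only sketches the architecture at a high level. The snippet below is a minimal, assumption-based illustration of the dual-sequence idea: one self-attention encoder over a node sequence (position information) and one over the corresponding edge timestamp sequence, fused into a single node embedding that could then be scored for temporal link prediction. The module and variable names (SelfAttentionEncoder, DualSequenceEncoder, node_seq, edge_time_seq), the mean pooling, the timestamp projection, and all hyperparameters are illustrative assumptions, not the authors' implementation of DEMS.

# Hedged sketch of a DEMS-style dual-sequence encoder. All design details below
# are assumptions made for illustration; the paper's exact architecture, losses,
# and hyperparameters are not specified in the abstract.
import torch
import torch.nn as nn


class SelfAttentionEncoder(nn.Module):
    """Encode a sequence with single-layer self-attention and mean-pool it."""

    def __init__(self, dim: int, num_heads: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        out, _ = self.attn(x, x, x)      # self-attention preserves correlation info
        out = self.norm(out + x)         # residual connection + layer norm
        return out.mean(dim=1)           # (batch, dim) pooled sequence representation


class DualSequenceEncoder(nn.Module):
    """Fuse a node-sequence view (position info) with an edge-sequence view
    (timestamp info) into one node embedding, as the abstract describes."""

    def __init__(self, num_nodes: int, dim: int = 64):
        super().__init__()
        self.node_emb = nn.Embedding(num_nodes, dim)
        self.time_proj = nn.Linear(1, dim)   # map raw timestamps into the embedding space
        self.node_seq_enc = SelfAttentionEncoder(dim)
        self.edge_seq_enc = SelfAttentionEncoder(dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, node_seq: torch.Tensor, edge_time_seq: torch.Tensor) -> torch.Tensor:
        # node_seq:      (batch, walk_len) node ids from a sampled node sequence
        # edge_time_seq: (batch, walk_len) timestamps of the traversed edges
        h_node = self.node_seq_enc(self.node_emb(node_seq))
        h_edge = self.edge_seq_enc(self.time_proj(edge_time_seq.unsqueeze(-1)))
        return self.fuse(torch.cat([h_node, h_edge], dim=-1))


if __name__ == "__main__":
    model = DualSequenceEncoder(num_nodes=1000, dim=64)
    node_seq = torch.randint(0, 1000, (8, 10))   # 8 sampled walks of length 10
    edge_time_seq = torch.rand(8, 10)            # normalized edge timestamps
    z = model(node_seq, edge_time_seq)
    print(z.shape)  # torch.Size([8, 64])

In practice, such fused embeddings would be trained with a temporal link prediction objective (for example, scoring candidate edges with a dot product between node embeddings), which is consistent with the evaluation task named in the abstract.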


Notes

  1. http://www.networkrepository.com/.



Acknowledgements

This work was supported by the Key Research and Development Program of Jiangsu Province (BE2019012) and the Joint Fund of the National Natural Science Foundation of China and the Civil Aviation Administration of China (U2033202).

Author information


Corresponding author

Correspondence to Weiwei Yuan.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yuan, W., Shi, C. & Guan, D. Dynamic network embedding via multiple sequence learning. Neural Comput & Applic 34, 3843–3855 (2022). https://doi.org/10.1007/s00521-021-06646-8

