
Embedding temporal networks inductively via mining neighborhood and community influences

Abstract

Network embedding aims to generate an embedding for each node in a network, facilitating downstream machine learning tasks such as node classification and link prediction. Most existing work focuses on transductive network embedding, i.e., learning a fixed embedding for each node in a static network, which is ill-suited to real-world networks that evolve over time. This paper proposes ConMNCI, a novel continual temporal network embedding method that inductively mines neighborhood and community influences. We propose an aggregator function that integrates neighborhood influence with community influence to generate node embeddings at any time, and we introduce ideas from continual learning to enhance inductive learning. We conduct extensive experiments on several real-world datasets, comparing ConMNCI with state-of-the-art baselines on tasks including node classification and network visualization. The experimental results show that ConMNCI significantly outperforms these baselines.
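The abstract names the two signals the aggregator combines but not its internals, so the following is a minimal sketch of one plausible realization, assuming a PyTorch model in which attention over a node's recent neighbors yields the neighborhood influence and a learned gate mixes in a community embedding. The class name `InfluenceAggregator`, the gating scheme, and all tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (NOT the authors' implementation) of an aggregator that
# fuses neighborhood influence with community influence into a node
# embedding. Names, shapes, and the gating scheme are illustrative.
import torch
import torch.nn as nn


class InfluenceAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Attention over a node's recent neighbors (neighborhood influence).
        self.attn = nn.MultiheadAttention(embed_dim=dim, num_heads=1, batch_first=True)
        # Learnable gate balancing neighborhood vs. community influence.
        self.gate = nn.Linear(2 * dim, 1)
        self.proj = nn.Linear(dim, dim)

    def forward(self, node_emb, neighbor_embs, community_emb):
        # node_emb:      (batch, dim)    current embedding of the target node
        # neighbor_embs: (batch, k, dim) embeddings of its k most recent neighbors
        # community_emb: (batch, dim)    embedding of the node's community
        query = node_emb.unsqueeze(1)                        # (batch, 1, dim)
        nbr, _ = self.attn(query, neighbor_embs, neighbor_embs)
        nbr = nbr.squeeze(1)                                 # (batch, dim)
        # Gate in (0, 1) decides how much community influence to mix in.
        g = torch.sigmoid(self.gate(torch.cat([nbr, community_emb], dim=-1)))
        return torch.tanh(self.proj(g * nbr + (1 - g) * community_emb))


# Toy usage: batch of 4 nodes, 5 recent neighbors each, 32-dim embeddings.
agg = InfluenceAggregator(dim=32)
out = agg(torch.randn(4, 32), torch.randn(4, 5, 32), torch.randn(4, 32))
print(out.shape)  # torch.Size([4, 32])
```

The continual-learning component is likewise only alluded to. One standard mechanism it could build on is an EWC-style quadratic penalty (elastic weight consolidation, Kirkpatrick et al. 2017) that discourages parameters from drifting away from values learned on earlier time periods; the sketch below is hypothetical, and the paper's exact regularizer may differ.

```python
# Hypothetical EWC-style penalty: penalize movement of parameters that were
# important for earlier time periods, so new interactions do not erase what
# the model already learned (catastrophic forgetting).
def ewc_penalty(model, old_params, fisher, lam=0.1):
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        if name in old_params:
            loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return lam * loss


# Snapshot after training on an earlier period (identity Fisher for brevity).
old = {n: p.detach().clone() for n, p in agg.named_parameters()}
fisher = {n: torch.ones_like(p) for n, p in agg.named_parameters()}
reg = ewc_penalty(agg, old, fisher)  # added to the task loss on new data
```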





Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61972135), the Natural Science Foundation of Heilongjiang Province in China (No. LH2020F043), the Innovation Talents Project of Science and Technology Bureau of Harbin (No. 2017RAQXJ094), and the Foundation of Graduate Innovative Research of Heilongjiang University in China (No. YJSCX2021-076HLJU).

Author information

Corresponding author

Correspondence to Yong Liu.


Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Liu, M., Quan, ZW., Wu, JM. et al. Embedding temporal networks inductively via mining neighborhood and community influences. Appl Intell 52, 16069–16088 (2022). https://doi.org/10.1007/s10489-021-03102-x


