Abstract
Network embedding aims to generate an embedding for each node in a network, facilitating downstream machine learning tasks such as node classification and link prediction. Most existing work focuses on transductive network embedding, i.e., generating fixed node embeddings, which is unsuitable for many real-world applications where networks evolve over time. This paper proposes ConMNCI, a novel continual temporal network embedding method that inductively mines neighborhood and community influences. We propose an aggregator function that integrates neighborhood influence with community influence to generate node embeddings at any time, and introduce ideas from continual learning to enhance inductive learning. We conduct extensive experiments on several real-world datasets and compare ConMNCI with several state-of-the-art baselines on various tasks, including node classification and network visualization. The experimental results show that ConMNCI significantly outperforms the state-of-the-art baselines.
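To make the aggregation idea concrete, the following is a minimal, hypothetical sketch of an aggregator that combines a node's own embedding with its neighborhood influence and a community embedding. The weighting scheme (`alpha`) and the mean-pooling of neighbors are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def aggregate(node_emb, neighbor_embs, community_emb, alpha=0.5):
    """Hypothetical aggregator sketch (not ConMNCI's exact function).

    node_emb:      (d,) embedding of the target node
    neighbor_embs: (k, d) embeddings of its temporal neighbors
    community_emb: (d,) embedding of the node's community
    alpha:         assumed trade-off between the two influence sources
    """
    # Neighborhood influence: mean-pool the neighbor embeddings
    # (handles isolated nodes with a zero vector).
    if len(neighbor_embs) > 0:
        neigh = neighbor_embs.mean(axis=0)
    else:
        neigh = np.zeros_like(node_emb)
    # Combine the node's state with both influence terms.
    return node_emb + alpha * neigh + (1 - alpha) * community_emb
```

In an inductive setting the same function applies to nodes unseen during training, since it depends only on the current neighborhood and community state rather than a fixed per-node lookup table.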
Acknowledgements
This work was supported by the National Natural Science Foundation of China (No. 61972135), the Natural Science Foundation of Heilongjiang Province in China (No. LH2020F043), the Innovation Talents Project of Science and Technology Bureau of Harbin (No. 2017RAQXJ094), and the Foundation of Graduate Innovative Research of Heilongjiang University in China (No. YJSCX2021-076HLJU).
Liu, M., Quan, ZW., Wu, JM. et al. Embedding temporal networks inductively via mining neighborhood and community influences. Appl Intell 52, 16069–16088 (2022). https://doi.org/10.1007/s10489-021-03102-x