
Dynamic Network Embedding by Using Sparse Deep Autoencoder

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13606)

Abstract

Network representation learning, which aims to capture various properties of networks in a low-dimensional feature space, has recently attracted significant attention. Almost all existing static and dynamic network embedding methods that employ deep models adopt dense structures. Deep models can ensure that the embedding performs well on downstream tasks (link prediction, network reconstruction, etc.); however, such works ignore the high complexity of training deep models. In this paper, we propose a method that learns dynamic network embeddings using a sparse deep model. The general idea underpinning our approach is to greatly reduce the number of connections between layers of the deep model. Moreover, the sparse structure of the deep model evolves during training so as to fit the network data. Experimental results on simulated benchmark networks and real-world networks show that, compared with existing network embedding methods that use dense structures, our method greatly reduces the number of trainable weights while minimally affecting, and sometimes even improving, performance on network tasks.
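The core idea in the abstract — embed a network with an autoencoder whose inter-layer connections are sparse and whose sparse structure evolves during training — can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual algorithm: a single-hidden-layer autoencoder on the adjacency matrix, trained with plain gradient descent, with magnitude-based pruning and random regrowth of connections each epoch (in the style of sparse evolutionary training).

```python
# Minimal sketch (assumptions: numpy only; one hidden layer; prune-and-regrow
# structure evolution). The paper's architecture and update rule may differ.
import numpy as np

rng = np.random.default_rng(0)

def random_mask(shape, density):
    """Boolean mask keeping exactly int(density * size) connections."""
    M = np.zeros(shape, dtype=bool)
    k = max(1, int(density * M.size))
    M.flat[rng.choice(M.size, size=k, replace=False)] = True
    return M

def train_sparse_autoencoder(A, dim=8, density=0.1, epochs=100, lr=0.01, zeta=0.3):
    """Embed adjacency matrix A (n x n) into `dim` dimensions with sparse weights."""
    n = A.shape[0]
    W1 = rng.standard_normal((n, dim)) * 0.1   # encoder weights
    W2 = rng.standard_normal((dim, n)) * 0.1   # decoder weights
    M1, M2 = random_mask(W1.shape, density), random_mask(W2.shape, density)
    for _ in range(epochs):
        H = np.tanh(A @ (W1 * M1))             # hidden embedding
        R = H @ (W2 * M2)                      # linear reconstruction
        E = R - A                              # reconstruction error
        gW2 = H.T @ E                          # backprop through decoder
        gW1 = A.T @ ((E @ (W2 * M2).T) * (1 - H**2))
        W1 -= lr * gW1 * M1                    # update only active connections
        W2 -= lr * gW2 * M2
        # Evolve the sparse structure: drop the fraction `zeta` of active
        # weights with smallest magnitude, regrow as many at random positions.
        for W, M in ((W1, M1), (W2, M2)):
            active = np.flatnonzero(M)
            k = int(zeta * active.size)
            if k == 0:
                continue
            drop = active[np.argsort(np.abs(W.flat[active]))[:k]]
            M.flat[drop] = False
            W.flat[drop] = 0.0
            grow = rng.choice(np.flatnonzero(~M), size=k, replace=False)
            M.flat[grow] = True
            W.flat[grow] = rng.standard_normal(k) * 0.01
    return np.tanh(A @ (W1 * M1)), M1, M2

# Toy network: two dense blocks (communities) of five nodes each.
A = np.kron(np.eye(2), np.ones((5, 5)))
emb, M1, M2 = train_sparse_autoencoder(A)
print(emb.shape)   # (10, 8)
print(M1.sum())    # connection count is unchanged by prune-and-regrow
```

The point of the sketch is the last loop: the number of active connections stays constant (each drop is matched by a regrowth), so training cost stays proportional to the sparse budget rather than to the dense weight count. For a dynamic network, one would warm-start this training from the previous snapshot's weights and masks.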



Author information

Corresponding author: Zengyang Shao


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Tang, H., Shao, Z., Zhang, Y., Ma, L., Lin, Q. (2022). Dynamic Network Embedding by Using Sparse Deep Autoencoder. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science, vol 13606. Springer, Cham. https://doi.org/10.1007/978-3-031-20503-3_11


  • DOI: https://doi.org/10.1007/978-3-031-20503-3_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20502-6

  • Online ISBN: 978-3-031-20503-3

  • eBook Packages: Computer Science, Computer Science (R0)
