
Event Sparse Net: Sparse Dynamic Graph Multi-representation Learning with Temporal Attention for Event-Based Data

  • Conference paper
Pattern Recognition and Computer Vision (PRCV 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14433)


Abstract

Graph-structured data is widely used for modeling and representation learning, and dynamic graph neural networks are a popular choice for this task. However, existing approaches to dynamic representation learning suffer either from discrete learning, which loses temporal information, or from continuous learning, which incurs a significant computational burden. To address these issues, we propose an innovative dynamic graph neural network called Event Sparse Net (ESN). By adaptively encoding temporal information into snapshots such that each snapshot contains an identical amount of temporal structure, our approach achieves continuous and precise time encoding while avoiding the information loss of snapshot-based methods. Additionally, we introduce a lightweight module, Global Temporal Attention, for computing node representations from temporal dynamics and structural neighborhoods. By simplifying fully-connected attention fusion, our approach significantly reduces computational cost compared with the current best-performing methods. We evaluate our method on four continuous/discrete graph datasets for link prediction. In comparison with state-of-the-art baseline models, ESN achieves competitive performance with faster inference speed.
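The two ideas sketched in the abstract can be illustrated with a short, hypothetical code example: partitioning an event stream into snapshots that each hold the same number of events (rather than equal time spans), and fusing the resulting snapshot embeddings through a single shared global token instead of full pairwise attention. This is not the authors' implementation; the names split_into_snapshots and GlobalTemporalAttention, and the PyTorch formulation below, are assumptions made purely for illustration.

```python
# Hypothetical sketch (not the paper's code): equal-event snapshot
# partitioning plus a lightweight global-token temporal attention.
import torch
import torch.nn as nn


def split_into_snapshots(events, num_snapshots):
    """Split time-ordered events (u, v, t) into chunks that each contain
    the same number of events, rather than covering equal time spans."""
    events = sorted(events, key=lambda e: e[2])       # sort by timestamp
    chunk = max(1, len(events) // num_snapshots)
    return [events[i:i + chunk] for i in range(0, len(events), chunk)]


class GlobalTemporalAttention(nn.Module):
    """Fuse T snapshot embeddings through one shared global token,
    i.e. O(T) interactions instead of O(T^2) pairwise attention."""

    def __init__(self, dim):
        super().__init__()
        self.global_token = nn.Parameter(torch.randn(1, dim))
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, snapshot_embs):                 # snapshot_embs: (T, dim)
        g = self.global_token                         # (1, dim)
        scores = self.q(g) @ self.k(snapshot_embs).T  # (1, T)
        attn = torch.softmax(scores / snapshot_embs.size(-1) ** 0.5, dim=-1)
        g = attn @ self.v(snapshot_embs)              # fused temporal context, (1, dim)
        return snapshot_embs + g                      # broadcast back to all snapshots


# Usage: 8 snapshot embeddings of dimension 64.
embs = torch.randn(8, 64)
fused = GlobalTemporalAttention(64)(embs)             # -> shape (8, 64)
```

Routing all temporal interaction through one shared token keeps the fusion cost linear in the number of snapshots, which is the kind of simplification of fully-connected attention the abstract describes.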

Supported by the National Natural Science Foundation of China under Grants 62002074 and 62072452, by the Shenzhen Science and Technology Program under Grant JCYJ20200109115627045, and in part by the Regional Joint Fund of Guangdong under Grant 2021B1515120011.



Author information

Correspondence to Teng Huang or Xi Zhang.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Li, D. et al. (2024). Event Sparse Net: Sparse Dynamic Graph Multi-representation Learning with Temporal Attention for Event-Based Data. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14433. Springer, Singapore. https://doi.org/10.1007/978-981-99-8546-3_17


  • DOI: https://doi.org/10.1007/978-981-99-8546-3_17


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8545-6

  • Online ISBN: 978-981-99-8546-3

  • eBook Packages: Computer Science, Computer Science (R0)
