Research article
DOI: 10.1145/3617184.3618055

Dynamic Heterogeneous Link Prediction Based on Hierarchical Attention Model

Published: 28 December 2023

ABSTRACT

In the real world, many networks and graphs are inherently dynamic and heterogeneous: they contain multiple types of nodes and relations, and they evolve over time. However, most conventional link prediction techniques are limited to static or homogeneous networks, and therefore fail to fully exploit a network's temporal evolution as well as its rich semantic and structural characteristics. In this paper, we propose a link prediction method (Att-ConvLSTM) that uses hierarchical attention to learn heterogeneous information and combines recurrent neural networks with temporal attention to capture evolutionary patterns. Comparative experiments against several classes of link prediction algorithms show that the proposed approach achieves superior AUC and Precision.
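
To make the pipeline concrete, the sketch below illustrates the general idea in PyTorch: node-level attention over a node's sampled neighbours within each relation type, semantic-level attention that fuses the per-relation embeddings (the hierarchical attention stage), an LSTM with temporal attention over the resulting per-snapshot embeddings, and a dot-product link scorer. This is a minimal sketch under assumed shapes; the class names, dimensions, and the plain LSTM standing in for the paper's Att-ConvLSTM cell are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): hierarchical attention over a
# dynamic heterogeneous graph, then a temporal RNN with attention, then a
# dot-product link scorer. Class and parameter names are hypothetical.
import torch
import torch.nn as nn

class NodeLevelAttention(nn.Module):
    """Attend over a node's sampled neighbours within one relation type."""
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, h, neigh):                   # h: (N, d), neigh: (N, K, d)
        h_rep = h.unsqueeze(1).expand_as(neigh)
        e = torch.tanh(self.attn(torch.cat([h_rep, neigh], dim=-1)))  # (N, K, 1)
        a = torch.softmax(e, dim=1)                # attention over the K neighbours
        return (a * neigh).sum(dim=1)              # (N, d)

class SemanticLevelAttention(nn.Module):
    """Fuse per-relation embeddings with learned semantic weights."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                  nn.Linear(hidden, 1, bias=False))

    def forward(self, z):                          # z: (R, N, d), R relation types
        w = self.proj(z).mean(dim=1)               # (R, 1) importance per relation
        beta = torch.softmax(w, dim=0).unsqueeze(1)
        return (beta * z).sum(dim=0)               # (N, d)

class TemporalAttentionRNN(nn.Module):
    """LSTM over snapshot embeddings plus attention over time steps
    (a simplified stand-in for the Att-ConvLSTM component)."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.LSTM(dim, dim, batch_first=True)
        self.attn = nn.Linear(dim, 1)

    def forward(self, seq):                        # seq: (N, T, d)
        out, _ = self.rnn(seq)                     # (N, T, d)
        a = torch.softmax(self.attn(out), dim=1)   # attention over the T snapshots
        return (a * out).sum(dim=1)                # (N, d)

def link_score(z_u, z_v):
    """Probability that an edge (u, v) appears in the next snapshot."""
    return torch.sigmoid((z_u * z_v).sum(dim=-1))

# Toy usage: 100 nodes, 2 relation types, 5 snapshots, 8 sampled neighbours.
N, R, T, K, d = 100, 2, 5, 8, 16
node_att, sem_att, temp = NodeLevelAttention(d), SemanticLevelAttention(d), TemporalAttentionRNN(d)
snapshots = []
for _ in range(T):
    h = torch.randn(N, d)                                   # random stand-in features
    per_rel = torch.stack([node_att(h, torch.randn(N, K, d)) for _ in range(R)])
    snapshots.append(sem_att(per_rel))                       # (N, d) per snapshot
z = temp(torch.stack(snapshots, dim=1))                      # (N, d) final embeddings
print(link_score(z[0], z[1]))                                # score for candidate link (0, 1)
```

In a real setting the snapshot features and neighbour sets would come from the heterogeneous graph at each time step, and the whole stack would be trained end to end with a binary cross-entropy loss on observed versus sampled negative links.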

Published in

ICCSIE '23: Proceedings of the 8th International Conference on Cyber Security and Information Engineering
September 2023, 370 pages
ISBN: 9798400708800
DOI: 10.1145/3617184

Copyright © 2023 ACM

Publisher

Association for Computing Machinery, New York, NY, United States
