
Enhancing missing facts inference in knowledge graph using triplet subgraph attention embeddings

Published in: Applied Intelligence

Abstract

Because of the heterogeneous structure of a knowledge graph (KG), many relationships between entities are missing. Optimal use of a KG therefore requires inferring the missing fact triplets (entity-relation-entity). Fact inference predicts a missing relationship with an embedding approach in a supervised learning setup, representing entities and relationships in a low-dimensional vector space. Recent work uses attention-aware embeddings, but when applied directly to an entire KG, attention mechanisms can be computationally expensive, especially for large graphs. Attention-based KG embedding models also rely on negative sampling, which can cause a vanishing-gradient problem during learning. This paper proposes a novel triplet subgraph attention embedding (TSAE) model that combines a simplified graph attention mechanism with a neural network to learn embeddings without requiring negative sampling. The attention layer processes the entities of a triplet-level subgraph, learning the central entity's features by aggregating those of its neighbors. A neural network then passes the attention-aware triplet entity features through hidden layers to compute the likelihood of each relationship type between the triplet entities. TSAE generates fine-grained entity embeddings with the simplified attention mechanism, reduces computational complexity, and offers interpretable embeddings. Experimental results on benchmark data sets show that TSAE outperforms the baselines, and a case study demonstrates the model's efficacy for the KG completion task.
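To illustrate the mechanism the abstract describes, the following is a minimal NumPy sketch, not the authors' implementation: the attention scoring function, array dimensions, and network sizes are all assumptions. An attention layer aggregates neighbor features into each triplet entity, and a small feed-forward network maps the concatenated attention-aware features to a distribution over relation types.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(center, neighbors, a):
    # Single-head additive attention (hypothetical scoring): score each
    # neighbor against the center entity, then aggregate by softmax weight.
    scores = np.array([a @ np.concatenate([center, n]) for n in neighbors])
    w = softmax(scores)
    return center + (w[:, None] * neighbors).sum(axis=0)

d, n_rel = 8, 4                        # embedding size, relation types (toy values)
E = rng.normal(size=(10, d))           # toy entity embedding table
a = rng.normal(size=2 * d)             # attention parameter vector
W1 = rng.normal(size=(2 * d, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, n_rel));  b2 = np.zeros(n_rel)

def relation_probs(h, t, h_nbrs, t_nbrs):
    # Attention-aware features for the head and tail of the triplet subgraph.
    h_att = attend(E[h], E[h_nbrs], a)
    t_att = attend(E[t], E[t_nbrs], a)
    x = np.concatenate([h_att, t_att])
    hid = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    return softmax(hid @ W2 + b2)        # likelihood of each relation type

p = relation_probs(0, 1, [2, 3], [4, 5])
```

In this sketch, training would fit `a`, `W1/b1`, and `W2/b2` with a standard classification loss over observed triplets, which is how a model of this form can avoid negative sampling.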


[Figures 1–7 and Algorithms 1–2 appear in the full article; images not available in this preview.]


Code Availability

The code and data used in this study are available from the corresponding author on reasonable request.


Acknowledgements

The authors thank the Department of Computer Science and Engineering, COEP Technological University, for supporting us in using the GPU server facility purchased under TEQIP-III (a World Bank initiative).

Funding

No funding was received to carry out this study.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization: Anish Khobragade, Shashikant Ghumbre; Methodology: Anish Khobragade; Validation, formal analysis, and investigation: Anish Khobragade, Shashikant Ghumbre; Resources: Vinod Pachghare; Article original draft preparation: Anish Khobragade; Article review and editing: Anish Khobragade, Vinod Pachghare; Supervision: Shashikant Ghumbre, Vinod Pachghare. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Anish Khobragade.

Ethics declarations

Ethical and informed consent for data used

The datasets used in this experiment are made publicly available by the respective organizations and authors to further knowledge graph research; informed consent is therefore not required for their use. References and citations for the relevant datasets are included in the manuscript.

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Khobragade, A., Ghumbre, S. & Pachghare, V. Enhancing missing facts inference in knowledge graph using triplet subgraph attention embeddings. Appl Intell 54, 1497–1510 (2024). https://doi.org/10.1007/s10489-023-05254-4
