
SEPAKE: a structure-enhanced and position-aware knowledge embedding framework for knowledge graph completion


Abstract

Knowledge graphs (KGs) provide structured knowledge that supports a wide range of downstream applications. Because KGs contain a large amount of incomplete knowledge, knowledge graph completion (KGC) has been proposed to reason over known facts and infer missing links. Previous graph embedding approaches learn graph structure (i.e., triple structure and neighborhood structure) but cannot handle unseen entities; textual encoding approaches address this by exploiting the textual knowledge of graph elements (i.e., entities and relations). However, previous textual encoding approaches resort only to triples and thus cannot exploit the knowledge of neighbors, which provides abundant evidence to facilitate prediction. Moreover, they are insensitive to changes in the positions of elements in triples when performing text modeling, and therefore cannot effectively distinguish triples that share the same elements but have completely different semantics, which is detrimental to the final result. To address these challenges, we propose a novel Structure-Enhanced and Position-Aware Knowledge Embedding (SEPAKE) framework. Specifically, masked element reconstruction is devised to predict missing elements by reasoning over the contexts of subgraphs; in this way, we incorporate graph structure while retaining the ability to encode textual information. Meanwhile, position-aware learning is conducted to capture the semantic knowledge implied by the relative positions of elements in the textualized input. In addition, we employ task-specific adapters to store knowledge in a unified way, facilitating the storage and transfer of knowledge. Extensive experiments demonstrate the effectiveness of our framework: we achieve state-of-the-art performance on standard datasets compared with textual encoding approaches. Moreover, our framework can efficiently improve previous approaches through optionally pluggable adapters, further verifying the advancement and applicability of our work.
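
The abstract mentions two mechanisms that can be made concrete with a short sketch: textualizing a triple together with its neighborhood and masking one element for reconstruction, and a small task-specific adapter module inserted alongside a pre-trained encoder. The code below is a minimal, hypothetical illustration of these general ideas, not the authors' released implementation; the function and class names (textualize_masked_triple, BottleneckAdapter), the [MASK]/[SEP]/[CTX] serialization convention, and the PyTorch setting are all assumptions made for illustration.

```python
# Illustrative sketch only: masked-element textualization of a triple with
# neighbor context, and a bottleneck adapter of the kind commonly used to
# store task-specific knowledge next to a frozen pre-trained encoder.
import torch
import torch.nn as nn


def textualize_masked_triple(head, relation, tail, neighbors, mask="tail"):
    """Serialize a triple and its neighborhood into text, masking one element.

    `neighbors` is a list of (relation, entity) pairs around the head entity;
    the masked position is what the encoder is asked to reconstruct.
    """
    head_txt = "[MASK]" if mask == "head" else head
    tail_txt = "[MASK]" if mask == "tail" else tail
    context = " ; ".join(f"{r} {e}" for r, e in neighbors)
    # The position of each element in the sequence carries its role
    # (head / relation / tail), which position-aware learning can exploit.
    return f"{head_txt} [SEP] {relation} [SEP] {tail_txt} [CTX] {context}"


class BottleneckAdapter(nn.Module):
    """Small adapter: down-project, non-linearity, up-project, residual add."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection preserves the backbone's representation while
        # the adapter stores task-specific knowledge in very few parameters.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


if __name__ == "__main__":
    text = textualize_masked_triple(
        "Barack Obama", "born in", "Honolulu",
        neighbors=[("president of", "United States"),
                   ("spouse", "Michelle Obama")],
        mask="tail",
    )
    print(text)

    adapter = BottleneckAdapter(hidden_dim=768)
    states = torch.randn(2, 16, 768)   # (batch, sequence length, hidden size)
    print(adapter(states).shape)       # torch.Size([2, 16, 768])
```

Because the adapter is a self-contained residual module, it can in principle be attached to, or detached from, an existing encoder without retraining the backbone, which is the sense in which adapters are "optionally pluggable".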

Data Availability

The datasets generated and/or analysed during the current study are available in the LibKGE repository, https://github.com/uma-pi1/kge.

Acknowledgements

This work is jointly supported by the National Natural Science Foundation of China (grants 61877043 and 61877044).

Author information

Corresponding author

Correspondence to Xuewei Li.

Ethics declarations

Conflicts of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Mei Yu, Tingxu Jiang, Jian Yu, Mankun Zhao, Jiujiang Gou, Ming Yang and Ruiguo Yu contributed equally to this work.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Yu, M., Jiang, T., Yu, J. et al. SEPAKE: a structure-enhanced and position-aware knowledge embedding framework for knowledge graph completion. Appl Intell 53, 23113–23123 (2023). https://doi.org/10.1007/s10489-023-04723-0
