
Joint semantic embedding with structural knowledge and entity description for knowledge representation learning

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Previous work on knowledge graph representation learning mainly exploits the structural information of fact triples, which leads to poor link-prediction performance, especially for new or few-fact entities. It is therefore natural to introduce textual information to supply the missing semantics. However, existing methods align text and structure only at the level of words or score functions, and do not jointly model textual and structural information. Moreover, since entity descriptions are often redundant, extracting the relevant information while suppressing the irrelevant content in the text is a challenging task. To tackle these problems, this paper proposes JointSE, a novel knowledge representation learning framework for joint semantic embedding of structural knowledge and entity descriptions. First, we design a mutual attention mechanism that filters the informative parts of fact triples and entity descriptions with respect to a specific relation. Second, we project the triples into the textual semantic space via dot products, connecting each triple with its relevant description. In addition, we enhance both the triple-based and the text-based entity representations with a graph neural network to capture more useful graph structure information. Finally, extensive experiments on benchmark datasets and a Chinese legal-provisions dataset demonstrate that JointSE effectively fuses triple information, textual semantics, and graph structure, and that it outperforms previous methods on both entity prediction and relation prediction tasks.
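As a rough illustration of the mechanisms the abstract describes, the sketch below combines relation-guided attention over description tokens (standing in for the mutual attention component), a dot-product connection between the triple and the text space, and a TransE-style triple score on a fused entity representation. All function names, the uniform fusion gate, and the random vectors are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relation_guided_attention(relation, desc_tokens):
    """Attend over description token embeddings with dot-product scores,
    so tokens relevant to the given relation get higher weights
    (a simplified stand-in for the paper's mutual attention)."""
    scores = desc_tokens @ relation            # (n_tokens,)
    weights = softmax(scores)                  # sums to 1
    return weights @ desc_tokens               # text-based entity vector

def transe_score(h, r, t):
    """TransE-style plausibility: higher (closer to 0) is more plausible."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(0)
d = 8
rel = rng.normal(size=d)                       # relation embedding
desc = rng.normal(size=(5, d))                 # 5 token embeddings of a description
h_text = relation_guided_attention(rel, desc)  # text-based head representation
h_struct = rng.normal(size=d)                  # triple-based head representation
t = rng.normal(size=d)                         # tail entity embedding

# Fuse structural and textual views; a uniform gate is an assumption here.
h_joint = 0.5 * h_struct + 0.5 * h_text
print(transe_score(h_joint, rel, t))
```

In the actual framework the two views would be trained jointly and further enhanced by a graph neural network over the knowledge graph neighborhood; this sketch only shows how attention and dot-product projection can tie a triple to its description.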



Notes

  1. https://huggingface.co/bert-base-chinese

  2. https://huggingface.co/bert-base-uncased

  3. https://www.dgl.ai/

  4. https://github.com/zxlzr/LegalPP

  5. https://www.gdzwfw.gov.cn/

  6. https://github.com/TimDettmers/ConvE

  7. https://wordnet.princeton.edu/


Acknowledgements

This work is supported by the National Key Research and Development Program of China (No. 2018YFB0704400).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Hao Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Wei, X., Zhang, Y. & Wang, H. Joint semantic embedding with structural knowledge and entity description for knowledge representation learning. Neural Comput & Applic 35, 3883–3902 (2023). https://doi.org/10.1007/s00521-022-07923-w

