
A Contextualized Entity Representation for Knowledge Graph Completion

  • Conference paper
  • Knowledge Science, Engineering and Management (KSEM 2020)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12274)

Abstract

Knowledge graphs (KGs) have achieved great success in many AI-related applications over the past decade. Although KGs contain billions of facts, they are usually incomplete, which gives rise to the task of link prediction, whose purpose is to infer missing links between entities. Knowledge graph embedding has proved highly effective in tasks such as knowledge reasoning, completing missing links, and semantic search. However, many existing embedding models learn static embeddings of entities, which poses several problems, most notably that all senses of a polysemous entity must share the same representation. In this paper, we propose a novel embedding method, KG embedding with a contextualized entity representation (KGCR for short), to learn contextual representations of entities for link prediction. KGCR encodes the contextual representation of an entity by considering the forward and backward contexts of relations, which helps capture the different senses of an entity when it appears at different positions of a relation or in different relations. Our approach is able to model three major relational patterns: symmetry, antisymmetry, and inversion. Experimental results demonstrate that KGCR captures the contextual semantics of entities in knowledge graphs and outperforms existing state-of-the-art (SOTA) baselines on benchmark datasets for the link prediction task.
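The full KGCR model is behind the paywall, so as an illustration only, the core idea the abstract describes can be sketched as follows: an entity receives a different representation depending on the relation it appears in and on whether it is the head (forward context) or the tail (backward context) of that relation. All names, dimensions, and the dot-product scoring choice below are assumptions for the sketch, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy vocabulary: one static base vector per entity, plus a forward and a
# backward context matrix per relation (the position-dependent "context").
entities = {e: rng.normal(size=DIM)
            for e in ["apple", "tim_cook", "vitamin_c"]}
relations = {r: {"fwd": rng.normal(size=(DIM, DIM)),   # entity appears as head
                 "bwd": rng.normal(size=(DIM, DIM))}   # entity appears as tail
             for r in ["ceo_of", "contains_nutrient"]}

def contextualize(entity: str, relation: str, position: str) -> np.ndarray:
    """Entity representation conditioned on the relation and its position."""
    ctx = relations[relation]["fwd" if position == "head" else "bwd"]
    return ctx @ entities[entity]

def score(head: str, relation: str, tail: str) -> float:
    """Plausibility of a triple; higher is more plausible (dot-product scoring)."""
    h = contextualize(head, relation, "head")
    t = contextualize(tail, relation, "tail")
    return float(h @ t)

# The same entity ("apple" the company vs. the fruit) is no longer forced to
# share a single vector: its representation differs across relations/positions.
v_company = contextualize("apple", "ceo_of", "tail")
v_fruit = contextualize("apple", "contains_nutrient", "head")
print(np.allclose(v_company, v_fruit))  # distinct contextual representations
print(score("tim_cook", "ceo_of", "apple"))
```

With static embeddings both uses of "apple" would collapse into one vector; conditioning on the relation's forward/backward context is what lets the two senses diverge, which is the property the abstract argues static models lack.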

Supported by the Key R&D Program Project of Zhejiang Province under Grant no. 2019C01004 and Zhejiang Education Department Project under Grant no. Y201839942.


Author information

Correspondence to Bailin Yang.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Pu, F., Yang, B., Ying, J., You, L., Xu, C. (2020). A Contextualized Entity Representation for Knowledge Graph Completion. In: Li, G., Shen, H., Yuan, Y., Wang, X., Liu, H., Zhao, X. (eds) Knowledge Science, Engineering and Management. KSEM 2020. Lecture Notes in Computer Science, vol 12274. Springer, Cham. https://doi.org/10.1007/978-3-030-55130-8_7


  • DOI: https://doi.org/10.1007/978-3-030-55130-8_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-55129-2

  • Online ISBN: 978-3-030-55130-8

  • eBook Packages: Computer Science, Computer Science (R0)
