Abstract
Knowledge embedding on knowledge graphs (KGs) has recently drawn increasing attention from both academia and industry for its concise rationale and promising prospects. However, most existing knowledge embedding methods either perform far from satisfactorily or generalize poorly. In this work, we propose a context-aware knowledge embedding model (CAKE) for applications such as knowledge completion and link prediction. We model the generative process of KG formation with latent Dirichlet allocation and a hierarchical Dirichlet process, learning the latent semantic structure of knowledge elements as contexts. Contextual information, i.e., the context-specific probability distribution over elements, is then leveraged in a translation-based embedding model. In essence, we formulate the loss function in a probabilistic style that approximately realizes an “attention” mechanism. In experiments, the learned entity and relation embeddings are applied to link prediction and triple classification, where our model shows the best performance against multiple baselines.
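The abstract describes leveraging a context-specific probability distribution inside a translation-based scoring function. The paper's exact formulation is not reproduced on this page, so the following is only an illustrative sketch, under the assumption that contexts act as hyperplanes (in the spirit of TransH/HyTE) and that the context distribution softly weights per-context translation scores, which is one natural reading of the "attention" described above. All names here (`project`, `context_mixture_score`, `normals`, `ctx_probs`) are hypothetical.

```python
import numpy as np

def project(x, w):
    """Project vector x onto the hyperplane with unit normal w."""
    return x - (w @ x) * w

def context_mixture_score(h, r, t, normals, ctx_probs):
    """Plausibility of triple (h, r, t) as a probability-weighted mixture
    over latent contexts: the context distribution acts as a soft
    attention over translation errors measured in each context-specific
    subspace (lower score = more plausible)."""
    score = 0.0
    for w, p in zip(normals, ctx_probs):
        d = project(h, w) + r - project(t, w)
        score += p * np.linalg.norm(d)
    return score
```

In such a formulation, contexts with negligible probability contribute almost nothing to the score, so training effort concentrates on the contexts a triple plausibly belongs to.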
Acknowledgement
This work was supported by the National Key R&D Program of China [2018YFB1004700]; the National Natural Science Foundation of China [61872238, 61972254]; and the State Key Laboratory of Air Traffic Management System and Technology [SKLATM20180X].
A Appendix: Optimization of CAKE
For entity \(\epsilon \), the gradient of its embedding is
For relation \(\pi \), the gradient of its embedding is
For context \(\delta \), the gradient of its normal vector is
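The display equations for these three gradients did not survive extraction, and the paper's actual derivation cannot be reconstructed from this page. As a minimal illustrative stand-in only, the sketch below assumes a squared translation error measured after projecting entities onto a context hyperplane (a TransH/HyTE-style score), and gives analytic gradients with respect to the entity, relation, and context-normal vectors. These are not CAKE's published formulas; they merely show the shape such gradients take.

```python
import numpy as np

def score(h, r, t, w):
    """Assumed score: f = || h_perp + r - t_perp ||^2, where x_perp
    projects x onto the context hyperplane with unit normal w."""
    u = h - t
    d = u + r - (w @ u) * w
    return d @ d

def gradients(h, r, t, w):
    """Analytic gradients of the assumed score w.r.t. each vector.
    With u = h - t, s = w.u, and residual d = u + r - s*w:
      df/dr = 2d
      df/dh = 2(d - (w.d) w)      (projection Jacobian I - w w^T)
      df/dt = -df/dh
      df/dw = -2((w.d) u + s d)
    """
    u = h - t
    s = w @ u
    d = u + r - s * w
    g_r = 2.0 * d
    g_h = 2.0 * (d - (w @ d) * w)
    g_t = -g_h
    g_w = -2.0 * ((w @ d) * u + s * d)
    return g_h, g_r, g_t, g_w
```

A plain SGD step would then update each vector against its gradient (e.g. `h -= lr * g_h`) and renormalize `w` to unit length after each update, as is standard for hyperplane-based models.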
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Chen, J., Ke, H., Mo, H., Gao, X., Chen, G. (2022). CAKE: A Context-Aware Knowledge Embedding Model of Knowledge Graph. In: Strauss, C., Cuzzocrea, A., Kotsis, G., Tjoa, A.M., Khalil, I. (eds) Database and Expert Systems Applications. DEXA 2022. Lecture Notes in Computer Science, vol 13426. Springer, Cham. https://doi.org/10.1007/978-3-031-12423-5_4
Print ISBN: 978-3-031-12422-8
Online ISBN: 978-3-031-12423-5