Abstract
Schematic knowledge, a critical ingredient of knowledge graphs (KGs), defines logical axioms over concepts and supports eliminating heterogeneity, integration, and reasoning over KGs. Although some well-known KGs contain large-scale schematic knowledge, they are far from complete, especially with respect to axioms stating that one concept is a subclass of another (subclassOf axioms) and axioms stating that two concepts are logically disjoint (disjointWith axioms). One of the most important characteristics of these axioms is their logical properties, such as transitivity and symmetry. Current KG embedding models focus on encoding factual knowledge (i.e., triples) and cannot be directly applied to completing schematic knowledge (i.e., axioms), mainly because they ignore these logical properties. To address this issue, we propose a novel model named CosE for schematic knowledge. More precisely, CosE projects each concept into two semantic spaces: an angle-based semantic space used to preserve the transitivity or symmetry of an axiom, and a translation-based semantic space used to measure the confidence score of an axiom. Moreover, two score functions tailored to subclassOf and disjointWith are designed to sufficiently learn the representations of concepts connected by these two relations. We conduct extensive link prediction experiments on benchmark datasets built from the YAGO and FMA ontologies. The results indicate that CosE outperforms state-of-the-art methods and successfully preserves the transitivity and symmetry of axioms.
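To make the two-space design more concrete, the following is a minimal sketch in Python/NumPy. It assumes a simple cosine term plus a TransE-style translation term; the actual CosE score functions, dimensions, and training objective are defined in the paper, and every name in the snippet (angle_emb, trans_emb, score_subclass_of, score_disjoint_with) is hypothetical.

```python
import numpy as np

# Hypothetical sketch of the two-space idea from the abstract. The formulas
# below (cosine term + TransE-style translation term) are illustrative
# assumptions, not the authors' published score functions.

dim = 50
rng = np.random.default_rng(0)
concepts = ["Dog", "Mammal", "Plant"]

# Each concept gets one vector per semantic space; each relation gets a
# translation vector in the translation-based space.
angle_emb = {c: rng.normal(size=dim) for c in concepts}
trans_emb = {c: rng.normal(size=dim) for c in concepts}
rel_emb = {r: rng.normal(size=dim) for r in ["subclassOf", "disjointWith"]}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def score_subclass_of(sub, sup):
    # Angle term: rewards a small angle between the two concept vectors, so a
    # chain of subclassOf links with small angles keeps its endpoints close
    # (the transitivity-friendly part of this sketch).
    angle_term = 1.0 - cosine(angle_emb[sub], angle_emb[sup])
    # Translation term: TransE-style distance, used here as the confidence signal.
    trans_term = np.linalg.norm(trans_emb[sub] + rel_emb["subclassOf"] - trans_emb[sup])
    return angle_term + trans_term  # lower score = more plausible axiom

def score_disjoint_with(c1, c2):
    # Both terms are symmetric in c1 and c2, so (c1 disjointWith c2) scores the
    # same as (c2 disjointWith c1), mirroring the symmetry of the relation.
    angle_term = abs(cosine(angle_emb[c1], angle_emb[c2]))
    trans_term = np.linalg.norm(np.abs(trans_emb[c1] - trans_emb[c2]) - rel_emb["disjointWith"])
    return angle_term + trans_term

print(score_subclass_of("Dog", "Mammal"))   # score for (Dog subclassOf Mammal)
print(score_disjoint_with("Dog", "Plant"))  # score for (Dog disjointWith Plant)
```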
Acknowledgements
This work was partially supported by the National Key Research and Development Program of China under grants 2017YFB1002801 and 2018YFC0830200, the Natural Science Foundation of China under grant U1736204, and the Fundamental Research Funds for the Central Universities under grant 3209009601.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Gao, H., Zheng, X., Li, W., Qi, G., Wang, M. (2019). Cosine-Based Embedding for Completing Schematic Knowledge. In: Tang, J., Kan, M.-Y., Zhao, D., Li, S., Zan, H. (eds) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science, vol 11838. Springer, Cham. https://doi.org/10.1007/978-3-030-32233-5_20
DOI: https://doi.org/10.1007/978-3-030-32233-5_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-32232-8
Online ISBN: 978-3-030-32233-5
eBook Packages: Computer Science, Computer Science (R0)