ABSTRACT
Knowledge graph (KG) embedding seeks to learn vector representations for entities and relations. Conventional models reason over graph structures, but they suffer from graph incompleteness and long-tail entities. Recent studies have used pre-trained language models to learn embeddings from the textual information of entities and relations, but they cannot take advantage of graph structures. In this paper, we show empirically that these two kinds of features are complementary for KG embedding. To this end, we propose CoLE, a Co-distillation Learning method for KG Embedding that exploits the complementarity of graph structures and text information. Its graph embedding model employs a Transformer to reconstruct the representation of an entity from its neighborhood subgraph. Its text embedding model uses a pre-trained language model to generate entity representations from soft prompts built from entity names, descriptions and relational neighbors. To let the two models promote each other, we propose co-distillation learning, which allows them to distill selective knowledge from each other's prediction logits. In this scheme, each model serves as both a teacher and a student. Experiments on benchmark datasets demonstrate that the two models outperform their related baselines, and the ensemble method CoLE with co-distillation learning advances the state of the art of KG embedding.
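The co-distillation idea sketched above can be illustrated with a minimal mutual-distillation loss: each model's training objective mixes a supervised cross-entropy term with a KL term that pulls its softened logits toward the other model's. The sketch below uses hypothetical function names and hyperparameters (temperature `t`, mixing weight `alpha`); the paper's method additionally selects *which* knowledge to distill from each teacher, which this plain mutual-distillation sketch omits.

```python
import math

def softmax(logits, t=1.0):
    # Temperature-scaled softmax over a list of entity scores.
    m = max(z / t for z in logits)
    exps = [math.exp(z / t - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q):
    # KL(p || q) between two discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def co_distillation_losses(graph_logits, text_logits, gold, t=2.0, alpha=0.5):
    """Each model is both teacher and student: its loss mixes supervised
    cross-entropy on the gold entity with a KL term pulling its softened
    distribution toward the other model's (treated as a fixed teacher)."""
    ce_graph = -math.log(softmax(graph_logits)[gold])
    ce_text = -math.log(softmax(text_logits)[gold])
    p_graph = softmax(graph_logits, t)
    p_text = softmax(text_logits, t)
    # t*t rescales gradients of the softened KL term, as is standard in distillation.
    loss_graph = (1 - alpha) * ce_graph + alpha * t * t * kl(p_text, p_graph)
    loss_text = (1 - alpha) * ce_text + alpha * t * t * kl(p_graph, p_text)
    return loss_graph, loss_text
```

When the two models already agree, the KL terms vanish and each loss reduces to its (weighted) cross-entropy; disagreement adds a penalty that moves each student toward its teacher's distribution.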
I Know What You Do Not Know: Knowledge Graph Embedding via Co-distillation Learning