Abstract
At present, most methods for the knowledge graph completion (KGC) task rely heavily on external knowledge bases or on graph representation learning; completing the task without any external prior knowledge remains a significant challenge. To this end, we propose a novel framework that, drawing on the ideas of KG-BERT and prompt learning, converts the plausibility evaluation of a knowledge triple into a question answering (QA) task. We also test the effect of different question types on the results. We then fine-tune two pre-trained language models, BERT-wwm-ext and ERNIE-Gram, on the generated sequences so that they can perform the QA task. Our method won 5th place in the rematch stage of CCKS 2022 Track 1, demonstrating its effectiveness.
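To make the triple-to-QA conversion concrete, the following is a minimal sketch in Python with Hugging Face transformers. The yes/no question template, the example triple, the checkpoint identifier hfl/chinese-bert-wwm-ext, and the two-label classification head are illustrative assumptions; the abstract only states that triples are rewritten as questions and that BERT-wwm-ext and ERNIE-Gram are fine-tuned to answer them.

```python
# Sketch of converting a knowledge triple into a yes/no question and
# scoring its plausibility with a BERT-style classifier. Template and
# model details are assumptions, not the paper's exact setup.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

def triple_to_question(head: str, relation: str, tail: str) -> str:
    # Hypothetical template ("Is the <relation> of <head> <tail>?");
    # the paper experiments with several question types.
    return f"{head}的{relation}是{tail}吗？"

# Assumed BERT-wwm-ext checkpoint on the Hugging Face hub.
tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = AutoModelForSequenceClassification.from_pretrained(
    "hfl/chinese-bert-wwm-ext", num_labels=2  # plausible vs. implausible
)

question = triple_to_question("北京", "所属国家", "中国")  # (Beijing, country, China)
inputs = tokenizer(question, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# After fine-tuning on labeled triples, the argmax over the two logits
# gives the plausibility label for the triple.
print(logits.softmax(dim=-1))
```

Under this reading, fine-tuning reduces to standard sequence classification with a cross-entropy loss over plausible/implausible labels, and varying the template corresponds to the "different question types" compared in the paper.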
References
Wang, X., et al.: KEPLER: a unified model for knowledge embedding and pre-trained language representation. Trans. Assoc. Comput. Linguist. 9, 176–194 (2021)
Zhang, S., et al.: Quaternion knowledge graph embeddings. Adv. Neural Inform. Process. Syst. 32 (2019)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT (2019)
Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692 (2019)
Sun, Y., et al.: ERNIE: enhanced representation through knowledge integration. arXiv preprint arXiv:1904.09223 (2019)
Yao, L., Mao, C., Luo, Y.: KG-BERT: BERT for knowledge graph completion. arXiv preprint arXiv:1909.03193 (2019)
Bordes, A., et al.: Translating embeddings for modeling multi-relational data. Adv. Neural Inform. Process. Syst. 26 (2013)
Wang, Z., et al.: Knowledge graph embedding by translating on hyperplanes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 28, no. 1 (2014)
Sun, Z., et al.: RotatE: knowledge graph embedding by relational rotation in complex space. In: International Conference on Learning Representations (2018)
Wang, B., et al.: Structure-augmented text representation learning for efficient knowledge graph completion. In: Proceedings of the Web Conference 2021 (2021)
Reimers, N., Gurevych, I.: Sentence-BERT: sentence embeddings using siamese BERT-networks. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP) (2019)
Lv, X., et al.: Do pre-trained models benefit knowledge graph completion? A reliable evaluation and a reasonable approach. In: Findings of the Association for Computational Linguistics: ACL 2022 (2022)
Cui, Y., et al.: Pre-training with whole word masking for Chinese BERT. IEEE/ACM Trans. Audio Speech Lang. Process. 29, 3504–3514 (2021)
Xiao, D., et al.: ERNIE-Gram: pre-training with explicitly n-gram masked language modeling for natural language understanding. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: ICLR (2015)