Abstract
Previous knowledge graph embedding approaches usually map entities to representations and utilize score functions to predict the target entities, yet they typically struggle to reason about rare or emerging unseen entities. In this paper, we propose kNN-KGE, a new knowledge graph embedding approach with pre-trained language models that linearly interpolates the entity distribution with its k-nearest neighbors. We compute the nearest neighbors based on the distance in the entity embedding space from the knowledge store. Our approach allows rare or emerging entities to be memorized explicitly rather than implicitly in model parameters. Experimental results demonstrate that our approach improves inductive and transductive link prediction results and yields better performance in low-resource settings with only a few triples, which may be easier to reason over via explicit memory (code is available at https://github.com/zjunlp/KNN-KG).
P. Wang and X. Xie contributed equally.
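To make the interpolation concrete, the following is a minimal sketch of the kNN interpolation idea described in the abstract: a non-parametric distribution over entities is built from the k nearest entries in a knowledge store of entity embeddings and mixed with the language model's own entity distribution. The function name, argument layout, and the L2 distance / softmax-over-negative-distance weighting are assumptions for illustration, not the authors' actual implementation (see the linked repository for that).

```python
# Hypothetical sketch of kNN entity-distribution interpolation (assumed
# interface; not the released kNN-KGE code).
import torch
import torch.nn.functional as F

def knn_interpolate(p_mlm: torch.Tensor,
                    query_emb: torch.Tensor,
                    store_embs: torch.Tensor,
                    store_entity_ids: torch.Tensor,
                    k: int = 64,
                    lam: float = 0.5,
                    temperature: float = 1.0) -> torch.Tensor:
    """Interpolate a model's entity distribution with a kNN distribution.

    p_mlm:            [num_entities] entity distribution from the language model.
    query_emb:        [dim] embedding of the masked entity position.
    store_embs:       [store_size, dim] entity embeddings in the knowledge store.
    store_entity_ids: [store_size] (long) entity id for each stored embedding.
    """
    # Retrieve the k nearest stored embeddings by L2 distance.
    dists = torch.cdist(query_emb.unsqueeze(0), store_embs).squeeze(0)  # [store_size]
    knn_dists, knn_idx = torch.topk(dists, k, largest=False)

    # Turn negative distances into a distribution over the retrieved
    # neighbors, then accumulate it onto the full entity vocabulary.
    knn_weights = F.softmax(-knn_dists / temperature, dim=-1)
    p_knn = torch.zeros_like(p_mlm)
    p_knn.scatter_add_(0, store_entity_ids[knn_idx], knn_weights)

    # Linear interpolation of the parametric and non-parametric distributions.
    return lam * p_knn + (1.0 - lam) * p_mlm
```

In such a setup, p_mlm would come from the masked-entity prediction of the fine-tuned language model, the store would hold embeddings for every entity (including rare or newly added ones, which is what enables inductive prediction without retraining), and lam would be tuned on validation data.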
Notes

1. Or head entity prediction, denoted by \((?, r, e_i)\).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Wang, P., Xie, X., Wang, X., Zhang, N. (2023). Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings. In: Liu, F., Duan, N., Xu, Q., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2023. Lecture Notes in Computer Science(), vol 14302. Springer, Cham. https://doi.org/10.1007/978-3-031-44693-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-44692-4
Online ISBN: 978-3-031-44693-1