Abstract
Graph Neural Networks (GNNs) are inherently suited for modeling graph-structured data and have been extensively utilized in Knowledge Graph Embedding (KGE). Current GNN-based KGE models primarily focus on message aggregation among entities, often neglecting the aggregation of messages related to relations. Additionally, the interaction information between entities and relations, as well as their distinctions, is overlooked during the updating of relations. To address these issues, we propose the Entity-Relation Aggregation Mechanism Graph Neural Network (ERAGNN), where relations are also considered as nodes in the graph for message aggregation. The ERAGNN layer comprises an entity aggregation sublayer and a relation aggregation sublayer. The entity aggregation sublayer employs an entity-relation composition operation to aggregate messages across entity nodes, while the relation aggregation sublayer utilizes an entity-entity composition operation. Furthermore, shared-weight matrices are implemented to enhance interactions between entities and relations. Lastly, an attention mechanism is incorporated to differentiate neighboring messages during the update of relation embeddings. Experimental results demonstrate that ERAGNN achieves state-of-the-art link prediction performance on three benchmark datasets: FB15k-237, WN18RR, and WN18.
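To make the layer structure described above concrete, the following is a minimal illustrative sketch (not the authors' implementation) of one ERAGNN-style layer in numpy. It assumes a TransE-style element-wise subtraction for both the entity-relation and entity-entity composition operations, a single shared weight matrix `W` standing in for the paper's shared-weight matrices, and a simple additive attention vector `a` for weighting a relation's neighboring messages; all of these parameterizations are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def eragnn_layer(E, R, triples, W, a):
    """One ERAGNN-style layer (illustrative sketch).

    E: (n_ent, d) entity embeddings; R: (n_rel, d) relation embeddings;
    triples: list of (head, rel, tail) index tuples;
    W: (d, d) weight matrix shared by both sublayers (assumption);
    a: (2d,) attention vector (assumed parameterization).
    """
    n_ent, d = E.shape

    # Entity aggregation sublayer: each head entity aggregates
    # entity-relation compositions (here: element-wise subtraction)
    # of its neighboring (tail, relation) pairs.
    agg = np.zeros_like(E)
    deg = np.zeros(n_ent)
    for h, r, t in triples:
        agg[h] += (E[t] - R[r]) @ W
        deg[h] += 1
    E_new = np.tanh(E @ W + agg / np.maximum(deg, 1)[:, None])

    # Relation aggregation sublayer: each relation aggregates
    # entity-entity compositions of the (head, tail) pairs it links,
    # with attention scores differentiating the neighboring messages.
    R_new = np.zeros_like(R)
    for r in range(R.shape[0]):
        msgs = [(E[t] - E[h]) @ W for h, rr, t in triples if rr == r]
        if not msgs:
            R_new[r] = R[r]  # no incident triples: keep embedding
            continue
        msgs = np.stack(msgs)
        scores = np.array([a @ np.concatenate([R[r], m]) for m in msgs])
        alpha = softmax(scores)          # attention over neighbors
        R_new[r] = np.tanh(alpha @ msgs) # weighted aggregation
    return E_new, R_new
```

Stacking such layers and scoring the resulting embeddings with a decoder (e.g. a ConvE-style scorer) would yield a full link-prediction model; the sketch only shows the message-aggregation step for entities and relations.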
Availability of data and materials
Link to data is included in the manuscript.
Acknowledgements
We would like to thank Professors Xin Wang and Zhiyong Feng of the College of Intelligence and Computing, Tianjin University, for their valuable help in revising the manuscript.
Funding
This research was partially funded by the Major Research Plan of the National Natural Science Foundation of China, No. 92471206.
Author information
Contributions
Conceptualization: Guozheng Rao, Li Zhang, Guoshun Xu, Qing Cong; Methodology: Guozheng Rao, Qing Cong, Guoshun Xu; Formal analysis and investigation: Qing Cong, Guoshun Xu; Writing - original draft preparation: Li Zhang, Guoshun Xu, Guozheng Rao, Qing Cong; Writing - review and editing: Li Zhang; Funding acquisition: Guozheng Rao; Resources: Qing Cong; Supervision: Guozheng Rao, Li Zhang
Ethics declarations
Conflicts of interest
The authors declare no conflicts of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Xu, G., Rao, G., Zhang, L. et al. Entity-relation aggregation mechanism graph neural network for knowledge graph embedding. Appl Intell 55, 43 (2025). https://doi.org/10.1007/s10489-024-05907-y