Abstract
Joint extraction of entities and relations aims to recognize relational triples (subject s, relation r, object o) from unstructured text. In a correct relational triple, the entity pair (s, o) does not occur independently: the subject and object depend on each other. However, existing approaches usually model entity pairs only by sharing an encoder layer, which is insufficient to exploit this intrinsic connection. To address this problem, we propose to strengthen the entity pairs connection (SEPC) by exploiting the duality property of entity pairs, which further improves joint extraction. Entity pair recognition is transformed into finding the subject conditioned on the object and finding the object conditioned on the subject, and dual supervised learning is introduced to model their connection. We demonstrate the effectiveness of the proposed method on two widely used datasets, NYT and WebNLG (code and data available at https://github.com/zjp9574/SEPC).
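The duality constraint described above can be sketched as a regularizer in the style of dual supervised learning (Xia et al., 2017): the two factorizations of the joint probability of an entity pair, p(s)p(o|s) and p(o)p(s|o), must agree, so their log-space gap is penalized. The function below is an illustrative sketch, not the paper's implementation; all names are hypothetical.

```python
import math

def dual_regularizer(log_p_s, log_p_o_given_s, log_p_o, log_p_s_given_o):
    """Squared duality gap between the two factorizations of log p(s, o).

    For a consistent joint distribution over entity pairs,
    log p(s) + log p(o|s) == log p(o) + log p(s|o), so this penalty is
    zero exactly when the subject-conditioned and object-conditioned
    models agree.
    """
    gap = (log_p_s + log_p_o_given_s) - (log_p_o + log_p_s_given_o)
    return gap ** 2

# Consistent factorizations of a joint probability p(s, o) = 0.1
# with marginals p(s) = 0.5 and p(o) = 0.25 give a ~zero penalty:
r = dual_regularizer(math.log(0.5), math.log(0.1 / 0.5),
                     math.log(0.25), math.log(0.1 / 0.25))
# r is ~0 (up to floating-point error)
```

In training, such a term would be added to the two conditional models' losses so that the subject-finding and object-finding directions regularize each other.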
Supported by the National Key Research and Development Program of China (grant No. 2016YFB0801003), the Strategic Priority Research Program of Chinese Academy of Sciences (grant No. XDC02040400), the Key Research and Development Program for Guangdong Province (grant No. 2019B010137003), and the National Natural Science Foundation of China (grant No. 61902394).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Zhao, J., Zhang, P., Liu, T., Shi, J. (2021). SEPC: Improving Joint Extraction of Entities and Relations by Strengthening Entity Pairs Connection. In: Karlapalem, K., et al. (eds.) Advances in Knowledge Discovery and Data Mining. PAKDD 2021. Lecture Notes in Computer Science, vol. 12712. Springer, Cham. https://doi.org/10.1007/978-3-030-75762-5_64
DOI: https://doi.org/10.1007/978-3-030-75762-5_64
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-75761-8
Online ISBN: 978-3-030-75762-5