Abstract
Extracting relational triples from unstructured text is essential for building large-scale knowledge graphs, question answering (QA), and other downstream tasks. Dialogue relation extraction aims to identify the relations between entities in multi-party dialogue text. Existing dialogue relation extraction models focus only on coarse-grained global information and ignore fine-grained local information. In this paper, we propose BERT-MG, a dialogue relation extraction model that captures features of different granularities at different BERT layers so as to exploit fine-grained dialogue features. Moreover, we design a type-confidence mechanism that uses entity type information to assist relation inference. Experimental results on the DialogRE dataset show that BERT-MG outperforms state-of-the-art baselines.
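The abstract's core idea is that different BERT layers encode information at different granularities. The following is a minimal sketch, not the authors' implementation, of how fine-grained (lower-layer) and coarse-grained (top-layer) features could be taken from BERT and combined; the layer choices, pooling scheme, and concatenation are assumptions for illustration only, using the HuggingFace transformers library.

```python
# Sketch: extracting multi-granularity features from different BERT layers.
# Layer indices and pooling are hypothetical choices, not the paper's design.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)

dialogue = "Speaker 1: Hi, I'm Monica. Speaker 2: Nice to meet you, I'm Chandler."
inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# hidden_states is a tuple: the embedding layer plus one tensor per encoder
# layer, each of shape (batch, seq_len, hidden_size).
hidden_states = outputs.hidden_states

# Hypothetical split: a lower layer as fine-grained local information,
# the top layer's [CLS] vector as coarse-grained global information.
fine_grained = hidden_states[4].mean(dim=1)    # lower layer, mean-pooled over tokens
coarse_grained = hidden_states[-1][:, 0, :]    # top layer, [CLS] representation

# Concatenate the two granularities before a relation classifier (assumed design).
multi_granularity = torch.cat([fine_grained, coarse_grained], dim=-1)
print(multi_granularity.shape)  # (1, 2 * hidden_size)
```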
Acknowledgments
The authors would like to thank the three anonymous reviewers for their comments on this paper. This research was supported by the National Natural Science Foundation of China (No. 61836007, 61772354 and 61773276), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Wang, Q., Li, P. (2021). Employing Multi-granularity Features to Extract Entity Relation in Dialogue. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science, vol 13028. Springer, Cham. https://doi.org/10.1007/978-3-030-88480-2_21
DOI: https://doi.org/10.1007/978-3-030-88480-2_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-88479-6
Online ISBN: 978-3-030-88480-2