
Employing Multi-granularity Features to Extract Entity Relation in Dialogue

  • Conference paper
  • First Online:

Natural Language Processing and Chinese Computing (NLPCC 2021)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13028)


Abstract

Extracting relational triples from unstructured text is essential for constructing large-scale knowledge graphs, question answering, and other downstream tasks. Dialogue relation extraction aims to identify the relations between entities in multi-party dialogue text. Existing dialogue relation extraction models focus only on coarse-grained global information and ignore fine-grained local information. In this paper, we propose a dialogue relation extraction model, BERT-MG, which captures features at different granularities from different BERT layers in order to exploit fine-grained dialogue features. Moreover, we design a type-confidence mechanism that uses entity type information to assist relation inference. Experimental results on the DialogRE dataset show that our proposed BERT-MG outperforms the state-of-the-art (SOTA) baselines.
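To make the multi-granularity idea concrete, the sketch below (not the authors' released code) combines hidden states from a lower and an upper BERT layer, treating them as fine-grained local and coarse-grained global dialogue features before relation classification. It assumes the HuggingFace transformers and PyTorch APIs; the layer indices, mean pooling, and the MultiGranularityRelationClassifier head are illustrative assumptions, and the paper's type-confidence mechanism is omitted.

```python
# Minimal sketch of layer-wise (multi-granularity) feature extraction with BERT.
# Assumptions: HuggingFace transformers + PyTorch; layer choices and the linear
# head are illustrative, not the paper's actual architecture.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class MultiGranularityRelationClassifier(nn.Module):
    """Hypothetical classifier combining a lower (local) and upper (global) BERT layer."""

    def __init__(self, num_relations: int, local_layer: int = 4, global_layer: int = 12):
        super().__init__()
        # output_hidden_states=True exposes every encoder layer, not just the final one
        self.bert = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
        self.local_layer = local_layer    # lower layer ~ fine-grained local features (assumption)
        self.global_layer = global_layer  # top layer ~ coarse-grained global features (assumption)
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        hidden_states = outputs.hidden_states  # tuple: embeddings + one tensor per encoder layer
        mask = attention_mask.unsqueeze(-1).float()

        def mean_pool(layer_idx):
            h = hidden_states[layer_idx]  # (batch, seq_len, hidden)
            return (h * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

        local_feat = mean_pool(self.local_layer)
        global_feat = mean_pool(self.global_layer)
        # Concatenate fine- and coarse-grained features before relation scoring
        return self.classifier(torch.cat([local_feat, global_feat], dim=-1))


if __name__ == "__main__":
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = MultiGranularityRelationClassifier(num_relations=36)  # DialogRE defines 36 relation types
    batch = tokenizer(["Speaker 1: Hi, I am Emma. Speaker 2: Emma is my niece."],
                      return_tensors="pt", padding=True, truncation=True)
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([1, 36])
```

A full dialogue relation extraction model would condition on the candidate entity pair (and, per the abstract, on entity type confidence) rather than pooling the whole dialogue; this sketch only illustrates extracting features at different granularities from different BERT layers.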



Acknowledgments

The authors would like to thank the three anonymous reviewers for their comments on this paper. This research was supported by the National Natural Science Foundation of China (Nos. 61836007, 61772354, and 61773276) and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD).

Author information


Corresponding author

Correspondence to Peifeng Li.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Wang, Q., Li, P. (2021). Employing Multi-granularity Features to Extract Entity Relation in Dialogue. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds.) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science, vol. 13028. Springer, Cham. https://doi.org/10.1007/978-3-030-88480-2_21

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-88480-2_21

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88479-6

  • Online ISBN: 978-3-030-88480-2

  • eBook Packages: Computer Science, Computer Science (R0)
