Position-Aware Attention Mechanism–Based Bi-graph for Dialogue Relation Extraction

Published in Cognitive Computation

Abstract

Relation extraction in a dialogue scenario aims to extract the relations between entities in a multi-turn dialogue. Unlike conventional relation extraction, a dialogue relation usually cannot be determined from a single sentence, so modeling the multi-turn dialogue as a whole is essential for reasoning. However, dialogue relation extraction is prone to referential ambiguity because of the low information density of dialogue data and the large amount of pronominal reference in dialogue. In addition, most existing models consider only token-level interaction and do not fully exploit the interaction between utterances. To address these issues, this paper proposes a graph neural network–based dialogue relation extraction model with a position-aware refinement mechanism (PAR-DRE). First, PAR-DRE models the dependencies between speaker-related information and the individual utterances, and introduces pronominal reference information to expand the dialogue into a heterogeneous reference dialogue graph. Second, a position-aware refinement mechanism is introduced to capture more discriminative node features that encode relative position information. On this basis, an entity graph is built by merging the above nodes, and a path reasoning mechanism is used to infer the relations between entities in the dialogue. Experimental results on the dialogue dataset show that the F1 score of this method improves by 1.25% over current mainstream approaches.
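The abstract does not spell out the refinement equations, so the NumPy snippet below is only a rough, hypothetical sketch of the general idea: one attention update over a dialogue graph in which each neighbor's key is shifted by a relative-position embedding, so that the updated node features depend on where two nodes sit relative to each other. All names, shapes, and the position-encoding scheme here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def position_aware_attention(H, A, P, Wq, Wk, Wv):
    """One position-aware attention update over a graph.

    H : (n, d) node features; A : (n, n) adjacency mask;
    P : (n, n, d) relative-position embeddings between node pairs.
    Keys are shifted by the pairwise position embedding, so the
    attention weight between two nodes reflects both their content
    similarity and their relative location in the dialogue.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d = Q.shape[-1]
    # scores[i, j]: content score plus position-dependent shift
    scores = (Q[:, None, :] * (K[None, :, :] + P)).sum(-1) / np.sqrt(d)
    scores = np.where(A > 0, scores, -1e9)  # attend only along graph edges
    att = softmax(scores)
    return att @ V

n, d = 4, 8
H = rng.normal(size=(n, d))
A = np.ones((n, n))  # fully connected toy graph
# Relative-position table indexed by (i - j), one common encoding choice
pos_table = rng.normal(size=(2 * n - 1, d))
idx = np.arange(n)
P = pos_table[idx[:, None] - idx[None, :] + n - 1]
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H_new = position_aware_attention(H, A, P, Wq, Wk, Wv)
print(H_new.shape)  # (4, 8)
```

Stacking several such updates, and restricting `A` to edges of the heterogeneous reference dialogue graph, would give a refinement layer in the spirit the abstract describes.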




Data Availability

The data that support the findings of this study are openly available in DialogRE at https://dataset.org/dialogre/.
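Each DialogRE record pairs a list of dialogue turns with a list of annotated relation triples. The toy example below mimics that layout; the field names `x`, `y`, and `r` are assumed from the public DialogRE release, not taken from this article.

```python
# A toy record mimicking the DialogRE release layout (assumed here:
# each item pairs a list of utterances with a list of relation dicts).
record = [
    ["Speaker 1: Hi, this is my brother Ross.",
     "Speaker 2: Nice to meet you, Ross."],
    [{"x": "Speaker 1", "y": "Ross", "r": ["per:siblings"]}],
]

dialogue, relations = record
pairs = [(rel["x"], rel["y"], rel["r"][0]) for rel in relations]
print(pairs)  # [('Speaker 1', 'Ross', 'per:siblings')]
```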


Funding

This work was supported by the National Natural Science Foundation of China (No. U19A2059).

Author information


Corresponding author

Correspondence to Tianxi Huang.

Ethics declarations

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of Interest

Guiduo Duan has received research grants from the National Natural Science Foundation of China (No. U19A2059). Yunrui Dong, Jiayu Miao, and Tianxi Huang declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Duan, G., Dong, Y., Miao, J. et al. Position-Aware Attention Mechanism–Based Bi-graph for Dialogue Relation Extraction. Cogn Comput 15, 359–372 (2023). https://doi.org/10.1007/s12559-022-10105-4
