
Multi-granularity bidirectional attention stream machine comprehension method for emotion cause extraction

  • Original Article
  • Neural Computing and Applications

Abstract

Emotion cause extraction aims to identify the cause that triggers the emotion expressed by the person described in a text. The task plays a critical role in natural language processing applications such as sentiment analysis and semantic comprehension. However, most existing methods for emotion cause extraction rely on feature engineering and ignore the latent semantic relation between the emotion word and its context, which limits performance. In this paper, we propose a novel multi-granularity bidirectional attention stream network built on a machine comprehension framework to address this problem. The context and the query are embedded through a multistage hierarchical process operating at several fine-grained levels of representation. A bidirectional attention stream mechanism is then applied to obtain an emotion-query-aware context representation. We conduct extensive experiments on a publicly available Chinese emotion cause dataset. The experimental results demonstrate that our approach significantly outperforms state-of-the-art methods and effectively extracts emotion causes.
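To make the bidirectional attention stream idea concrete, the following is a minimal PyTorch sketch of a BiDAF-style bidirectional attention layer of the kind the abstract describes: it takes encoded context and emotion-query sequences, computes context-to-query and query-to-context attention, and returns a query-aware context representation. This is an illustration under our own assumptions (the module name BiAttention, a trilinear similarity function, and the [H; c2q; H*c2q; H*q2c] fusion), not the authors' implementation; the paper's actual layer sizes and fusion scheme may differ.

    # Sketch only: a BiDAF-style bidirectional attention layer (assumed, not the paper's exact code).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BiAttention(nn.Module):
        """Bidirectional attention between a context and an emotion query."""

        def __init__(self, hidden_dim):
            super().__init__()
            # Trainable similarity function S(h, u) = w^T [h; u; h * u]
            self.w = nn.Linear(3 * hidden_dim, 1, bias=False)

        def forward(self, context, query):
            # context: (batch, T, d) encoded clauses; query: (batch, J, d) encoded emotion expression
            T, J = context.size(1), query.size(1)
            c = context.unsqueeze(2).expand(-1, -1, J, -1)               # (batch, T, J, d)
            q = query.unsqueeze(1).expand(-1, T, -1, -1)                 # (batch, T, J, d)
            sim = self.w(torch.cat([c, q, c * q], dim=-1)).squeeze(-1)   # (batch, T, J)

            # Context-to-query attention: each context position attends over the query tokens.
            c2q = torch.bmm(F.softmax(sim, dim=-1), query)               # (batch, T, d)

            # Query-to-context attention: weight context positions by their best query match.
            b = F.softmax(sim.max(dim=-1).values, dim=-1)                # (batch, T)
            q2c = torch.bmm(b.unsqueeze(1), context).expand(-1, T, -1)   # (batch, T, d)

            # Query-aware context representation G = [H; c2q; H * c2q; H * q2c].
            return torch.cat([context, c2q, context * c2q, context * q2c], dim=-1)

As a quick sanity check, with a context of shape (2, 30, 128) and a query of shape (2, 5, 128), BiAttention(128) returns a tensor of shape (2, 30, 512), which a downstream encoder can then consume to score candidate cause clauses.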


Notes

  1. http://hlt.hitsz.edu.cn/?page_id=694.

  2. http://news.sina.com.cn/society/.


Acknowledgements

This work is partially supported by grants from the Natural Science Foundation of China (Nos. 61632011, 61572102, 61702080, 61602079, 61806038), the Ministry of Education Humanities and Social Science Project (Nos. 16YJCZH12, 18YJCZH208, 19YJCZH199), the Fundamental Research Funds for the Central Universities (DUT18ZD102), the National Key Research and Development Program of China (No. 2016YFB1001103), the China Postdoctoral Science Foundation (Nos. 2018M631788, 2018M641691), and the Foundation of State Key Laboratory of Cognitive Intelligence, iFLYTEK, P.R. China (COGOS-20190001, Intelligent Medical Question Answering based on User Profiling and Knowledge Graph).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Hongfei Lin.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Human and animal rights

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Diao, Y., Lin, H., Yang, L. et al. Multi-granularity bidirectional attention stream machine comprehension method for emotion cause extraction. Neural Comput & Applic 32, 8401–8413 (2020). https://doi.org/10.1007/s00521-019-04308-4
