
Exploiting event-aware and role-aware with tree pruning for document-level event extraction

  • Original Article
  • Journal: Neural Computing and Applications

Abstract

Document-level event extraction aims to extract event information from an entire passage and is more challenging than sentence-level event extraction. Despite some success, existing document-level event extraction methods still face two shortcomings: (a) insufficient dependencies: the dependencies between event types and role types within a document are not fully exploited; (b) redundancy: redundant role features hinder performance improvement. In this paper, we propose dual-channel Conditional Interaction with Tree Pruning (CITP) to address both challenges simultaneously. For the insufficient-dependencies issue, CITP constructs a dual-channel conditional interaction module that extracts event types and role types from a document simultaneously, improving the feature interaction between them; it also effectively enhances candidate argument features and sentence features with event-aware and role-aware information. For the redundancy issue, CITP expands the ordered tree and uses the classified role types as guidance to prune unnecessary branches, which reduces the impact of redundant role features and improves event extraction performance. Experimental results on the widely used ChFinAnn dataset show that our model achieves state-of-the-art performance compared with other models.
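The role-guided pruning idea described above can be sketched as follows. This is a minimal illustration with hypothetical names, not the authors' implementation: an ordered tree of candidate (role type, argument) branches is expanded, and any branch whose role type was not produced by the role classifier is cut before argument decoding, so redundant role features are never processed.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    """One node of the ordered argument tree: a (role type, candidate argument) pair."""
    role: str                       # role type label, or "ROOT" for the virtual root
    argument: Optional[str] = None  # candidate argument span; None for the root
    children: List["Node"] = field(default_factory=list)

def prune(node: Node, predicted_roles: set) -> Optional[Node]:
    """Keep a branch only if its role type was predicted for this event type;
    cutting a node discards its entire subtree of redundant role features."""
    if node.role != "ROOT" and node.role not in predicted_roles:
        return None
    node.children = [
        c for c in (prune(child, predicted_roles) for child in node.children)
        if c is not None
    ]
    return node

# Usage: expand candidate branches, then prune with the classifier's role types.
root = Node("ROOT")
root.children = [
    Node("Pledger", "CompanyA"),
    Node("PledgedShares", "1000"),
    Node("Date", "2020-01-01"),
]
pruned = prune(root, predicted_roles={"Pledger", "Date"})
```

In this sketch only the "Pledger" and "Date" branches survive; the "PledgedShares" branch is removed because its role type was not among the classified role types.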



Acknowledgements

We thank the support of the National Key R&D Program of China (Grant No. 2021YFB3900504).

Author information

Corresponding author

Correspondence to Zequn Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Lv, J., Zhang, Z., Xu, G. et al. Exploiting event-aware and role-aware with tree pruning for document-level event extraction. Neural Comput & Applic 35, 11061–11072 (2023). https://doi.org/10.1007/s00521-023-08282-w
