Research article · DOI: 10.1145/3651671.3651687

A New Chinese Event Detection Method based on PMTNet

Published: 07 June 2024

Abstract

Event detection is key to improving the accuracy of information extraction tasks. However, trigger word identification models are often hindered by textual ambiguity and redundant information. This paper focuses on accurately identifying trigger words to improve the accuracy of event detection and extraction. First, a prompt information library for event detection is constructed from learnable continuous vectors. We then propose a Chinese event detection method that injects this prompt information into a multi-task learning network, establishing a link between the event prompt and the text through prompt attention. Experiments show that the proposed method outperforms several existing character-level sequence labeling models in Chinese event detection accuracy.
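The abstract describes the architecture only at a high level. As a rough illustration of the idea (a library of learnable continuous prompt vectors fused with character-level text encodings through a prompt attention, feeding multi-task heads for trigger labeling and event typing), a minimal PyTorch-style sketch follows. All names, layer sizes, and the head layout here are our own assumptions for illustration, not the authors' PMTNet implementation.

```python
# Minimal sketch, NOT the authors' PMTNet: a learnable continuous prompt
# library, a prompt (cross-)attention linking text and prompts, and
# multi-task heads for trigger labeling and event-type classification.
import torch
import torch.nn as nn

class PromptedEventDetector(nn.Module):  # hypothetical name
    def __init__(self, vocab_size=21128, hidden=768, num_prompts=16,
                 num_event_types=65, num_bio_labels=3):
        super().__init__()
        # Character-level encoder (stand-in for a pre-trained encoder such as BERT).
        self.char_embed = nn.Embedding(vocab_size, hidden)
        self.text_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=hidden, nhead=8, batch_first=True),
            num_layers=2)
        # Prompt information library: learnable continuous vectors, not discrete tokens.
        self.prompt_library = nn.Parameter(torch.randn(num_prompts, hidden) * 0.02)
        # Prompt attention: each character attends over the prompt vectors.
        self.prompt_attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        # Multi-task heads: per-character trigger (BIO) labels and sentence-level event type.
        self.trigger_head = nn.Linear(hidden, num_bio_labels)
        self.event_type_head = nn.Linear(hidden, num_event_types)

    def forward(self, char_ids):                                   # (batch, seq_len)
        h = self.text_encoder(self.char_embed(char_ids))           # (B, L, H)
        prompts = self.prompt_library.unsqueeze(0).expand(char_ids.size(0), -1, -1)
        # Link text and prompts: characters as queries, prompt vectors as keys/values.
        prompt_ctx, _ = self.prompt_attn(h, prompts, prompts)      # (B, L, H)
        fused = h + prompt_ctx
        trigger_logits = self.trigger_head(fused)                  # (B, L, num_bio_labels)
        event_logits = self.event_type_head(fused.mean(dim=1))     # (B, num_event_types)
        return trigger_logits, event_logits

# Example usage with random character ids.
model = PromptedEventDetector()
trigger_logits, event_logits = model(torch.randint(0, 21128, (2, 50)))
```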

Published In

ICMLC '24: Proceedings of the 2024 16th International Conference on Machine Learning and Computing
February 2024
757 pages
ISBN:9798400709234
DOI:10.1145/3651671

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Chinese Event Detection
  2. Continuous Prompt
  3. Information Extraction
  4. Multi-task Learning
  5. Transformer

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICMLC 2024
