Abstract
In this paper, we present the method our team proposed for Track 2 of NLPCC 2023 Shared Task 7, which focuses on extracting topic sentences at the paragraph level and for the whole essay in middle school student essays. We propose a two-stage topic sentence extraction framework. In the first stage, we extract a topic sentence for each paragraph, taking both local semantics and context into account. In the second stage, we derive the topic sentence of the whole essay from the extracted paragraph-level topic sentences. Compared with a one-stage method, the two-stage approach can focus on the local semantic information of each paragraph, giving it an advantage in both paragraph-level and essay-level topic sentence extraction. Comparative experiments show that our fine-tuned two-stage framework outperforms few-shot large language models such as GPT-3.5. Our method also achieved the first-place result on the final comprehensive metric of this track.
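The two-stage pipeline described above can be sketched as follows. This is a minimal illustration only: the paper's actual system scores sentences with a fine-tuned pretrained encoder, whereas here `overlap` is a stand-in lexical-overlap heuristic, and all function names are hypothetical.

```python
def overlap(sentence: str, context: str) -> float:
    """Toy relevance score: fraction of the sentence's words
    that also appear in the context (stands in for a learned scorer)."""
    s_words = set(sentence.split())
    c_words = set(context.split())
    return len(s_words & c_words) / (len(s_words) or 1)

def extract_paragraph_topic(sentences: list[str]) -> str:
    """Stage 1: pick the sentence most representative of its paragraph,
    scoring each sentence against the rest of the paragraph."""
    def score(s: str) -> float:
        rest = " ".join(t for t in sentences if t is not s)
        return overlap(s, rest)
    return max(sentences, key=score)

def extract_essay_topic(paragraphs: list[list[str]]) -> tuple[list[str], str]:
    """Stage 2: choose the essay-level topic sentence from the
    paragraph-level candidates produced in stage 1."""
    candidates = [extract_paragraph_topic(p) for p in paragraphs]
    def score(s: str) -> float:
        others = " ".join(t for t in candidates if t is not s)
        return overlap(s, others)
    return candidates, max(candidates, key=score)
```

The key design point the abstract argues for is visible here: stage 1 only ever sees one paragraph at a time (local semantics), and stage 2 operates on the much smaller candidate set rather than the full essay.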
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Dong, Y., Zheng, F., Chen, H., Ding, Y., Zhou, Y., He, H. (2023). Two-Stage Topic Sentence Extraction for Chinese Student Essays. In: Liu, F., Duan, N., Xu, Q., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2023. Lecture Notes in Computer Science(), vol 14304. Springer, Cham. https://doi.org/10.1007/978-3-031-44699-3_24
DOI: https://doi.org/10.1007/978-3-031-44699-3_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-44698-6
Online ISBN: 978-3-031-44699-3