Abstract
Deep learning-based Chinese zero pronoun resolution models have achieved better performance than traditional machine learning-based models. However, existing work on Chinese zero pronoun resolution has not yet integrated linguistic information well into deep learning-based models. This paper adopts a pre-training-based approach: we integrate the semantic representations produced by a pre-trained Chinese semantic dependency graph parser into a Chinese zero pronoun resolution model. Experimental results on the OntoNotes-5.0 dataset show that our model with the pre-trained Chinese semantic dependency parser improves the F-score by 0.4% over our baseline model and outperforms other deep learning-based Chinese zero pronoun resolution models. In addition, integrating BERT representations into our model improves performance by 0.7% over the baseline.
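The integration the abstract describes can be pictured as feature-level concatenation: per-token hidden states taken from the pre-trained semantic dependency graph parser are joined to the resolver's own word embeddings before its contextual encoder. The following is only a minimal sketch of that idea in PyTorch, not the paper's actual implementation; the class name SemanticAwareEncoder, the argument sdp_states, and the assumption that the parser exposes frozen per-token hidden states are ours.

```python
# Minimal sketch of feature-level integration (PyTorch).
# Assumption: the pre-trained semantic dependency parser exposes a
# per-token hidden state of size sdp_dim, which we keep frozen.
import torch
import torch.nn as nn

class SemanticAwareEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, sdp_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM over [word embedding ; parser representation]
        self.lstm = nn.LSTM(emb_dim + sdp_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids, sdp_states):
        # token_ids:  (batch, seq_len) word indices
        # sdp_states: (batch, seq_len, sdp_dim) frozen hidden states
        #             extracted from the pre-trained parser
        x = torch.cat([self.embed(token_ids), sdp_states], dim=-1)
        out, _ = self.lstm(x)
        # Contextual representations fed to the zero pronoun resolver;
        # BERT representations could be concatenated the same way.
        return out
```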
Acknowledgements
This research project is supported by the National Natural Science Foundation of China (61872402), the Humanities and Social Science Project of the Ministry of Education (17YJAZH068), the Science Foundation of Beijing Language and Culture University (supported by the Fundamental Research Funds for the Central Universities) (18ZDJ03), and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR).
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhang, L., Shen, Z., Shao, Y. (2020). Semantic-Aware Chinese Zero Pronoun Resolution with Pre-trained Semantic Dependency Parser. In: Sun, M., Li, S., Zhang, Y., Liu, Y., He, S., Rao, G. (eds.) Chinese Computational Linguistics. CCL 2020. Lecture Notes in Computer Science, vol. 12522. Springer, Cham. https://doi.org/10.1007/978-3-030-63031-7_2
DOI: https://doi.org/10.1007/978-3-030-63031-7_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-63030-0
Online ISBN: 978-3-030-63031-7
eBook Packages: Computer Science (R0)