Abstract
Dependency parsing aims to identify syntactic relationships between the words of a sentence. In this paper, we propose a novel graph-based end-to-end dependency parsing model that combines a POS tagger with a Joint Bilinear Model (JBM). Drawing on prior POS knowledge from the dataset, we use the POS tagging results to guide the training of the JBM. To narrow the gap between edge and label prediction, we feed the knowledge captured during label prediction back into the JBM. Motivated by the success of deep contextualized word embeddings, this work also fine-tunes BERT for dependency parsing. Our model achieves 96.85% UAS and 95.01% LAS on the English PTB dataset. Moreover, experiments on the Universal Dependencies dataset indicate that our model also reaches state-of-the-art performance on dependency parsing and POS tagging.
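The abstract describes the architecture only at a high level. As a concrete illustration of the bilinear edge-scoring idea underlying graph-based parsers of this family (in the spirit of deep biaffine attention), the following is a minimal sketch, not the authors' implementation: the class name, dimensions, encoder choice, and the use of PyTorch are illustrative assumptions.

```python
# Minimal sketch of a biaffine/bilinear edge scorer for graph-based dependency
# parsing. Illustrative only; names and dimensions are assumptions, not details
# taken from the paper.
import torch
import torch.nn as nn


class BiaffineEdgeScorer(nn.Module):
    """Scores every (dependent, head) word pair in a sentence."""

    def __init__(self, hidden_dim: int = 400, arc_dim: int = 500):
        super().__init__()
        # Separate projections for a word acting as a dependent vs. as a head.
        self.dep_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        self.head_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        # Bilinear weight; the extra row handles a bias feature on the head side.
        self.W = nn.Parameter(torch.empty(arc_dim + 1, arc_dim))
        nn.init.xavier_uniform_(self.W)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_dim) contextual word representations,
        # e.g. from a BiLSTM or a fine-tuned BERT encoder.
        dep = self.dep_mlp(hidden)                                  # (B, N, d)
        head = self.head_mlp(hidden)                                # (B, N, d)
        ones = torch.ones(*head.shape[:-1], 1, device=head.device)
        head = torch.cat([head, ones], dim=-1)                      # (B, N, d+1)
        # scores[b, i, j] = head_j^T W dep_i: score of word j heading word i.
        scores = torch.bmm(dep, (head @ self.W).transpose(1, 2))    # (B, N, N)
        return scores


# Toy usage: arc_scores[b, i].argmax(-1) picks a head for each word; training
# would use cross-entropy against the gold head indices.
scorer = BiaffineEdgeScorer()
arc_scores = scorer(torch.randn(2, 10, 400))   # two sentences of ten tokens
print(arc_scores.shape)                        # torch.Size([2, 10, 10])
```

A label classifier of the same bilinear form, applied to the selected head-dependent pairs, would complete an edge-plus-label parser of this kind.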
Acknowledgement
This work was supported in part by the National Natural Science Foundation of China (Grant Nos. U1636211, 61672081, 61370126), the Beijing Advanced Innovation Center for Imaging Technology (Grant No. BAICIT-2016001), and the Fund of the State Key Laboratory of Software Development Environment (Grant No. SKLSDE-2019ZX-17).
About this paper
Cite this paper
Gao, Y., Zhang, K., Li, Z. (2020). Joint Bilinear End-to-End Dependency Parsing with Prior Knowledge. In: Zhu, X., Zhang, M., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2020. Lecture Notes in Computer Science, vol 12431. Springer, Cham. https://doi.org/10.1007/978-3-030-60457-8_11
DOI: https://doi.org/10.1007/978-3-030-60457-8_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-60456-1
Online ISBN: 978-3-030-60457-8