Graph-Based Dependency Parsing with Recursive Neural Network

  • Conference paper
  • In: Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data (CCL 2015, NLP-NABD 2015)

Abstract

Graph-based dependency parsing models have achieved state-of-the-art performance, yet their weakness in feature representation is clear: these models enforce strong independence assumptions on tree components and are therefore restricted to local, shallow features with limited contextual information. They also rely heavily on hand-crafted feature templates. In this paper, we extend recursive neural networks to dependency parsing, which allows us to efficiently represent the whole sub-tree context and rich structural information for each node. We propose a heuristic search procedure for decoding, and the model can also be used in a reranking framework. With words and POS tags as the only input features, it achieves significant improvement over the baseline models and shows advantages in capturing long-distance dependencies.
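
As a rough illustration of the core idea (not the authors' exact formulation: the composition function, matrix shapes, and names below are assumptions), a recursive network builds a vector for each head word by repeatedly composing it with the vectors of its dependents, so every node's representation summarizes its entire sub-tree rather than a fixed local window:

```python
# Minimal sketch of recursive composition over a dependency sub-tree.
# The single composition matrix W, the tanh nonlinearity, and all names
# are illustrative assumptions, not the paper's exact model.
import numpy as np

DIM = 50                                       # embedding / hidden size (assumed)
rng = np.random.default_rng(0)
W = rng.uniform(-0.01, 0.01, (DIM, 2 * DIM))   # composition weights
b = np.zeros(DIM)

def compose(head_vec, child_vec):
    """Fold one child's sub-tree vector into the head's representation."""
    return np.tanh(W @ np.concatenate([head_vec, child_vec]) + b)

def subtree_vector(head, children, embed):
    """Bottom-up vector for the sub-tree rooted at `head`.

    `children` maps a token to its list of dependents; `embed` maps a token
    (e.g. word plus POS tag) to its input vector.
    """
    vec = embed(head)
    for child in children.get(head, []):
        vec = compose(vec, subtree_vector(child, children, embed))
    return vec
```

Sub-tree vectors of this kind can then score candidate dependencies during search or rerank complete trees, which is where the contrast with fixed, hand-crafted feature templates comes from.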

Notes

  1. http://stp.lingfil.uu.se/~nivre/research/Penn2Malt.html.

  2. fanIn is the number of nodes in the incoming layer and fanOut is the number of nodes in the next layer (see the initialization sketch after these notes).

  3. UAS: Unlabelled Attachment Score. Following previous work, we exclude tokens whose POS tags are in {“” , ; .}.

  4. The win-over ratio is defined as \(r =\) (the number of dependencies our model gets right \(-\) the number the baseline gets right) / (the total number of dependencies at this distance). \(r>0\) indicates that our model performs better than the baseline at this distance; the higher the ratio, the greater our model's advantage (see the evaluation sketch after these notes).
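
Note 2's fanIn and fanOut typically parameterize the weight-initialization range; below is a minimal sketch of the Glorot-style uniform scheme such fan counts usually feed into. Whether the paper uses exactly this range is an assumption; the footnote only defines the two quantities.

```python
# Sketch of fan-based uniform weight initialization (Glorot-style).
# The exact range is an assumption; note 2 only defines fanIn and fanOut.
import numpy as np

def init_weight(fan_in, fan_out, rng=None):
    """Sample a (fan_out x fan_in) matrix uniformly from [-r, r],
    with r = sqrt(6 / (fanIn + fanOut))."""
    rng = rng or np.random.default_rng(0)
    r = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-r, r, (fan_out, fan_in))
```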
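Notes 3 and 4 describe the two evaluation quantities used in the paper; the sketch below spells them out. The punctuation tag set and the head-modifier distance convention are assumptions for illustration.

```python
# Sketch of the evaluation quantities in notes 3 and 4: UAS over
# non-punctuation tokens, and the win-over ratio at a given dependency
# distance. Tag set and distance convention are illustrative assumptions.
PUNCT_TAGS = {"``", "''", ",", ":", "."}   # tokens with these POS tags are skipped

def uas(gold_heads, pred_heads, pos_tags):
    """Unlabelled Attachment Score, excluding punctuation tokens."""
    kept = [i for i, tag in enumerate(pos_tags) if tag not in PUNCT_TAGS]
    correct = sum(1 for i in kept if pred_heads[i] == gold_heads[i])
    return correct / len(kept)

def win_over_ratio(gold_heads, our_heads, base_heads, distance):
    """r = (ours correct - baseline correct) / total, over dependencies
    whose head-modifier distance equals `distance`."""
    idx = [i for i, h in enumerate(gold_heads) if abs(h - i) == distance]
    ours = sum(1 for i in idx if our_heads[i] == gold_heads[i])
    base = sum(1 for i in idx if base_heads[i] == gold_heads[i])
    return (ours - base) / len(idx)
```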


Acknowledgments

This work is supported by the National Key Basic Research Program of China (2014CB340504) and the National Natural Science Foundation of China (61273318).

Author information


Correspondence to Baobao Chang.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Huang, P., Chang, B. (2015). Graph-Based Dependency Parsing with Recursive Neural Network. In: Sun, M., Liu, Z., Zhang, M., Liu, Y. (eds.) Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data. CCL/NLP-NABD 2015. Lecture Notes in Computer Science, vol. 9427. Springer, Cham. https://doi.org/10.1007/978-3-319-25816-4_19

  • DOI: https://doi.org/10.1007/978-3-319-25816-4_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25815-7

  • Online ISBN: 978-3-319-25816-4

  • eBook Packages: Computer Science (R0)
