Abstract
Data sparsity is a fundamental factor limiting the quality of low-resource neural machine translation (NMT) models. Transfer learning can alleviate data sparsity by introducing external knowledge, but pre-trained parameters are tailored to the original task set and do not guarantee improvement on downstream tasks. Meta-learning methods have greater potential; however, their meta-parameters are determined by second-order gradient terms computed for each specific task, which consumes substantial computing resources. In addition, the integration and unified representation of external knowledge is another key factor in improving performance. We therefore propose a fast meta-learning method using a multiple-aligned word-embedding representation, which maps all languages into the word-embedding space of the target language without a seed dictionary. Meanwhile, we update the meta-parameters by accumulating gradients across different tasks, replacing the second-order term of ordinary meta-learning; this preserves the method's potential while improving computational efficiency. We conducted experiments on three low-resource translation tasks from the CCMT2019 dataset and found that our method significantly improves model quality compared with traditional methods, demonstrating its effectiveness.
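The abstract's core idea, replacing the second-order meta-gradient with gradients accumulated across tasks, resembles first-order meta-learning schemes such as Reptile. The sketch below is an illustrative toy implementation under that assumption, not the paper's actual training code: each task fine-tunes a copy of the shared meta-parameters with a few SGD steps, and the meta-parameters then move along the averaged parameter change, so no second-order derivatives are ever computed.

```python
import numpy as np

def inner_update(theta, task_grad_fn, lr=0.1, steps=3):
    # Task-specific adaptation: a few plain SGD steps starting from the
    # shared meta-parameters theta. task_grad_fn returns the gradient of
    # that task's loss at the given parameter vector.
    phi = theta.copy()
    for _ in range(steps):
        phi -= lr * task_grad_fn(phi)
    return phi

def meta_update(theta, task_grad_fns, meta_lr=0.5, **inner_kwargs):
    # First-order meta step: accumulate the parameter displacement
    # (phi_i - theta) produced by each task's inner loop, then move theta
    # along the averaged displacement. This stands in for the second-order
    # term of MAML-style meta-learning.
    deltas = [inner_update(theta, g, **inner_kwargs) - theta
              for g in task_grad_fns]
    return theta + meta_lr * np.mean(deltas, axis=0)

# Toy demonstration with two quadratic "tasks" whose optima differ:
# loss_i(p) = 0.5 * ||p - c_i||^2, so the gradient is simply p - c_i.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
tasks = [lambda p, c=c: p - c for c in centers]

theta = np.zeros(2)
for _ in range(50):
    theta = meta_update(theta, tasks)
# theta converges toward the mean of the task optima, [0.5, 0.5]:
# a shared initialization from which each task is quickly reachable.
```

In an NMT setting the parameter vector would be the translation model's weights and each task a language pair, but the accumulation-and-average structure of the meta step is the same.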
Notes
- 4. The Vietnamese corpus has 0.8 million Vietnamese sentences and 10 million Vietnamese monosyllables.
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Wu, N., Hou, H., Zheng, W., Sun, S. (2021). Low-Resource Neural Machine Translation Using Fast Meta-learning Method. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Lecture Notes in Computer Science, vol 13111. Springer, Cham. https://doi.org/10.1007/978-3-030-92273-3_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-92272-6
Online ISBN: 978-3-030-92273-3
eBook Packages: Computer Science; Computer Science (R0)