ABSTRACT
The meaning of a word or sentence often shifts with its semantic context, which makes it difficult for a general-purpose translation system to maintain stable performance across domains. Domain adaptation is therefore an essential research topic in Neural Machine Translation practice. To train translation models for different domains efficiently, in this work we take a general Tibetan-Chinese translation model as the parent model and continue training it on small-scale in-domain data to obtain two domain-specific Tibetan-Chinese translation models. The empirical results indicate that this method is a practical approach to domain adaptation in low-resource scenarios, yielding higher BLEU scores and faster training than our general baseline models.
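As a rough illustration of this parent-child setup, the sketch below continues training a general-domain (parent) checkpoint on a small in-domain parallel corpus with the Hugging Face Trainer API. This is only a minimal sketch of the technique, not the configuration used in the paper: the checkpoint directory, data file, field names, and hyperparameters are hypothetical placeholders.

```python
# Hedged sketch: domain adaptation by continued training of a general
# Tibetan-Chinese parent model on a small in-domain corpus.
# "checkpoints/parent-bo-zh" and "news_domain.train.jsonl" are placeholders,
# not artifacts released with the paper.
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          Seq2SeqTrainingArguments, Seq2SeqTrainer,
                          DataCollatorForSeq2Seq)
from datasets import load_dataset

parent_dir = "checkpoints/parent-bo-zh"            # general-domain parent model
tokenizer = AutoTokenizer.from_pretrained(parent_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(parent_dir)  # child starts from parent weights

# Small in-domain parallel corpus; JSON lines with "bo" (Tibetan source)
# and "zh" (Chinese target) fields -- an assumed data format.
raw = load_dataset("json", data_files={"train": "news_domain.train.jsonl"})

def preprocess(batch):
    enc = tokenizer(batch["bo"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["zh"], truncation=True, max_length=128)
    enc["labels"] = labels["input_ids"]
    return enc

train_set = raw["train"].map(preprocess, batched=True,
                             remove_columns=raw["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="checkpoints/news-domain-bo-zh",
    learning_rate=5e-5,                  # smaller than typical parent training
    num_train_epochs=5,
    per_device_train_batch_size=16,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_set,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()  # only the data changes; the architecture is inherited from the parent
```

Because the child model reuses the parent's vocabulary and weights, it typically converges in far fewer updates than training from scratch, which is the source of the training-speed gain reported above.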