DOI: 10.1145/3446132.3446404

research-article

Domain Adaptation for Tibetan-Chinese Neural Machine Translation

Published: 09 March 2021

ABSTRACT

The meaning of the same word or sentence is likely to change across semantic contexts, which makes it difficult for a general-purpose translation system to maintain stable performance across domains. Domain adaptation is therefore an essential research topic in neural machine translation practice. To train translation models for different domains efficiently, in this work we take a Tibetan-Chinese general translation model as the parent model and obtain two domain-specific Tibetan-Chinese translation models using small-scale in-domain data. The empirical results indicate that this method is a promising approach to domain adaptation in low-resource scenarios, yielding better BLEU scores as well as faster training than our general baseline models.
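The parent-child training scheme summarized above (train a parent model on general-domain data, then continue training from its parameters on a small in-domain corpus) can be illustrated with a toy example. The linear model, synthetic data, and simulated domain shift below are illustrative assumptions for the sketch, not the paper's actual Tibetan-Chinese NMT setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.1, epochs=200):
    """Plain gradient descent for a linear model y ~ X @ w."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Large "general-domain" corpus used to train the parent model.
X_general = rng.normal(size=(1000, 4))
w_true_general = np.array([1.0, -2.0, 0.5, 3.0])
y_general = X_general @ w_true_general

# Parent model: trained from scratch on the general-domain data.
w_parent = train(np.zeros(4), X_general, y_general)

# Small in-domain corpus whose "true" mapping is slightly shifted.
X_domain = rng.normal(size=(50, 4))
w_true_domain = w_true_general + np.array([0.2, 0.0, -0.3, 0.1])
y_domain = X_domain @ w_true_domain

# Child model: initialized from the parent, then fine-tuned briefly
# on the in-domain data; compare with training from scratch.
w_scratch = train(np.zeros(4), X_domain, y_domain, epochs=20)
w_child = train(w_parent.copy(), X_domain, y_domain, epochs=20)

def in_domain_mse(w):
    return float(np.mean((X_domain @ w - y_domain) ** 2))

# Starting from the parent's parameters, the child reaches a lower
# in-domain error in the same (small) number of updates.
print(in_domain_mse(w_child), in_domain_mse(w_scratch))
```

In the sketch, the parent initialization starts much closer to the in-domain optimum than a random or zero initialization, so the child converges in far fewer updates, mirroring the faster training the abstract reports for the domain-specific models.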


  • Published in

    ACAI '20: Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence
    December 2020
    576 pages
    ISBN:9781450388115
    DOI:10.1145/3446132

    Copyright © 2020 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Qualifiers

    • research-article
    • Research
    • Refereed limited

    Acceptance Rates

Overall acceptance rate: 173 of 395 submissions, 44%
