The Direct Path May Not Be The Best: Portuguese-Chinese Neural Machine Translation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11805)

Abstract

Machine Translation (MT) has been one of the classic AI tasks since the early days of the field. Portuguese and Chinese are languages with very large numbers of native speakers, but this is not reflected in the amount of literature on their processing or in the amount of resources available for them, particularly when compared with English. In this paper, we address the feasibility of creating an MT system for Portuguese-Chinese using only freely available resources, by experimenting with various approaches to pairing source and target parallel data during training. These approaches are (i) using a separate model for each source-target language pair, (ii) using an intermediate pivot language, and (iii) using a single model that can translate from any language seen on the source side to any language seen on the target side. We find approaches whose performance is higher than that of a strong baseline, namely the MT service provided by an IT industry giant, for the Portuguese-Chinese pair.
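As a rough illustration of how these three configurations differ at translation time, the sketch below contrasts them for the Portuguese-to-Chinese direction. NMTModel, the model names, and the <2zh> target-language tag are hypothetical placeholders rather than the paper's actual implementation; the target-tag mechanism follows the multilingual setup of Johnson et al. [4].

    # Hypothetical sketch contrasting the three pairing strategies described in the
    # abstract. NMTModel is a stand-in for a real NMT system (e.g. one trained with
    # Marian [5]); its translate() is stubbed so the example stays self-contained.

    class NMTModel:
        def __init__(self, name: str):
            self.name = name

        def translate(self, sentence: str) -> str:
            # Placeholder for real decoding; a trained model would return a translation.
            return f"[{self.name}] {sentence}"

    def translate_direct(sentence_pt: str) -> str:
        # (i) One dedicated model per language pair: pt -> zh in a single step.
        return NMTModel("pt-zh").translate(sentence_pt)

    def translate_pivot(sentence_pt: str) -> str:
        # (ii) Pivot through a resource-rich intermediate language (English):
        # pt -> en with one model, then en -> zh with another.
        english = NMTModel("pt-en").translate(sentence_pt)
        return NMTModel("en-zh").translate(english)

    def translate_many_to_many(sentence_pt: str) -> str:
        # (iii) A single model trained on all language pairs; the desired target
        # language is signalled by a tag prepended to the source sentence,
        # in the spirit of Johnson et al. [4].
        return NMTModel("many-to-many").translate("<2zh> " + sentence_pt)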


Notes

  1. If creating an MT system for many languages, this approach requires only two models per language, far fewer than when using a separate model for each language pair.

  2. UM-Corpus [11], for zh-en, and UM-PCorpus [3], for zh-pt, should not be confused.

  3. https://paracrawl.eu/.

  4. Despite this 50% reduction in the size of the corpus, training the many-to-many model took around 808 GPU hours (more than 33 days) to converge.

  5. In the most recent WMT edition, WMT 2018 [2], 33 of the 38 systems used deep neural models, and 29 of those 33 were based on the Transformer model.

  6. https://portulanclarin.net/workbench/lx/translator.

  7. https://github.com/fxsjy/jieba.

  8. https://github.com/alvations/sacremoses.

  9. https://github.com/rsennrich/subword-nmt (a preprocessing sketch using the tools in notes 7 to 9 follows this list).

  10. https://translate.google.com/.
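The tools in notes 7 to 9 are typically combined into a preprocessing pipeline before training: Chinese is segmented into words with jieba, Portuguese is tokenized with sacremoses, and both sides are split into subword units with BPE [9] via subword-nmt. The sketch below shows one plausible way to wire these tools together; the file names, the merge count, and the choice to learn separate BPE codes per language are assumptions for illustration and may differ from the paper's actual setup.

    # A plausible preprocessing pipeline built from the tools in notes 7 to 9.
    # File names, the 32k merge count, and the exact wiring are illustrative
    # assumptions, not taken from the paper.
    import jieba                                   # Chinese word segmentation (note 7)
    from sacremoses import MosesTokenizer          # Moses-style tokenization (note 8)
    from subword_nmt.learn_bpe import learn_bpe    # learn BPE merge operations (note 9)
    from subword_nmt.apply_bpe import BPE          # apply the learned merges (note 9)

    pt_tokenizer = MosesTokenizer(lang="pt")

    def tokenize_pt(line: str) -> str:
        return pt_tokenizer.tokenize(line.strip(), return_str=True)

    def tokenize_zh(line: str) -> str:
        return " ".join(jieba.lcut(line.strip()))

    def preprocess(src_path: str, out_path: str, codes_path: str, tokenize) -> None:
        # Tokenize the corpus, learn BPE merges on the tokenized text, then apply them.
        with open(src_path, encoding="utf-8") as src:
            tokenized = [tokenize(line) for line in src]
        with open(codes_path, "w", encoding="utf-8") as codes:
            learn_bpe(tokenized, codes, num_symbols=32000)
        with open(codes_path, encoding="utf-8") as codes:
            bpe = BPE(codes)
        with open(out_path, "w", encoding="utf-8") as out:
            for line in tokenized:
                out.write(bpe.process_line(line) + "\n")

    # Hypothetical usage on the two sides of a pt-zh parallel corpus:
    # preprocess("train.pt", "train.bpe.pt", "codes.pt", tokenize_pt)
    # preprocess("train.zh", "train.bpe.zh", "codes.zh", tokenize_zh)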

References

  1. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: Proceedings of the International Conference on Learning Representations (ICLR) (2015). arXiv preprint arXiv:1409.0473

  2. Bojar, O., et al.: Findings of the 2018 conference on machine translation (WMT18). In: Proceedings of the Third Conference on Machine Translation, Volume 2: Shared Task Papers, pp. 272–307 (2018)

  3. Chao, L.S., Wong, D.F., Ao, C.H., Leal, A.L.: UM-PCorpus: a large Portuguese-Chinese parallel corpus. In: Proceedings of the LREC 2018 Workshop “Belt & Road: Language Resources and Evaluation”, pp. 38–43 (2018)

  4. Johnson, M., et al.: Google’s multilingual neural machine translation system: enabling zero-shot translation. Trans. Assoc. Comput. Linguist. 5, 339–351 (2017)

  5. Junczys-Dowmunt, M., et al.: Marian: fast neural machine translation in C++. In: Proceedings of ACL 2018, System Demonstrations, pp. 116–121 (2018)

  6. Liu, S., Wang, L., Liu, C.H.: Chinese-Portuguese machine translation: a study on building parallel corpora from comparable texts. In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), pp. 1485–1492 (2018)

  7. Luong, M.T., Pham, H., Manning, C.D.: Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 1412–1421 (2015)

  8. Papineni, K., Roukos, S., Ward, T., Zhu, W.J.: BLEU: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, pp. 311–318 (2002)

  9. Sennrich, R., Haddow, B., Birch, A.: Neural machine translation of rare words with subword units. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 1715–1725 (2016)

  10. Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks. In: Neural Information Processing Systems, pp. 3104–3112 (2014)

  11. Tian, L., Wong, D.F., Chao, L.S., Quaresma, P., Oliveira, F., Yi, L.: UM-Corpus: a large English-Chinese parallel corpus for statistical machine translation. In: Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC 2014), pp. 1837–1842 (2014)

  12. Tiedemann, J.: Parallel data, tools and interfaces in OPUS. In: Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC 2012), pp. 2214–2218 (2012)

  13. Vaswani, A., et al.: Attention is all you need. In: Neural Information Processing Systems, pp. 5998–6008 (2017)

Acknowledgements

The research results presented here were supported by FCT (Foundation for Science and Technology of Portugal) and MOST (Ministry of Science and Technology of China), through the project Chinese-Portuguese Deep Machine Translation in eCommerce Domain (441.00 CHINA-BILATERAL), the PORTULAN CLARIN Infrastructure for the Science and Technology of Language, the National Infrastructure for Distributed Computing (INCD) of Portugal, and the ANI/3279/2016 grant. Deyi Xiong was supported by the National Natural Science Foundation of China (Grants No. 61622209 and 61861130364).

Author information

Corresponding author

Correspondence to Rodrigo Santos.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Santos, R., Silva, J., Branco, A., Xiong, D. (2019). The Direct Path May Not Be The Best: Portuguese-Chinese Neural Machine Translation. In: Moura Oliveira, P., Novais, P., Reis, L. (eds) Progress in Artificial Intelligence. EPIA 2019. Lecture Notes in Computer Science (LNAI), vol 11805. Springer, Cham. https://doi.org/10.1007/978-3-030-30244-3_62

  • DOI: https://doi.org/10.1007/978-3-030-30244-3_62

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30243-6

  • Online ISBN: 978-3-030-30244-3
