
A Novel Sentence-Level Agreement Architecture for Neural Machine Translation


Abstract:

In neural machine translation (NMT), there is a natural correspondence between source and target sentences. Traditional NMT methods, however, do not explicitly model translation agreement at the sentence level. In this article, we propose a comprehensive and novel sentence-level agreement architecture to alleviate this problem. It directly minimizes the difference between the sentence-level representations of the source side and the target side. First, we compare a variety of sentence representation strategies and propose a "Gated Sum" sentence representation that captures richer sentence-level semantic information. Then, going beyond a single-layer sentence-level agreement architecture, we further propose a multi-layer sentence agreement architecture that brings the source and target semantic spaces closer layer by layer. The proposed agreement module can be integrated into NMT as an additional training objective and can also be used to enhance the representation of source-side sentences. Experiments on the NIST Chinese-to-English and the WMT English-to-German translation tasks show that the proposed agreement architecture achieves significant improvements over state-of-the-art baselines, demonstrating the effectiveness and necessity of exploiting sentence-level agreement for NMT.
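To make the idea concrete, below is a minimal PyTorch sketch of a sentence-level agreement term, assuming one plausible reading of the abstract: each token state is gated before summation ("Gated Sum" pooling), and the distance between the pooled source and target sentence vectors is added to the usual translation loss. The names GatedSumPooling, sentence_agreement_loss, and the weight lam are illustrative assumptions, not taken from the paper, which may use a different gating formulation, distance measure, and multi-layer scheme.

```python
# Hypothetical sketch of a sentence-level agreement objective (PyTorch).
# Class/function names and the MSE distance are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedSumPooling(nn.Module):
    """Pool token states into one sentence vector via a learned gated sum.

    Each token state h_i receives a gate g_i = sigmoid(W h_i); the sentence
    vector is sum_i g_i * h_i. (One possible reading of "Gated Sum".)
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(d_model, d_model)

    def forward(self, states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq_len, d_model); mask: (batch, seq_len), 1 = real token
        gates = torch.sigmoid(self.gate(states))        # per-token, per-dimension gates
        gated = gates * states * mask.unsqueeze(-1)     # zero out padding positions
        return gated.sum(dim=1)                         # (batch, d_model)


def sentence_agreement_loss(src_sent: torch.Tensor,
                            tgt_sent: torch.Tensor) -> torch.Tensor:
    """Distance between source and target sentence vectors (here: MSE)."""
    return F.mse_loss(src_sent, tgt_sent)


if __name__ == "__main__":
    batch, src_len, tgt_len, d_model = 2, 7, 9, 512
    pool = GatedSumPooling(d_model)

    src_states = torch.randn(batch, src_len, d_model)   # encoder outputs
    tgt_states = torch.randn(batch, tgt_len, d_model)   # decoder outputs
    src_mask = torch.ones(batch, src_len)
    tgt_mask = torch.ones(batch, tgt_len)

    agree = sentence_agreement_loss(pool(src_states, src_mask),
                                    pool(tgt_states, tgt_mask))
    nmt_loss = torch.tensor(3.2)        # stand-in for the usual cross-entropy term
    lam = 0.1                           # assumed interpolation weight
    total = nmt_loss + lam * agree      # agreement as an auxiliary training objective
    print(float(total))
```

A multi-layer variant, as described in the abstract, would apply such an agreement term to the pooled states of several encoder/decoder layers rather than only the top layer; the per-layer weighting is not specified here and would be a design choice.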
Page(s): 2585 - 2597
Date of Publication: 02 September 2020
