ISCA Archive Interspeech 2021

Domain-Aware Self-Attention for Multi-Domain Neural Machine Translation

Shiqi Zhang, Yan Liu, Deyi Xiong, Pei Zhang, Boxing Chen

In this paper, we investigate multi-domain neural machine translation (NMT) that translates sentences of different domains in a single model. To this end, we propose a domain-aware self-attention mechanism that jointly learns domain representations with the single NMT model. The learned domain representations are integrated into both the encoder and decoder. We further propose two different domain representation learning approaches: 1) word-level unsupervised learning via a domain attention network and 2) guided learning with an auxiliary loss. The two learning approaches allow our multi-domain NMT model to work in settings where domain information is either available or unavailable. Experiments on both Chinese-English and English-French translation demonstrate that our multi-domain model outperforms both a strong Transformer baseline and previous multi-domain NMT approaches. Further analyses show that our model is able to learn domain clusters even without prior knowledge about the domain structure.
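
The abstract does not spell out the exact formulation, but the idea can be sketched as follows: a small domain attention network scores the sentence against a set of learnable domain embeddings (the word-level unsupervised route), an auxiliary cross-entropy loss guides those scores when gold domain labels are available (the guided route), and the resulting domain vector is injected into self-attention. The PyTorch sketch below is illustrative only; names such as DomainAwareSelfAttention and num_domains, and the additive-bias integration into attention, are assumptions rather than the paper's actual implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DomainAwareSelfAttention(nn.Module):
        """Illustrative sketch: self-attention conditioned on a jointly learned
        domain representation. A domain attention network produces a soft mixture
        of learnable domain embeddings; an optional auxiliary loss supervises it
        when gold domain labels exist."""

        def __init__(self, d_model: int, n_heads: int, num_domains: int):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.domain_emb = nn.Embedding(num_domains, d_model)    # learnable domain representations
            self.domain_scorer = nn.Linear(d_model, num_domains)    # domain attention network

        def forward(self, x, domain_labels=None):
            # x: (batch, seq_len, d_model)
            logits = self.domain_scorer(x.mean(dim=1))               # sentence-level domain scores
            probs = F.softmax(logits, dim=-1)
            domain_vec = probs @ self.domain_emb.weight               # soft mixture of domain embeddings

            # Guided learning: auxiliary cross-entropy loss against gold domain labels, if given.
            aux_loss = None
            if domain_labels is not None:
                aux_loss = F.cross_entropy(logits, domain_labels)

            # Assumed integration: add the domain vector as a bias before self-attention.
            x_dom = x + domain_vec.unsqueeze(1)
            out, _ = self.attn(x_dom, x_dom, x_dom)
            return out, aux_loss

    # Minimal usage example with hypothetical sizes.
    layer = DomainAwareSelfAttention(d_model=512, n_heads=8, num_domains=4)
    x = torch.randn(2, 10, 512)
    out, aux = layer(x, domain_labels=torch.tensor([0, 2]))

In the unsupervised setting one would simply call the layer without domain_labels, letting the domain attention network form clusters on its own, which is consistent with the abstract's observation that domain clusters emerge without prior knowledge of the domain structure.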


doi: 10.21437/Interspeech.2021-1477

Cite as: Zhang, S., Liu, Y., Xiong, D., Zhang, P., Chen, B. (2021) Domain-Aware Self-Attention for Multi-Domain Neural Machine Translation. Proc. Interspeech 2021, 2047-2051, doi: 10.21437/Interspeech.2021-1477

@inproceedings{zhang21n_interspeech,
  author={Shiqi Zhang and Yan Liu and Deyi Xiong and Pei Zhang and Boxing Chen},
  title={{Domain-Aware Self-Attention for Multi-Domain Neural Machine Translation}},
  year={2021},
  booktitle={Proc. Interspeech 2021},
  pages={2047--2051},
  doi={10.21437/Interspeech.2021-1477}
}