ISCA Archive Interspeech 2021

Improving Multilingual Transformer Transducer Models by Reducing Language Confusions

Eric Sun, Jinyu Li, Zhong Meng, Yu Wu, Jian Xue, Shujie Liu, Yifan Gong

In end-to-end multilingual speech recognition, the hypotheses in one language can include word tokens from other languages. Such language confusions occur even more frequently when no language identifier (LID) is available during inference. In this paper, we explore reducing language confusions without using LID at inference time by building models with multiple output heads and using sequence probability to select the head that provides the output hypothesis. We propose head grouping, which merges several language outputs into one head to save runtime cost. Head groups are decided by the distances among language clusters learned through language embedding vectors, so that confusable languages are kept apart. We further propose prediction network sharing for languages from the same family. By jointly applying head grouping and prediction network sharing, training data is better shared among languages of the same family while confusable languages are assigned to different heads to reduce language confusions. Our experiments on 10 European languages demonstrate that our multilingual transformer transducer models with multi-head outputs achieve on average 7.8% and 10.9% relative word error rate reductions over the one-head baseline model, without using LID in inference and with an affordable increase in runtime cost.
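The head-selection step can be pictured with a minimal Python sketch. Everything below (function name, head names, sample scores) is an illustrative assumption, not the authors' implementation: each output head decodes its own hypothesis together with a sequence log-probability, and the head whose hypothesis scores highest supplies the final output.

import math

def select_hypothesis(head_hypotheses):
    """Pick the (head, hypothesis) pair with the highest sequence
    log-probability across all output heads.

    head_hypotheses: dict mapping head name -> (token list, log-probability)
    """
    best_head = max(head_hypotheses, key=lambda h: head_hypotheses[h][1])
    return best_head, head_hypotheses[best_head][0]

# Hypothetical example with two head groups, e.g. one per language cluster.
hypotheses = {
    "germanic_head": (["hello", "world"], math.log(0.62)),
    "romance_head": (["alo", "word"], math.log(0.21)),
}
head, tokens = select_hypothesis(hypotheses)
print(head, tokens)  # germanic_head ['hello', 'world']

Since no LID is used, selection relies only on these sequence probabilities; head grouping then keeps the number of heads, and hence the decoding cost, affordable.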


doi: 10.21437/Interspeech.2021-1949

Cite as: Sun, E., Li, J., Meng, Z., Wu, Y., Xue, J., Liu, S., Gong, Y. (2021) Improving Multilingual Transformer Transducer Models by Reducing Language Confusions. Proc. Interspeech 2021, 3470-3474, doi: 10.21437/Interspeech.2021-1949

@inproceedings{sun21c_interspeech,
  author={Eric Sun and Jinyu Li and Zhong Meng and Yu Wu and Jian Xue and Shujie Liu and Yifan Gong},
  title={{Improving Multilingual Transformer Transducer Models by Reducing Language Confusions}},
  year={2021},
  booktitle={Proc. Interspeech 2021},
  pages={3470--3474},
  doi={10.21437/Interspeech.2021-1949}
}