ISCA Archive Interspeech 2021

Conformer Parrotron: A Faster and Stronger End-to-End Speech Conversion and Recognition Model for Atypical Speech

Zhehuai Chen, Bhuvana Ramabhadran, Fadi Biadsy, Xia Zhang, Youzheng Chen, Liyang Jiang, Fang Chu, Rohan Doshi, Pedro J. Moreno

Parrotron is an end-to-end personalizable model that enables many-to-one voice conversion (VC) and automatic speech recognition (ASR) simultaneously for atypical speech. In this work, we present the next-generation Parrotron model with improvements in overall accuracy, training speed, and inference speed. The proposed architecture builds on the recent Conformer encoder, comprising convolution- and attention-based blocks, used in ASR. We introduce architectural modifications that subsample encoder activations to achieve speed-ups in training and inference. We show that jointly improving ASR and voice conversion quality requires a corresponding upsampling after the Conformer encoder blocks. We provide an in-depth analysis of how the proposed approach maximizes the efficiency of a speech-to-speech conversion model in the context of atypical speech. Experiments on both many-to-one and one-to-one dysarthric speech conversion tasks show that we can achieve up to a 7× speedup and a 35% relative reduction in WER over the previous best Transformer Parrotron.
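To make the subsample-then-upsample idea concrete, below is a minimal PyTorch sketch of the shape bookkeeping around the encoder stack. This is not the paper's implementation: the hidden size, the 4× rate, the strided/transposed convolutions, and the stand-in attention block are all illustrative assumptions (the actual model uses Conformer blocks between the two resampling steps).

import torch
import torch.nn as nn

class SubsampledEncoder(nn.Module):
    """Hypothetical sketch: subsample activations before the encoder
    blocks to shorten the sequence seen by self-attention, then
    upsample after them so the spectrogram decoder receives a
    full-frame-rate sequence again."""

    def __init__(self, dim=256, rate=4):
        super().__init__()
        # Strided convolution subsamples the time axis by `rate`.
        self.subsample = nn.Conv1d(dim, dim, kernel_size=rate, stride=rate)
        # Stand-in for the Conformer blocks (convolution + attention).
        self.blocks = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, batch_first=True)
        # Transposed convolution restores the original frame rate.
        self.upsample = nn.ConvTranspose1d(
            dim, dim, kernel_size=rate, stride=rate)

    def forward(self, x):  # x: (batch, time, dim)
        h = self.subsample(x.transpose(1, 2)).transpose(1, 2)
        h = self.blocks(h)  # attention runs on the shortened sequence
        return self.upsample(h.transpose(1, 2)).transpose(1, 2)

enc = SubsampledEncoder()
print(enc(torch.randn(2, 80, 256)).shape)  # torch.Size([2, 80, 256])

The design intuition: self-attention cost grows quadratically with sequence length, so subsampling by a factor r cuts attention compute by roughly r², while the upsampling step restores the frame rate that the speech-synthesis decoder expects.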


doi: 10.21437/Interspeech.2021-676

Cite as: Chen, Z., Ramabhadran, B., Biadsy, F., Zhang, X., Chen, Y., Jiang, L., Chu, F., Doshi, R., Moreno, P.J. (2021) Conformer Parrotron: A Faster and Stronger End-to-End Speech Conversion and Recognition Model for Atypical Speech. Proc. Interspeech 2021, 4828-4832, doi: 10.21437/Interspeech.2021-676

@inproceedings{chen21w_interspeech,
  author={Zhehuai Chen and Bhuvana Ramabhadran and Fadi Biadsy and Xia Zhang and Youzheng Chen and Liyang Jiang and Fang Chu and Rohan Doshi and Pedro J. Moreno},
  title={{Conformer Parrotron: A Faster and Stronger End-to-End Speech Conversion and Recognition Model for Atypical Speech}},
  year={2021},
  booktitle={Proc. Interspeech 2021},
  pages={4828--4832},
  doi={10.21437/Interspeech.2021-676}
}