ISCA Archive Interspeech 2022

A user-friendly headset for radar-based silent speech recognition

Pouriya Amini Digehsara, João Vítor Possamai de Menezes, Christoph Wagner, Michael Bärhold, Petr Schaffer, Dirk Plettemeier, Peter Birkholz

Silent speech interfaces allow speech communication to take place in the absence of the acoustic speech signal. Radar-based sensing with radio antennas on the speaker's face can be used as a non-invasive modality to measure speech articulation in such applications. One of the major challenges with this approach is the variability between sessions, mainly due to the repositioning of the antennas on the speaker's face. To reduce the impact of this influencing factor, we developed a wearable headset that can be 3D-printed with flexible materials and weighs only about 69 g. For evaluation, a radar-based word recognition experiment was performed in which five speakers recorded a speech corpus in multiple sessions, alternately with the headset and with double-sided tape used to place the antennas on the face. Using a bidirectional long short-term memory network for classification, average intersession word accuracies of 76.50% and 68.18% were obtained with the headset and the tape, respectively. This indicates that the antenna (re-)positioning accuracy with the headset is not worse than that with the double-sided tape, while the headset provides other practical benefits.
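
The paper does not spell out the classifier configuration in this abstract, so the following is only a minimal sketch of a bidirectional LSTM word classifier of the kind described, written in PyTorch. The feature dimensionality (50 radar features per frame), hidden size, and vocabulary size (25 words) are placeholder assumptions, not values from the paper.

# Minimal sketch (not the authors' implementation): a BiLSTM classifier
# for radar feature sequences. All dimensions below are assumed.
import torch
import torch.nn as nn

class BiLSTMWordClassifier(nn.Module):
    def __init__(self, n_features=50, hidden_size=128, n_words=25):
        super().__init__()
        # Bidirectional LSTM reads the frame sequence forwards and backwards.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True, bidirectional=True)
        # Linear layer maps the concatenated final hidden states to word scores.
        self.fc = nn.Linear(2 * hidden_size, n_words)

    def forward(self, x):
        # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)
        # h_n: (2, batch, hidden_size); concatenate forward and backward states.
        h = torch.cat([h_n[0], h_n[1]], dim=1)
        return self.fc(h)  # unnormalized word logits

# Usage example: classify a batch of 8 utterances, each 100 frames long.
model = BiLSTMWordClassifier()
logits = model(torch.randn(8, 100, 50))
predicted_words = logits.argmax(dim=1)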


doi: 10.21437/Interspeech.2022-10090

Cite as: Amini Digehsara, P., Possamai de Menezes, J.V., Wagner, C., Bärhold, M., Schaffer, P., Plettemeier, D., Birkholz, P. (2022) A user-friendly headset for radar-based silent speech recognition. Proc. Interspeech 2022, 4835-4839, doi: 10.21437/Interspeech.2022-10090

@inproceedings{aminidigehsara22_interspeech,
  author={Pouriya {Amini Digehsara} and João Vítor {Possamai de Menezes} and Christoph Wagner and Michael Bärhold and Petr Schaffer and Dirk Plettemeier and Peter Birkholz},
  title={{A user-friendly headset for radar-based silent speech recognition}},
  year=2022,
  booktitle={Proc. Interspeech 2022},
  pages={4835--4839},
  doi={10.21437/Interspeech.2022-10090}
}