Knowledge Distillation Based Training of Universal ASR Source Models for Cross-Lingual Transfer.

Takashi Fukuda, Samuel Thomas
Published in: Interspeech (2021)