Adaptive Knowledge Distillation between Text and Speech Pre-trained Models

Jinjie Ni, Yukun Ma, Wen Wang, Qian Chen, Dianwen Ng, Han Lei, Trung Hieu Nguyen, Chong Zhang, Bin Ma, Erik Cambria
Published in: CoRR (2023)
Keyphrases
  • pre-trained
  • text-to-speech
  • real time
  • neural network
  • prior knowledge
  • text mining
  • human body
  • wide range
  • pairwise
  • control signals