Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation.

Kangwook Jang, Sungnyun Kim, Se-Young Yun, Hoirin Kim
Published in: CoRR (2023)
Keyphrases
  • non-stationary
  • autoregressive
  • semi-supervised learning
  • machine learning
  • data compression
  • learning algorithm
  • expert systems
  • multiresolution
  • prior knowledge
  • fault diagnosis