Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation
Kangwook Jang
Sungnyun Kim
Se-Young Yun
Hoirin Kim
Published in:
CoRR (2023)
Keyphrases
non-stationary
autoregressive
semi-supervised learning
machine learning
data compression
learning algorithm
expert systems
multiresolution
prior knowledge
fault diagnosis