Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation
Kangwook Jang
Sungnyun Kim
Se-Young Yun
Hoirin Kim
Published in: INTERSPEECH (2023)
Keyphrases
complex systems
probabilistic model
fuzzy logic
image compression
semi supervised learning
speech recognition
statistical models
data compression
data sets
decision making
metadata
prior knowledge