KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation.

Marzieh S. Tahaei, Ella Charlaix, Vahid Partovi Nia, Ali Ghodsi, Mehdi Rezagholizadeh
Published in: NAACL-HLT (2022)