Less is More: Task-aware Layer-wise Distillation for Language Model Compression

Chen Liang, Simiao Zuo, Qingru Zhang, Pengcheng He, Weizhu Chen, Tuo Zhao
Published in: CoRR (2022)