Less is More: Task-aware Layer-wise Distillation for Language Model Compression.

Chen Liang, Simiao Zuo, Qingru Zhang, Pengcheng He, Weizhu Chen, Tuo Zhao
Published in: CoRR (2022)