Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability.

Ruifei He, Shuyang Sun, Jihan Yang, Song Bai, Xiaojuan Qi
Published in: CVPR (2022)
Keyphrases