Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.

Linfeng Li, Weixing Su, Fang Liu, Maowei He, Xiaodan Liang
Published in: Neural Process. Lett. (2023)
Keyphrases