Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism.

Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, Qianli Feng, Yinglong Ma
Published in: CoRR (2024)