Attention Distillation: self-supervised vision transformer students need more guidance.

Kai Wang, Fei Yang, Joost van de Weijer
Published in: CoRR (2022)