DETRDistill: A Universal Knowledge Distillation Framework for DETR-families.

Jiahao Chang, Shuo Wang, Guangkai Xu, Zehui Chen, Chenhongyi Yang, Feng Zhao
Published in: CoRR (2022)