DETRDistill: A Universal Knowledge Distillation Framework for DETR-families.

Jiahao Chang, Shuo Wang, Hai-Ming Xu, Zehui Chen, Chenhongyi Yang, Feng Zhao
Published in: ICCV (2023)