DETRDistill: A Universal Knowledge Distillation Framework for DETR-families.
Jiahao Chang, Shuo Wang, Guangkai Xu, Zehui Chen, Chenhongyi Yang, Feng Zhao. Published in: CoRR (2022)
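The entry above only names the technique. For orientation, a minimal sketch of the classic temperature-softened logit-distillation loss (Hinton-style knowledge distillation) is given below; it is an illustrative assumption, not the DETRDistill method, which this entry does not describe, and the temperature and weighting shown are placeholder choices.

```python
# Generic teacher-student logit distillation sketch (PyTorch).
# NOT the DETRDistill formulation; purely illustrative.
import torch
import torch.nn.functional as F

def logit_distillation_loss(student_logits: torch.Tensor,
                            teacher_logits: torch.Tensor,
                            temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitude stays comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Usage (hypothetical training step): combine with the ordinary task loss, e.g.
#   loss = task_loss + alpha * logit_distillation_loss(student_logits, teacher_logits)
```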