Knowledge Distillation for Low-Power Object Detection: A Simple Technique and Its Extensions for Training Compact Models Using Unlabeled Data.
Amin Banitalebi-Dehkordi
Published in: ICCVW (2021)
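As the title indicates, the paper centers on knowledge distillation: a compact student model is trained to mimic a larger teacher's predictions on unlabeled images, so labeled data is not required for that part of training. The sketch below illustrates only this general idea; it assumes PyTorch, classification-style logits, and hypothetical `student`/`teacher` models and a `distillation_step` helper, and is not the paper's exact object-detection pipeline.

```python
# Minimal, generic knowledge-distillation sketch (assumed PyTorch; not the
# paper's exact method). A large teacher produces soft predictions on
# unlabeled images, and a compact student is trained to match them.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, unlabeled_images, optimizer, temperature=4.0):
    """One training step of soft-label distillation on an unlabeled batch."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(unlabeled_images)  # soft targets, no ground-truth labels

    student_logits = student(unlabeled_images)

    # KL divergence between temperature-softened teacher and student distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this hedged form, the temperature softens both distributions so the student also learns the teacher's relative confidences across classes, which is the standard motivation for distillation with unlabeled data.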
Keyphrases
- low power
- unlabeled data
- labeled data
- semi-supervised learning
- supervised learning
- object detection
- semi-supervised
- prior knowledge
- high speed
- low cost
- labeled training data
- power consumption
- training set
- active learning
- training examples
- co-training
- training data
- labeled data for training
- supervised learning algorithms
- learning algorithm
- semi-supervised classification
- text categorization
- labeled and unlabeled data
- text classification
- knowledge discovery
- probabilistic model
- background knowledge
- labeled examples
- machine learning
- learning models
- decision trees
- domain adaptation
- labeled instances
- data mining
- number of labeled examples
- labeling effort
- unsupervised learning
- similarity measure
- human experts
- high-dimensional
- training samples
- model selection