Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-Tailed Classification.
Liuyu Xiang, Guiguang Ding, Jungong Han. Published in: ECCV (5) (2020)
Keyphrases
- prior knowledge
- supervised learning
- learning systems
- incremental learning
- unsupervised learning
- online learning
- human experts
- knowledge acquisition
- machine learning
- learning algorithm
- learning process
- learning tasks
- knowledge transfer
- knowledge discovery
- classification accuracy
- feature vectors
- support vector
- multiple tasks
- combining multiple
- knowledge level
- subject matter
- neural network
- multi category
- domain experts
- class labels
- model selection
- pattern recognition
- background knowledge
- knowledge management
- support vector machine
- domain knowledge
- organizational learning
- expert systems
- learning phase
- discriminative learning
- effective learning
- reinforcement learning
- knowledge base