Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification
Liuyu Xiang, Guiguang Ding. Published in: CoRR (2020)
Keyphrases
- knowledge acquisition
- learning systems
- learning algorithm
- learning process
- prior knowledge
- supervised learning
- domain knowledge
- effective learning
- human experts
- pattern recognition
- background knowledge
- learning phase
- online learning
- knowledge base
- incremental learning
- classification algorithm
- machine learning
- text classification
- multiple tasks
- subject matter
- discriminative learning
- combining multiple
- neural network
- prior domain knowledge
- classification method
- domain experts
- support vector machine (SVM)
- active learning
- training set
- learning scenarios
- image classification
- procedural knowledge
- feature space