Contrastive Distillation Is a Sample-Efficient Self-Supervised Loss Policy for Transfer Learning
Chris Lengerich, Gabriel Synnaeve, Amy Zhang, Hugh Leather, Kurt Shuster, François Charton, Charysse Redwood
Published in: CoRR (2022)
Keyphrases
- transfer learning
- knowledge transfer
- labeled data
- learning tasks
- text classification
- machine learning
- reinforcement learning
- collaborative filtering
- active learning
- cross-domain
- structure learning
- domain adaptation
- multi-task
- multi-task learning
- transfer knowledge
- manifold alignment
- prediction with expert advice
- semi-supervised learning
- machine learning algorithms
- nearest neighbor
- previously learned