Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation for Pretrained Models.
Linqing Liu
Huan Wang
Jimmy Lin
Richard Socher
Caiming Xiong
Published in: CoRR (2019)
Keyphrases
multi task
prior knowledge
learning process
probabilistic model
multi task learning
feature selection
knowledge discovery
learning styles
classification models
collaborative filtering
unsupervised learning
gaussian processes
learned models
multiple tasks
sparse learning