Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression
Xinge Ma, Jin Wang, Liang-Chih Yu, Xuejie Zhang. Published in: COLING (2022)
Keyphrases
- language model
- meta learning
- meta knowledge
- language modeling
- probabilistic model
- learning tasks
- n-gram
- domain knowledge
- inductive learning
- retrieval model
- information retrieval
- knowledge acquisition
- model selection
- ad hoc information retrieval
- knowledge discovery
- knowledge base
- query expansion
- knowledge representation
- decision trees
- machine learning algorithms
- prediction accuracy
- mixture model
- support vector
- translation model
- machine learning
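The title references Reptile meta-learning (Nichol et al., 2018). As context for the record above, here is a minimal sketch of the Reptile outer-loop update on a toy quadratic task; the task, learning rates, and function names are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def sgd_steps(params, grad_fn, lr=0.01, k=5):
    """Run k plain SGD steps on one sampled task; grad_fn returns the gradient."""
    p = params.copy()
    for _ in range(k):
        p -= lr * grad_fn(p)
    return p

def reptile_update(params, grad_fn, meta_lr=1.0, k=5):
    """One Reptile meta-step: theta <- theta + meta_lr * (theta_k - theta),
    where theta_k is the result of k inner SGD steps on the task."""
    adapted = sgd_steps(params, grad_fn, k=k)
    return params + meta_lr * (adapted - params)

# Toy task (illustrative): minimize ||p - target||^2, gradient = 2 * (p - target).
target = np.array([1.0, -2.0])
grad_fn = lambda p: 2.0 * (p - target)

theta = np.zeros(2)
for _ in range(100):
    theta = reptile_update(theta, grad_fn)
```

The key point is that Reptile needs no second-order gradients: the meta-update is just a weighted move of the initialization toward task-adapted weights, which is what makes it cheap enough to combine with distillation-style training.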