Dynamic Knowledge Distillation for Pre-trained Language Models
Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun. Published in: CoRR (2021)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- probabilistic model
- speech recognition
- document retrieval
- test collection
- context sensitive
- statistical language models
- language modelling
- smoothing methods
- query expansion
- retrieval model
- language models for information retrieval
- document ranking
- information retrieval
- relevance model
- prior knowledge
- viewpoint
- computer vision
- data mining