Dynamic Knowledge Distillation for Pre-trained Language Models.
Lei Li
Yankai Lin
Shuhuai Ren
Peng Li
Jie Zhou
Xu Sun
Published in: CoRR (2021)
Keyphrases
language model
pre-trained
language modeling
n-gram
probabilistic model
speech recognition
document retrieval
test collection
context-sensitive
statistical language models
language modelling
smoothing methods
query expansion
retrieval model
language models for information retrieval
document ranking
information retrieval
relevance model
prior knowledge
viewpoint
computer vision
data mining