Dynamic Knowledge Distillation for Pre-trained Language Models.
Lei Li
Yankai Lin
Shuhuai Ren
Peng Li
Jie Zhou
Xu Sun
Published in: EMNLP (1) (2021)
Keyphrases
language model
pre-trained
language modeling
document retrieval
speech recognition
probabilistic model
information retrieval
statistical language models
n-gram
retrieval model
prior knowledge
context sensitive
test collection
relevance model
language modelling
query expansion
text mining
data sets