ReAugKD: Retrieval-Augmented Knowledge Distillation For Pre-trained Language Models.
Jianyi Zhang, Aashiq Muhamed, Aditya Anantharaman, Guoyin Wang, Changyou Chen, Kai Zhong, Qingjun Cui, Yi Xu, Belinda Zeng, Trishul Chilimbi, Yiran Chen. Published in: ACL (2) (2023)
Keyphrases
- language model
- retrieval model
- document retrieval
- test collection
- query expansion
- language modeling
- language models for information retrieval
- information retrieval
- ad hoc information retrieval
- pre-trained
- query terms
- smoothing methods
- statistical language models
- language modelling
- probabilistic model
- relevance model
- n-gram
- document ranking
- speech recognition
- document length
- query specific
- term dependencies
- ad hoc retrieval
- passage retrieval
- vector space model
- retrieval effectiveness
- term frequency
- tf-idf
- image database
- neural network