Arbitrary Few Parameters are Good Enough for Adapting Large-scale Pre-trained Language Models
Yusheng Su, Chi-Min Chan, Jiali Cheng, Yujia Qin, Yankai Lin, Shengding Hu, Zonghan Yang, Ning Ding, Zhiyuan Liu, Maosong Sun. Published in: CoRR (2023)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- query expansion
- document retrieval
- statistical language models
- probabilistic model
- speech recognition
- retrieval model
- information retrieval
- test collection
- smoothing methods
- maximum likelihood
- relevance model
- document ranking
- statistical language modeling
- multi-modal
- pairwise
- face recognition