PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models
Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan
Published in: ACL (Findings) (2023)
Keyphrases
- language model
- pre-trained
- language modeling
- training data
- document retrieval
- probabilistic model
- training examples
- n-gram
- query expansion
- speech recognition
- retrieval model
- statistical language models
- control signals
- information retrieval
- language models for information retrieval
- test collection
- smoothing methods
- data sets
- relevance model
- decision trees