PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models
Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan. Published in: CoRR (2023)
Keyphrases
- labeled data
- language model
- pre-trained
- training data
- training examples
- semi-supervised learning
- language modeling
- speech recognition
- information retrieval
- document retrieval
- probabilistic model
- test collection
- retrieval model
- query expansion
- n-gram
- language models for information retrieval
- document ranking
- statistical language models
- data sets
- computer vision
- control signals
- bayesian networks
- machine learning
- neural network