PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models.

Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan
Published in: ACL (Findings) (2023)