Knowledge Distillation of Transformer-based Language Models Revisited.

Chengqiang Lu, Jianwei Zhang, Yunfei Chu, Zhengyu Chen, Jingren Zhou, Fei Wu, Haiqing Chen, Hongxia Yang
Published in: CoRR (2022)