A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models

Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, Alexander Min
Published in: CoRR (2023)