bert2BERT: Towards Reusable Pretrained Language Models

Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, Qun Liu
Published in: ACL (1) (2022)