What Language Model Architecture and Pretraining Objective Work Best for Zero-Shot Generalization?

Thomas Wang, Adam Roberts, Daniel Hesslow, Teven Le Scao, Hyung Won Chung, Iz Beltagy, Julien Launay, Colin Raffel
Published in: CoRR (2022)