Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All.
Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker, Reut Tsarfaty. Published in: CoRR (2022)