Autoregressive Knowledge Distillation through Imitation Learning.

Alexander Lin, Jeremy Wohlwend, Howard Chen, Tao Lei
Published in: EMNLP (1) (2020)