Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation.

Mitchell A. Gordon, Kevin Duh
Published in: NGT@ACL (2020)