Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation
Mitchell A. Gordon, Kevin Duh
Published in: NGT@ACL (2020)
Keyphrases
- machine translation
- domain models
- domain knowledge
- natural language processing
- information extraction
- cross lingual
- target language
- cross language information retrieval
- machine translation system
- natural language
- description language
- word alignment
- statistical machine translation
- process model
- supervised learning
- query translation
- knowledge structures
- reinforcement learning