Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation
Mitchell A. Gordon, Kevin Duh. Published in: CoRR (2020)
Keyphrases
- machine translation
- domain models
- natural language processing
- information extraction
- domain knowledge
- cross lingual
- target language
- machine translation system
- statistical machine translation
- word alignment
- description language
- cross language information retrieval
- knowledge structures
- query translation
- natural language
- natural language text
- process model
- training set
- knowledge acquisition
- text mining
- supervised learning
- source language
- information retrieval