Combining Sequence Distillation and Transfer Learning for Efficient Low-Resource Neural Machine Translation Models
Raj Dabre, Atsushi Fujita. Published in: WMT@EMNLP (2020)
Keyphrases
- machine translation
- transfer learning
- cross-lingual
- information extraction
- learning tasks
- target language
- natural language processing
- knowledge transfer
- cross-domain
- cross-language information retrieval
- semi-supervised learning
- statistical translation models
- machine learning
- active learning
- natural language
- finite-state transducers
- machine translation system
- reinforcement learning
- collaborative filtering
- labeled data
- model selection
- probabilistic model
- context-sensitive
- knowledge representation
- parameter estimation
- Bayesian networks
- learning algorithm
- domain adaptation
- information retrieval
- statistical machine translation
- neural network