Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models?
Lyan Verwimp, Jerome R. Bellegarda · Published in: INTERSPEECH (2019)
Keyphrases
- transfer learning
- language model
- n-gram
- transfer knowledge
- language modeling
- text classification
- translation model
- multi-task
- knowledge transfer
- speech recognition
- active learning
- text mining
- probabilistic model
- neural network
- query expansion
- collaborative filtering
- natural language processing
- retrieval model
- statistical language modeling
- transferring knowledge
- information retrieval
- cross-domain
- domain adaptation
- reinforcement learning
- machine learning
- labeled data
- text categorization
- co-occurrence
- semi supervised learning
- structure learning
- cross-lingual
- test collection
- relevance model
- language models for information retrieval
- training data
- machine translation
- information extraction
- learning algorithm
- target domain
- part-of-speech
- vector space
- data analysis
- smoothing methods
- data sets
- error rate
- unlabeled data