Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models?
Lyan Verwimp, Jerome R. Bellegarda
Published in: CoRR (2019)
Keyphrases
- transfer learning
- language model
- n-gram
- transfer knowledge
- language modeling
- translation model
- text classification
- knowledge transfer
- multi-task
- labeled data
- cross-domain
- machine learning
- speech recognition
- text mining
- neural network
- transferring knowledge
- statistical language modeling
- information retrieval
- semi-supervised learning
- structure learning
- reinforcement learning
- machine learning algorithms
- part-of-speech
- active learning
- probabilistic model
- learning algorithm
- collaborative filtering
- query expansion
- test collection
- retrieval model
- text categorization
- smoothing methods
- cross-lingual
- training set
- unlabeled data
- previously learned
- vector space
- machine translation
- natural language processing
- training data
- language models for information retrieval
- spoken term detection
- high-dimensional
- target domain
- semi-supervised
- co-occurrence
- low-dimensional
- question answering
- data sets