Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization
Yangyang Shi, Mei-Yuh Hwang, Xin Lei, Haoyu Sheng. Published in: CoRR (2019)
Keyphrases
- language modeling
- recurrent neural networks
- language model
- retrieval model
- query expansion
- knowledge base
- information retrieval
- cross lingual
- information retrieval systems
- probabilistic model
- recurrent networks
- knowledge discovery
- feed forward
- artificial neural networks
- neural network
- complex valued
- sentence retrieval
- reservoir computing
- learning process
- statistical language models