Knowledge Distillation for Recurrent Neural Network Language Modeling with Trust Regularization
Yangyang Shi, Mei-Yuh Hwang, Xin Lei, Haoyu Sheng. Published in: ICASSP (2019)
Keyphrases
- language modeling
- recurrent neural networks
- language model
- information retrieval
- query expansion
- retrieval model
- neural network
- feed-forward
- recurrent networks
- probabilistic model
- n-gram
- expert systems
- hidden layer
- word segmentation
- complex valued
- knowledge base
- statistical language models
- knowledge discovery
- sentence retrieval
- echo state networks
- improvements in retrieval effectiveness