Prediction of LSTM-RNN Full Context States as a Subtask for N-Gram Feedforward Language Models.
Kazuki Irie, Zhihong Lei, Ralf Schlüter, Hermann Ney. Published in: ICASSP (2018)
Keyphrases
- language model
- n gram
- recurrent neural networks
- feed forward
- context sensitive
- language modeling
- back propagation
- language independent
- artificial neural networks
- probabilistic model
- speech recognition
- neural network
- document retrieval
- information retrieval
- query expansion
- bag of words
- retrieval model
- part of speech
- prediction accuracy
- statistical language modeling
- smoothing methods
- pseudo relevance feedback
- query terms
- vector space model
- genetic algorithm
- web documents
- bayesian networks
- cross lingual
- artificial intelligence