Exploiting the succeeding words in recurrent neural network language models.
Yangyang Shi, Martha A. Larson, Pascal Wiggers, Catholijn M. Jonker
Published in: INTERSPEECH (2013)
Keyphrases
- language model
- recurrent neural networks
- n-gram
- language modeling
- out of vocabulary
- document representation
- multiword
- statistical language modeling
- translation model
- document retrieval
- neural network
- probabilistic model
- retrieval model
- complex valued
- document level
- language modelling
- feed-forward
- query expansion
- information retrieval
- echo state networks
- recurrent networks
- speech recognition
- test collection
- artificial neural networks
- word error rate
- reservoir computing
- word segmentation
- smoothing methods
- context sensitive
- ad hoc information retrieval
- document ranking
- pseudo relevance feedback
- statistical language models
- language models for information retrieval
- word recognition
- relevance model
- backpropagation
- neural network structure
- fuzzy logic
- genetic algorithm