Latent words recurrent neural network language models.
Ryo Masumura, Taichi Asami, Takanobu Oba, Hirokazu Masataki, Sumitaka Sakauchi, Akinori Ito
Published in: INTERSPEECH (2015)
Keyphrases
- language model
- recurrent neural networks
- n-gram
- language modeling
- out-of-vocabulary
- document representation
- multiword
- probabilistic model
- statistical language modeling
- recurrent networks
- latent variables
- translation model
- document retrieval
- feed-forward
- neural network
- information retrieval
- speech recognition
- language modelling
- complex-valued
- reservoir computing
- bag-of-words
- dependency structure
- retrieval model
- word error rate
- artificial neural networks
- statistical language models
- smoothing methods
- echo state networks
- ad hoc information retrieval
- query expansion
- document level
- test collection
- word segmentation
- language models for information retrieval
- context-sensitive
- pseudo relevance feedback
- relevance model
- vector space model
- query terms
- cross-lingual
- text documents
- search engine
- collaborative filtering
- keywords
- Bayesian networks
- feature selection
- learning algorithm
- machine learning