Feed forward pre-training for recurrent neural network language models.
Siva Reddy Gangireddy, Fergus McInnes, Steve Renals. Published in: INTERSPEECH (2014)
Keyphrases
- feed forward
- language model
- recurrent neural networks
- recurrent networks
- feedforward neural networks
- feed forward neural networks
- echo state networks
- language modeling
- back propagation
- error back propagation
- artificial neural networks
- hidden layer
- neural network
- probabilistic model
- n gram
- speech recognition
- training algorithm
- statistical language models
- information retrieval
- language modelling
- neural nets
- retrieval model
- document retrieval
- activation function
- query expansion
- document ranking
- test collection
- complex valued
- reservoir computing
- language models for information retrieval
- neural network structure
- neural model
- relevance model
- hebbian learning
- context sensitive
- pseudo relevance feedback
- search engine
- fuzzy logic
- translation model
- smoothing methods
- multilayer perceptron
- multi layer perceptron
- real time
- training process
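The title names a concrete technique: pre-training a recurrent neural network language model with a feed-forward model before recurrent training. The following is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the model classes, layer sizes, and the choice to transfer only the word embeddings and output layer (whose shapes match between the two models) are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): train a feed-forward n-gram
# style LM, then reuse its shape-compatible weights to initialise a
# recurrent LM before fine-tuning with backpropagation through time.
import torch
import torch.nn as nn

VOCAB, EMB, HIDDEN, CONTEXT = 10_000, 128, 256, 3  # illustrative sizes

class FeedForwardLM(nn.Module):
    """Simple feed-forward LM over a fixed-length word context."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.hidden = nn.Linear(CONTEXT * EMB, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, context_ids):                   # (batch, CONTEXT)
        e = self.embed(context_ids).flatten(1)        # (batch, CONTEXT*EMB)
        return self.out(torch.tanh(self.hidden(e)))   # (batch, VOCAB)

class RecurrentLM(nn.Module):
    """Elman-style recurrent LM to be initialised from the FF model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.RNN(EMB, HIDDEN, nonlinearity="tanh", batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, word_ids):                      # (batch, seq_len)
        h, _ = self.rnn(self.embed(word_ids))
        return self.out(h)                            # (batch, seq_len, VOCAB)

def init_rnn_from_ffnn(ff: FeedForwardLM, rnn: RecurrentLM) -> None:
    """Copy layers whose shapes match; recurrent weights stay random."""
    with torch.no_grad():
        rnn.embed.weight.copy_(ff.embed.weight)       # word embeddings
        rnn.out.weight.copy_(ff.out.weight)           # output softmax layer
        rnn.out.bias.copy_(ff.out.bias)

ff_lm, rnn_lm = FeedForwardLM(), RecurrentLM()
# ... train ff_lm on fixed-length n-gram contexts here ...
init_rnn_from_ffnn(ff_lm, rnn_lm)
# ... then fine-tune rnn_lm with backpropagation through time ...
```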