Representing Compositionality based on Multiple Timescales Gated Recurrent Neural Networks with Adaptive Temporal Hierarchy for Character-Level Language Models
Dennis Singh Moirangthem, Jegyung Son, Minho Lee
Published in: Rep4NLP@ACL (2017)
Keyphrases
- language model
- recurrent neural networks
- language modeling
- n gram
- statistical language models
- document retrieval
- document level
- language modelling
- probabilistic model
- speech recognition
- query expansion
- echo state networks
- neural network
- recurrent networks
- retrieval model
- feed forward
- test collection
- ad hoc information retrieval
- information retrieval
- smoothing methods
- cascade correlation
- nonlinear dynamic systems
- machine learning
- pseudo relevance feedback
- vector space model
- context sensitive
- query terms
- artificial neural networks
- relevance model
- reservoir computing