Exploiting the Tibetan Radicals in Recurrent Neural Network for Low-Resource Language Models.
Tongtong Shen, Longbiao Wang, Xie Chen, Kuntharrgyal Khysru, Jianwu Dang
Published in: ICONIP (2) (2017)
Keyphrases
- language model
- recurrent neural networks
- language modeling
- neural network
- document retrieval
- n-gram
- language modelling
- feed-forward
- probabilistic model
- information retrieval
- query expansion
- echo state networks
- complex-valued
- speech recognition
- test collection
- reservoir computing
- retrieval model
- vector space model
- hidden layer
- language models for information retrieval
- recurrent networks
- artificial neural networks
- ad hoc information retrieval
- smoothing methods
- statistical language models
- relevance model
- document ranking
- language model for information retrieval
- term dependencies
- document length
- genetic algorithm
- query terms
- tf-idf
- control system
- real-valued