Recurrent Neural Network Language Model Training Using Natural Gradient.
Jianwei Yu, Max W. Y. Lam, Xie Chen, Shoukang Hu, Songxiang Liu, Xixin Wu, Xunying Liu, Helen Meng. Published in: ICASSP (2019)
Keyphrases
- language model
- recurrent neural networks
- recurrent networks
- language modeling
- natural gradient
- n-gram
- feed-forward
- probabilistic model
- speech recognition
- neural network
- information retrieval
- hidden layer
- independent component analysis
- training process
- back-propagation
- artificial neural networks
- blind source separation
- training algorithm
- training set
- learning rate
- Bayesian framework
- reinforcement learning
- machine learning
- genetic algorithm
- supervised learning
- training samples