Deep Bidirectional Long Short-Term Memory Recurrent Neural Networks for Grapheme-to-Phoneme Conversion Utilizing Complex Many-to-Many Alignments.
Amr El-Desoky Mousa, Björn W. Schuller. Published in: INTERSPEECH (2016)
Keyphrases
- recurrent neural networks
- long short term memory
- complex valued
- artificial neural networks
- feed forward
- neural network
- echo state networks
- reservoir computing
- recurrent networks
- feedforward neural networks
- grapheme to phoneme conversion
- neural model
- information retrieval
- natural language
- nonlinear dynamic systems
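
The paper's title describes grapheme-to-phoneme (G2P) conversion with deep bidirectional LSTM networks over many-to-many alignments, i.e., each grapheme is tagged with an aligned phoneme chunk. Below is a minimal, illustrative sketch of that sequence-labeling formulation, not the authors' implementation; the class name `BiLSTMG2P`, vocabulary sizes, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch: a deep bidirectional LSTM that tags each grapheme with a
# phoneme-cluster label (the sequence-labeling view implied by many-to-many
# alignments). All sizes below are illustrative placeholders.
import torch
import torch.nn as nn

class BiLSTMG2P(nn.Module):
    def __init__(self, num_graphemes=30, num_labels=60,
                 embed_dim=64, hidden_dim=128, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(num_graphemes, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers,
                            bidirectional=True, batch_first=True)
        # Forward and backward hidden states are concatenated: 2 * hidden_dim.
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, grapheme_ids):
        # grapheme_ids: (batch, seq_len) tensor of grapheme indices
        x = self.embed(grapheme_ids)
        h, _ = self.lstm(x)
        return self.out(h)  # (batch, seq_len, num_labels) label logits

# Toy usage: a batch of two 5-grapheme "words" with random indices.
model = BiLSTMG2P()
batch = torch.randint(0, 30, (2, 5))
logits = model(batch)
print(logits.shape)  # torch.Size([2, 5, 60])
```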