On the Duration, Addressability, and Capacity of Memory-Augmented Recurrent Neural Networks.
Zhibin Quan, Zhiqiang Gao, Weili Zeng, Xuelian Li, Man Zhu
Published in: IEEE Access (2018)
Keyphrases
- recurrent neural networks
- memory capacity
- feed forward
- neural network
- echo state networks
- feedforward neural networks
- artificial neural networks
- memory requirements
- hidden layer
- recurrent networks
- complex valued
- reservoir computing
- neural model
- hebbian learning
- memory size
- memory usage
- nonlinear dynamic systems
- cascade correlation
- memory space
- long short term memory
- storage capacity
- real time
- main memory
- long term
- evolutionary algorithm
- artificial intelligence