Encoding-based Memory Modules for Recurrent Neural Networks.
Antonio Carta, Alessandro Sperduti, Davide Bacciu
Published in: CoRR (2020)
Keyphrases
- recurrent neural networks
- neural network
- feed forward
- complex valued
- reservoir computing
- echo state networks
- memory usage
- memory requirements
- recurrent networks
- artificial neural networks
- feedforward neural networks
- long short-term memory
- hidden layer
- neural model
- nonlinear dynamic systems
- cascade correlation
- main memory
- functional modules
- Hebbian learning
- long term
- radial basis function
- random access
- encoding scheme
- input output
- pose estimation
- low memory
- building blocks
- support vector