Use of recurrent infomax to improve the memory capability of input-driven recurrent neural networks.
Hisashi Iwade, Kohei Nakajima, Takuma Tanaka, Toshio Aoyagi
Published in: CoRR (2018)
Keyphrases
- recurrent neural networks
- recurrent networks
- neural network
- complex valued
- feedforward neural networks
- echo state networks
- feed forward
- long short term memory
- artificial neural networks
- neural model
- reservoir computing
- hidden layer
- memory requirements
- data driven
- hebbian learning
- memory usage
- data structure
- reinforcement learning
- learning algorithm
- machine learning
- nonlinear dynamic systems
- real time