Dataflow matrix machines as programmable, dynamically expandable, self-referential generalized recurrent neural networks.
Michael A. Bukatin, Steve Matthews, Andrey Radul
Published in: CoRR (2016)
Keyphrases
- recurrent neural networks
- neural network
- reservoir computing
- feed forward
- echo state networks
- complex valued
- recurrent networks
- artificial neural networks
- nonlinear dynamic systems
- eigenvalue problems
- cascade correlation
- hebbian learning
- long short term memory
- eigenvalue decomposition
- software architecture
- general purpose
- feedforward neural networks
- parallel computing
- changing environment
- design methodology
- singular value decomposition
- radial basis function