Sufficient Conditions for Error Back Flow Convergence in Dynamical Recurrent Neural Networks.
Alex Aussem
Published in: IJCNN (4) (2000)
Keyphrases
- sufficient conditions
- recurrent neural networks
- linear complementarity problem
- neural network
- recurrent networks
- feed forward
- echo state networks
- reservoir computing
- cascade correlation
- asymptotic stability
- exponential stability
- hidden layer
- fixed point
- asymptotic optimality
- linear systems
- Lyapunov function
- artificial neural networks
- convergence speed
- nonlinear dynamic systems
- control algorithm
- neuro fuzzy
- input output
- efficiently computable
- reinforcement learning