An online gradient method with momentum for two-layer feedforward neural networks.
Naimin Zhang
Published in: Appl. Math. Comput. (2009)
Keyphrases
- feedforward neural networks
- gradient method
- convergence rate
- hidden layer
- learning rate
- neural network
- recurrent neural networks
- back propagation
- step size
- multilayer perceptron
- multi layer
- optimization methods
- feed forward
- training algorithm
- artificial neural networks
- k nearest neighbor
- non-negative matrix factorization
- multi objective
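
To illustrate the method named in the title, below is a minimal sketch of an online (per-sample) gradient update with momentum for a two-layer feedforward network. The layer sizes, learning rate `eta`, momentum coefficient `mu`, activation choice, and toy data are illustrative assumptions, not the specific formulation or parameter settings analyzed in the paper.

```python
# Minimal sketch (not the paper's exact formulation): online gradient descent
# with classical momentum on a two-layer network  y = V * sigmoid(W x).
# Sizes, eta, and mu are assumed values for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 1   # assumed layer sizes
eta, mu = 0.05, 0.9               # assumed learning rate and momentum coefficient

W = rng.normal(scale=0.5, size=(n_hidden, n_in))    # input-to-hidden weights
V = rng.normal(scale=0.5, size=(n_out, n_hidden))   # hidden-to-output weights
dW_prev = np.zeros_like(W)                          # previous weight increments
dV_prev = np.zeros_like(V)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: a noisy linear target, with samples presented one at a time (online).
X = rng.normal(size=(200, n_in))
T = X @ rng.normal(size=(n_in, n_out)) + 0.1 * rng.normal(size=(200, n_out))

for epoch in range(50):
    for x, t in zip(X, T):
        x = x.reshape(-1, 1)
        t = t.reshape(-1, 1)

        # Forward pass: sigmoid hidden layer, linear output layer
        h = sigmoid(W @ x)
        y = V @ h

        # Backward pass for the squared error 0.5 * ||y - t||^2
        e = y - t
        grad_V = e @ h.T
        grad_W = ((V.T @ e) * h * (1.0 - h)) @ x.T

        # Online update with momentum: Δw = -eta * grad + mu * Δw_prev
        dV = -eta * grad_V + mu * dV_prev
        dW = -eta * grad_W + mu * dW_prev
        V += dV
        W += dW
        dV_prev, dW_prev = dV, dW
```

The momentum term reuses the previous weight increment, which is the standard way such schemes smooth the per-sample (online) updates relative to plain stochastic gradient descent.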