A Novel Learning Rate Schedule in Optimization for Neural Networks and It's Convergence.
Jieun Park, Dokkyun Yi, Sangmin Ji. Published in: Symmetry (2020)
Keyphrases
- learning rate
- convergence rate
- neural network
- weight update
- rapid convergence
- convergence speed
- convergence theorem
- global convergence
- activation function
- hidden layer
- adaptive learning rate
- backpropagation algorithm
- training algorithm
- learning algorithm
- error function
- line search
- update rule
- faster convergence
- fuzzy neural network
- step size
- optimization problems
- optimization algorithm
- weight vector
- global optimization
- scheduling problem
- multilayer neural networks
- optimization method
- artificial neural networks
- neural network model
- evolutionary algorithm
- bp neural network algorithm
- delta bar delta
- evolution strategy
- back propagation
- convergence analysis
- feed forward
- np hard