ABNGrad: adaptive step size gradient descent for optimizing neural networks.
Wenhan Jiang, Yuqing Liang, Zhixia Jiang, Dongpo Xu, Linhua Zhou
Published in: Appl. Intell. (2024)
Keyphrases
- step size
- cost function
- variable step size
- neural network
- adaptive filter
- stochastic gradient descent
- convergence rate
- convergence speed
- least mean square
- evolutionary programming
- gradient method
- pattern recognition
- genetic algorithm
- faster convergence
- blind source separation
- steepest descent method
- hessian matrix
- learning rules
- conjugate gradient
- temporal difference
- line search
- artificial neural networks
- feature extraction
- metaheuristic
- loss function
- fuzzy logic
- image processing
- objective function
- high quality
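The entry itself does not spell out ABNGrad's update rule, so as a point of reference for the "adaptive step size" and "stochastic gradient descent" keyphrases, below is a minimal sketch of a generic Adam-style adaptive step size update in NumPy. The function name `adam_step`, the toy quadratic objective, and all hyperparameter defaults are illustrative assumptions; this is not the ABNGrad algorithm.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: the effective step size adapts per parameter
    via running estimates of the first and second gradient moments.
    (Generic illustration only; NOT the ABNGrad update rule.)"""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2        # second moment (uncentered variance)
    m_hat = m / (1 - beta1**t)                   # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-coordinate adaptive step
    return w, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 501):
    grad = 2.0 * w
    w, m, v = adam_step(w, grad, m, v, t)
print(w)  # converges close to the minimizer [0, 0]
```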