A double incremental aggregated gradient method with linear convergence rate for large-scale optimization.
Aryan Mokhtari, Mert Gürbüzbalaban, Alejandro Ribeiro
Published in: ICASSP (2017)
Keyphrases
- gradient method
- convergence rate
- global convergence
- faster convergence rate
- step size
- optimization methods
- convergence speed
- recursive least squares
- learning rate
- global optimization
- faster convergence
- quadratic programming
- optimization method
- optimization algorithm
- optimization problems
- clustering algorithm
- wavelet neural network
- image classification
- numerical stability
- number of iterations required
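The title refers to the family of incremental aggregated gradient (IAG) methods, which minimize a finite sum (1/n)·Σ f_i(x) by refreshing one stored component gradient per iteration and stepping along the running average. The sketch below is a minimal generic IAG loop, not the paper's DIAG method; the function names, cyclic component selection, and step size are illustrative assumptions.

```python
import numpy as np

def iag(grad_i, x0, n, step, iters):
    """Generic incremental aggregated gradient sketch (not the DIAG method).

    grad_i(i, x) returns the gradient of the i-th component f_i at x.
    A table keeps the most recently computed gradient of each component;
    each iteration refreshes one entry and steps along the table average.
    """
    x = x0.copy()
    table = np.array([grad_i(i, x) for i in range(n)])  # stored gradients
    avg = table.mean(axis=0)                            # running average
    for t in range(iters):
        i = t % n                        # cyclic component selection (assumed)
        g_new = grad_i(i, x)
        avg += (g_new - table[i]) / n    # update average incrementally
        table[i] = g_new
        x -= step * avg                  # step along aggregated gradient
    return x

# Toy usage: f_i(x) = 0.5 * (x - a_i)^2, so the minimizer of the
# average is the mean of the a_i.
a = np.array([1.0, 2.0, 3.0])
x_star = iag(lambda i, x: x - a[i], np.array([0.0]), n=3, step=0.1, iters=1000)
```

For strongly convex sums, methods of this type attain a linear (geometric) convergence rate, which is the property highlighted in the title.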