Tight Dimension Independent Lower Bound on the Expected Convergence Rate for Diminishing Step Sizes in SGD.
Phuong Ha Nguyen, Lam M. Nguyen, Marten van Dijk. Published in: NeurIPS (2019)
Keyphrases
- step size
- convergence rate
- lower bound
- stochastic gradient descent
- upper bound
- convergence speed
- learning rate
- gradient method
- global convergence
- faster convergence
- evolutionary programming
- competitive ratio
- worst case
- line search
- objective function
- mutation operator
- global optimum
- primal dual
- numerical stability
- cost function
- variable step size
- faster convergence rate
- optimal solution
- particle swarm optimization algorithm
- global search
- genetic algorithm
- image reconstruction
- multiscale
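The paper's topic is SGD with diminishing step sizes, whose expected convergence rate on strongly convex objectives is known to be on the order of 1/t; the paper establishes a matching dimension-independent lower bound. As a hedged illustration (not the paper's construction), the following minimal sketch runs SGD on a one-dimensional strongly convex quadratic with additive gradient noise and step sizes proportional to 1/t; all function names, constants, and the noise model here are illustrative assumptions:

```python
import random

def sgd_diminishing(mu=1.0, noise=0.1, c=1.0, steps=2000, seed=0):
    """Illustrative SGD on f(w) = (mu/2) * w**2 with additive gradient
    noise and diminishing step sizes eta_t = c / (mu * (t + 1)).
    This is a toy sketch, not the paper's lower-bound construction."""
    rng = random.Random(seed)
    w = 5.0  # arbitrary starting point
    for t in range(steps):
        eta = c / (mu * (t + 1))             # diminishing step size ~ 1/t
        grad = mu * w + rng.gauss(0, noise)  # unbiased stochastic gradient
        w -= eta * grad
    return w

final_w = sgd_diminishing()
```

Under this step-size schedule the expected squared distance to the optimum (here, w = 0) decays on the order of 1/t, which is the regime the paper's tight lower bound addresses.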