On the linear convergence of the stochastic gradient method with constant step-size.
Volkan Cevher, Bang Công Vu. Published in: Optim. Lett. (2019)
Keyphrases
- gradient method
- step size
- convergence rate
- convergence speed
- steepest descent method
- faster convergence
- variable step size
- line search
- cost function
- global convergence
- learning rate
- recursive least squares
- stochastic gradient descent
- differential evolution
- quadratic programming
- non-stationary
- machine learning
- independent component analysis
- artificial neural networks
- objective function
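The technique named in the title, stochastic gradient descent with a constant step-size, can be sketched as follows. This is a minimal illustration, not the paper's algorithm or analysis: the least-squares problem, the sampling scheme, and all parameter values (`step`, `iters`) are assumptions chosen for demonstration. The noiseless setup (b exactly equals A x_true) is one regime where constant step-size SGD is known to converge linearly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: minimize f(x) = (1/2n) * ||A x - b||^2.
# Illustrative setup only; b = A @ x_true, so the minimum value is zero.
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def sgd_constant_step(A, b, step=0.01, iters=5000, seed=1):
    """SGD with a constant step-size: at each iteration, sample one
    row i uniformly and update x <- x - step * (a_i^T x - b_i) * a_i."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        grad_i = (A[i] @ x - b[i]) * A[i]  # stochastic gradient of f at x
        x = x - step * grad_i
    return x

x_hat = sgd_constant_step(A, b)
print(np.linalg.norm(x_hat - x_true))
```

With the step-size held fixed (no decay schedule, no line search), the iterates here contract toward the solution at a geometric rate because every per-sample gradient vanishes at the optimum; with noisy data, a constant step-size instead drives the iterates only to a neighborhood of the solution.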