Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization.
Vien V. Mai, Mikael Johansson. Published in: ICML (2020)
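The title refers to a stochastic gradient method with (heavy-ball) momentum applied to non-smooth, non-convex objectives. Below is a minimal sketch of that generic update, assuming a stochastic subgradient oracle; the function names, toy objective, and parameter choices are illustrative placeholders, not the exact scheme or step-size rule analyzed in the paper.

```python
import numpy as np

def sgd_momentum(subgrad_fn, x0, step_size=0.01, beta=0.9, n_iters=1000, seed=0):
    """Stochastic (sub)gradient method with heavy-ball momentum (illustrative sketch).

    subgrad_fn(x, rng) should return a noisy (sub)gradient at x.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)            # momentum buffer
    for _ in range(n_iters):
        g = subgrad_fn(x, rng)      # stochastic (sub)gradient oracle
        v = beta * v + g            # accumulate momentum
        x = x - step_size * v       # parameter update
    return x

# Toy usage on a non-smooth, non-convex function f(x) = |x[0]| - cos(x[1])
def toy_subgrad(x, rng):
    g = np.array([np.sign(x[0]), np.sin(x[1])])
    return g + 0.1 * rng.standard_normal(x.shape)   # noisy subgradient

x_final = sgd_momentum(toy_subgrad, x0=[2.0, 2.0])
```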
Keyphrases
- convex optimization
- gradient method
- convergence rate
- learning rate
- convex formulation
- convergence speed
- step size
- operator splitting
- primal dual
- interior point methods
- convex optimization problems
- total variation
- optimization methods
- convex relaxation
- non-negative matrix factorization
- basis pursuit
- text documents
- image restoration
- image representation
- natural images