Convergence of a Stochastic Gradient Method with Momentum for Nonsmooth Nonconvex Optimization.
Vien V. Mai, Mikael Johansson. Published in: CoRR (2020)
Keyphrases
- gradient method
- globally convergent
- convergence rate
- global convergence
- optimization methods
- learning rate
- derivative free
- line search
- image restoration and reconstruction
- optimization problems
- step size
- variational inequalities
- global optimization
- autocalibration
- newton method
- convergence analysis
- convergence speed
- faster convergence
- nonlinear programming
- quasi newton
- convex formulation
- primal dual
- objective function
- optimization method
- evolutionary algorithm
- optimization algorithm
- mathematical programming
- signal processing
- alternating direction method of multipliers
- constrained optimization
- metaheuristic
- simulated annealing
- pairwise
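The title refers to a stochastic gradient method with (heavy-ball) momentum applied to a nonsmooth objective. As a point of reference, here is a minimal sketch of the generic heavy-ball iteration on a toy nonsmooth problem — the step-size schedule, toy objective, and noise model are illustrative assumptions, not the authors' specific method or analysis:

```python
import random

def sgd_momentum(subgrad, x0, lr0=0.1, beta=0.9, steps=500):
    """Generic stochastic heavy-ball iteration:
    v_{k+1} = beta * v_k + g_k,  x_{k+1} = x_k - lr_k * v_{k+1}."""
    x, v = x0, 0.0
    for k in range(steps):
        g = subgrad(x)                    # noisy (sub)gradient sample
        v = beta * v + g                  # momentum buffer
        x -= (lr0 / (k + 1) ** 0.5) * v   # diminishing step size
    return x

# Toy example (hypothetical): minimize the nonsmooth f(x) = |x - 3|
# from noisy subgradient samples.
random.seed(0)

def noisy_subgrad(x):
    s = 1.0 if x > 3 else (-1.0 if x < 3 else 0.0)
    return s + random.gauss(0.0, 0.1)

x_star = sgd_momentum(noisy_subgrad, x0=0.0)
```

With a diminishing step size the iterates settle near the minimizer x = 3, though with constant steps heavy-ball momentum can oscillate around nonsmooth kinks.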