On the global convergence of randomized coordinate gradient descent for non-convex optimization.
Ziang Chen, Yingzhou Li, Jianfeng Lu. Published in: CoRR (2021)
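The method in the title, randomized coordinate gradient descent, updates one randomly chosen coordinate per iteration instead of the full gradient. A minimal generic sketch is below; it illustrates the technique only and is not the authors' analyzed algorithm (the function name, step size, and quadratic example are all illustrative choices).

```python
import random

def randomized_coordinate_gd(grad_i, x0, step=0.1, iters=2000, seed=0):
    """Minimize f by a gradient step along one random coordinate per iteration.

    grad_i(x, i) returns the i-th partial derivative of f at x.
    """
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        i = rng.randrange(n)          # pick a coordinate uniformly at random
        x[i] -= step * grad_i(x, i)   # descend along that coordinate only
    return x

# Illustrative example: f(x) = sum_j (x_j - j)^2, minimized at x_j = j.
grad = lambda x, i: 2.0 * (x[i] - i)
x_star = randomized_coordinate_gd(grad, [5.0, 5.0, 5.0])
```

For a convex quadratic like this, each coordinate update contracts that coordinate's error by a constant factor, so the iterate approaches the minimizer; the paper's contribution concerns the harder non-convex setting.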
Keyphrases
- convex optimization
- global convergence
- operator splitting
- global optimum
- convergence speed
- convergence analysis
- convergence rate
- optimization methods
- objective function
- interior point methods
- low rank
- constrained optimization problems
- line search
- cost function
- total variation
- primal dual
- augmented lagrangian
- conjugate gradient
- convex optimization problems
- step size
- simulated annealing
- convex sets
- differential evolution
- loss function
- hybrid algorithm
- newton method
- multiresolution
- search algorithm
- particle swarm
- genetic algorithm
- linear programming