Revisiting Optimal Convergence Rate for Smooth and Non-convex Stochastic Decentralized Optimization.
Kun Yuan, Xinmeng Huang, Yiming Chen, Xiaohan Zhang, Yingya Zhang, Pan Pan
Published in: NeurIPS (2022)
Keyphrases
- convergence rate
- global convergence
- faster convergence rate
- convergence speed
- step size
- learning rate
- line search
- gradient method
- globally optimal
- lp norm
- competitive ratio
- mutation operator
- global optimization
- optimization algorithm
- global optimality
- optimization problems
- number of iterations required
- primal dual
- global search
- piecewise linear
- dynamic programming
- optimal solution
- risk minimization
- optimal control
- numerical stability
- optimization methods
- convex optimization