Revisiting Optimal Convergence Rate for Smooth and Non-convex Stochastic Decentralized Optimization.
Kun Yuan, Xinmeng Huang, Yiming Chen, Xiaohan Zhang, Yingya Zhang, Pan Pan
Published in: CoRR (2022)
Keyphrases
- convergence rate
- global convergence
- faster convergence rate
- convergence speed
- step size
- learning rate
- line search
- competitive ratio
- gradient method
- mutation operator
- lp norm
- piecewise linear
- optimization problems
- global optimization
- optimization method
- optimization algorithm
- stochastic gradient descent
- convex optimization
- primal dual
- optimal solution
- variable step size
- numerical stability
- dynamic programming
- multi objective
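Several of the keyphrases above (decentralized optimization, stochastic gradient descent, step size, convergence rate) can be illustrated with a minimal gossip-based decentralized SGD sketch. This is an illustrative toy, not the paper's algorithm or experimental setup: the quadratic local losses, the ring topology, the mixing matrix, and the diminishing step size are all assumptions made here for the example.

```python
import numpy as np

# Toy decentralized SGD with gossip averaging (illustrative only).
# n nodes jointly minimize the average of local quadratics
# f_i(x) = 0.5 * (x - b_i)^2, whose global optimum is mean(b).
# Each node takes a local gradient step, then mixes its iterate
# with its ring neighbors via a doubly stochastic matrix W.

rng = np.random.default_rng(0)
n = 8                        # number of nodes (assumed)
b = rng.normal(size=n)       # local targets; global optimum is b.mean()

# Symmetric, doubly stochastic mixing matrix for a ring graph.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros(n)              # one scalar iterate per node
for k in range(2000):
    lr = 1.0 / (k + 10)      # diminishing step size (assumed schedule)
    grad = x - b             # exact local gradients (noise omitted)
    x = W @ (x - lr * grad)  # local SGD step, then gossip averaging

# All nodes should end up near the consensus optimum b.mean().
print("max distance to optimum:", np.max(np.abs(x - b.mean())))
```

Because W is doubly stochastic, the network-wide average of the iterates tracks centralized gradient descent on the average loss, while the gossip step shrinks disagreement between nodes; the diminishing step size drives both errors toward zero.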