Statistically Preconditioned Accelerated Gradient Method for Distributed Optimization.
Hadrien Hendrikx
Lin Xiao
Sébastien Bubeck
Francis R. Bach
Laurent Massoulié
Published in: CoRR (2020)
Keyphrases
gradient method
derivative free
optimization methods
convergence rate
optimization algorithm
optimization problems
step size
global optimization
convex formulation
optimization method
non-negative matrix factorization
machine learning