A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions
André Uschmajew
Bart Vandereycken
Published in: J. Optim. Theory Appl. (2022)
Keyphrases
convergence rate
step size
gradient method
primal dual
faster convergence rate
convergence speed
cost function
optimization methods
dynamic programming
motion estimation
global convergence
evolutionary programming