Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms.
Hesameddin Mohammadi
Meisam Razaviyayn
Mihailo R. Jovanovic
Published in: CoRR (2022)
Keyphrases
convergence rate
learning rate
global convergence
faster convergence rate
convergence speed
step size
optimization problems
computationally efficient
gradient method
line search
learning algorithm
objective function
computational efficiency
optimization methods
primal dual