A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems.
Hongwei Liu
Ting Wang
Zexian Liu
Published in: J. Glob. Optim. (2024)
Keyphrases
step size
gradient method
minimization problems
convergence rate
cost function
variational inequalities
total variation
convergence speed
convex optimization
regularization term
primal dual
global optimization
low rank
multiscale
image restoration
Newton method
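The title refers to a nonmonotone accelerated proximal gradient method with a variable stepsize. The paper's algorithm is not reproduced here; as background, the following is a minimal sketch of a *standard* proximal gradient (ISTA) step for L1-regularized least squares with a fixed stepsize, where the function names (`soft_threshold`, `proximal_gradient`) and the test problem are illustrative choices, not from the paper:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by alternating a
    # gradient step on the smooth part with the L1 proximal map.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)  # gradient of the smooth least-squares term
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

For this basic scheme the fixed stepsize must satisfy `step <= 1/L`, where `L` is the largest eigenvalue of `A.T @ A`; the paper's contribution (nonmonotone acceptance of iterates, extrapolation for acceleration, and an adaptive stepsize rule) replaces that fixed choice and is not shown in this sketch.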