Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems.
Zehui Jia, Jieru Huang, Xingju Cai. Published in: J. Glob. Optim. (2021)
Keyphrases
- gradient method
- convex optimization problems
- convex optimization
- convergence rate
- total variation
- optimization methods
- optimization problems
- primal dual
- step size
- learning problems
- non-negative matrix factorization
- image restoration
- interior point methods
- natural images
- feature selection
- image classification
- loss function
- machine learning
- denoising
- dynamic programming
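As a rough illustration of the "gradient method" with Bregman distance named in the title (a minimal sketch only, not the authors' proximal-like incremental aggregated scheme), a single Bregman proximal gradient step with the KL (negative-entropy) distance on the probability simplex can be written as follows; the function name, step size, and toy objective are all illustrative assumptions:

```python
import numpy as np

def bregman_prox_grad_step(x, grad, step):
    """One Bregman (mirror-descent-style) step on the probability simplex:
    x_new = argmin_y <grad, y> + (1/step) * D_KL(y, x).
    With the negative-entropy kernel this has the closed form below.
    Illustrative sketch only; not the paper's PIAG method."""
    y = x * np.exp(-step * grad)   # multiplicative update from the KL geometry
    return y / y.sum()             # renormalize back onto the simplex

# Toy problem: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
# where c is itself a point in the simplex, so the minimizer is x = c.
c = np.array([0.2, 0.5, 0.3])
x = np.ones(3) / 3                 # uniform starting point
for _ in range(200):
    x = bregman_prox_grad_step(x, x - c, step=0.5)  # grad f(x) = x - c
```

The closed-form multiplicative update is what the Bregman distance buys here: the step stays on the simplex without an explicit projection.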