Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates.
Hiva Ghanbari, Katya Scheinberg. Published in: Comput. Optim. Appl. (2018)
Keyphrases
- convex optimization
- convergence rate
- primal dual
- norm minimization
- operator splitting
- interior point methods
- low rank matrix
- step size
- low rank
- learning rate
- convergence speed
- total variation
- convex optimization problems
- convex relaxation
- least squares
- global convergence
- trace norm
- semidefinite programming
- high dimensional
- conjugate gradient
- feature extraction
- image restoration
- alternating direction method of multipliers
- feature selection
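As context for the topic, a minimal sketch of a proximal gradient step (ISTA) for an l1-regularized least-squares problem, the basic scheme that proximal quasi-Newton methods accelerate by replacing the scalar step with a quasi-Newton metric. This is an illustration of the general technique, not the authors' algorithm; all names and parameter values here are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, steps=200):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth part.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                  # gradient of smooth part
        x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
    return x

# Small synthetic instance: sparse ground truth, light noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)

x_hat = proximal_gradient(A, b, lam=0.1)
obj = 0.5 * np.sum((A @ x_hat - b) ** 2) + 0.1 * np.sum(np.abs(x_hat))
```

A proximal quasi-Newton method replaces the `1/L` scaling with a curvature-aware (e.g. BFGS-type) model of the smooth term, which is what yields the improved convergence rates studied in the paper.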