No quantum speedup over gradient descent for non-smooth convex optimization.
Ankit Garg, Robin Kothari, Praneeth Netrapalli, Suhail Sherif. Published in: CoRR (2020)
Keyphrases
- convex optimization
- operator splitting
- interior point methods
- low rank
- primal dual
- cost function
- loss function
- convex optimization problems
- objective function
- convex relaxation
- low rank matrix
- semidefinite program
- alternating direction method of multipliers
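The paper's subject is the classical subgradient method for non-smooth convex optimization, whose O(1/&radic;T) rate it shows cannot be beaten by quantum algorithms in the black-box model. As a minimal illustration (not from the paper itself), the sketch below runs the standard subgradient method with a decaying step size on the non-smooth convex function f(x) = |x|; the function, step-size constant, and iteration count are illustrative choices:

```python
import math

def subgradient_descent(f, subgrad, x0, steps=1000, c=1.0):
    """Subgradient method with step size c / sqrt(t + 1).

    For non-smooth convex f, gradient descent's usual analysis does not
    apply, but tracking the best iterate still gives the optimal
    O(1/sqrt(T)) error decay in this black-box setting.
    """
    x, best = x0, x0
    for t in range(steps):
        g = subgrad(x)                      # any subgradient at x
        x = x - (c / math.sqrt(t + 1)) * g  # decaying step size
        if f(x) < f(best):
            best = x                        # keep the best iterate seen
    return best

# f(x) = |x| is convex but non-differentiable at its minimizer 0;
# sign(x) is a valid subgradient everywhere.
x_star = subgradient_descent(abs, lambda x: 1.0 if x >= 0 else -1.0, x0=5.0)
```

After 1000 steps the best iterate lands close to the minimizer 0, consistent with the 1/&radic;T rate that the paper proves is quantum-optimal.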