A Stochastic Gradient Method with an Exponential Convergence Rate for Strongly-Convex Optimization with Finite Training Sets
Nicolas Le Roux, Mark Schmidt, Francis R. Bach
Published in: CoRR (2012)
Keyphrases
- gradient method
- convergence rate
- convex optimization
- convex formulation
- training set
- step size
- convergence speed
- interior-point methods
- primal-dual
- learning rate
- total variation
- global convergence
- supervised learning
- convex relaxation
- decision trees
- multi-objective
- data sets
- optimization methods
- image restoration
- denoising
- object recognition
- image segmentation
- genetic algorithm
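The method this paper introduces is commonly known as the stochastic average gradient (SAG): it stores the most recent gradient for each training example and updates the iterate with the average of these stored gradients, achieving a linear (exponential) convergence rate on strongly-convex finite sums. Below is a minimal sketch under assumed details: a ridge-regularized least-squares objective and a heuristic step size; the function name `sag` and all parameters are illustrative, not from the paper.

```python
import numpy as np

def sag(X, y, lam=0.1, step=None, n_iter=50000, seed=0):
    """Sketch of a stochastic average gradient method for the
    strongly-convex finite-sum objective
        f(w) = (1/n) * sum_i 0.5*(x_i^T w - y_i)^2 + 0.5*lam*||w||^2.
    One gradient is stored per training example; each iteration
    refreshes a single stored gradient and steps along their average.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if step is None:
        # Conservative step size 1/(16*L), with L an upper bound on the
        # per-example smoothness constant (assumed choice, not tuned).
        L = np.max(np.sum(X**2, axis=1)) + lam
        step = 1.0 / (16.0 * L)
    w = np.zeros(d)
    grads = np.zeros((n, d))   # memory: last gradient seen for each example
    g_sum = np.zeros(d)        # running sum of the stored gradients
    for _ in range(n_iter):
        i = rng.integers(n)                          # sample one example
        g_new = (X[i] @ w - y[i]) * X[i] + lam * w   # its current gradient
        g_sum += g_new - grads[i]                    # refresh the average
        grads[i] = g_new
        w -= step * g_sum / n                        # step along the average
    return w
```

For this quadratic objective the result can be checked against the closed-form ridge solution `(X^T X / n + lam*I)^{-1} X^T y / n`; the O(n*d) gradient memory is the price paid for the exponential rate, versus the O(1/k) rates of plain stochastic gradient methods.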