Convergence guarantees for RMSProp and ADAM in non-convex optimization and their comparison to Nesterov acceleration on autoencoders.
Amitabh Basu, Soham De, Anirbit Mukherjee, Enayat Ullah
Published in: CoRR (2018)
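For context, the optimizers named in the title follow the standard RMSProp and ADAM update rules. A minimal sketch in conventional notation is given below; the symbols (step size $\alpha$, decay rates $\beta$, $\beta_1$, $\beta_2$, stability constant $\xi$) are the usual ones and are not taken from the paper's own notation.

```latex
% Standard RMSProp and ADAM updates in conventional notation -- a sketch,
% not the paper's exact statement. Here g_t = \nabla f(x_t) is the
% (stochastic) gradient, and squares, square roots, and division on
% vectors are taken coordinate-wise.
\begin{align*}
\text{RMSProp:}\quad
  v_t     &= \beta\, v_{t-1} + (1-\beta)\, g_t^{2}, \\
  x_{t+1} &= x_t - \alpha\, \frac{g_t}{\sqrt{v_t} + \xi}, \\[6pt]
\text{ADAM:}\quad
  m_t     &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t, \\
  v_t     &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^{2}, \\
  x_{t+1} &= x_t - \alpha\, \frac{m_t / (1-\beta_1^{\,t})}{\sqrt{v_t / (1-\beta_2^{\,t})} + \xi}.
\end{align*}
```

RMSProp can be read as the special case of ADAM with no first-moment averaging ($\beta_1 = 0$) and no bias correction.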
Keyphrases
- convex optimization
- operator splitting
- denoising
- total variation
- interior point methods
- primal dual
- convex programming
- low rank
- structured prediction
- convex formulation
- convex optimization problems
- semidefinite program
- convex relaxation
- basis pursuit
- convergence rate
- norm minimization
- image processing
- augmented lagrangian
- convex sets
- computer vision
- image denoising
- natural images
- global convergence
- high quality
- support vector
- multiresolution
- supervised learning