On Fast Convergence of Proximal Algorithms for SQRT-Lasso Optimization: Don't Worry About Its Nonsmooth Loss Function.
Xingguo Li, Haoming Jiang, Jarvis D. Haupt, Raman Arora, Han Liu, Mingyi Hong, Tuo Zhao. Published in: UAI (2019)
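For orientation only (this formulation is the standard one for SQRT-Lasso and is supplied here as context, not quoted from the index entry), the objective referenced in the title is

\min_{\theta \in \mathbb{R}^d} \; \frac{1}{\sqrt{n}} \lVert y - X\theta \rVert_2 + \lambda \lVert \theta \rVert_1,

where X \in \mathbb{R}^{n \times d} is the design matrix, y \in \mathbb{R}^n the response vector, and \lambda > 0 the regularization parameter. The square-root loss is nonsmooth only where y = X\theta (exact interpolation), which is the property the title's "don't worry" alludes to.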
Keyphrases
- loss function
- convex loss functions
- optimization problems
- learning algorithm
- global convergence
- convergence rate
- risk minimization
- pairwise
- stochastic gradient descent
- logistic regression
- coordinate descent method
- model selection
- worst case
- linear regression
- convergence speed
- regularization term
- boosting algorithms
- reproducing kernel Hilbert space
- support vector
- quasi-Newton
- loss bounds
- globally convergent
- feature selection
- machine learning