AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods.
Robert M. Freund, Paul Grigas, Rahul Mazumder
Published in: CoRR (2013)
Keyphrases
- optimization methods
- empirical risk
- convex relaxation
- ridge regression
- weak classifiers
- boosting algorithms
- support vector
- efficient optimization
- loss function
- simulated annealing
- face detection
- optimization problems
- optimization method
- multi-class
- regression model
- risk minimization
- linear regression
- global convergence
- continuous optimization
- decision function
- direct optimization
- feature selection
- unconstrained optimization
- object detection
- learning algorithm
- base learners
- gradient method
- quasi-Newton
- convex optimization
- reproducing kernel Hilbert space
- support vector machine
- optimization approaches
- cost function
- training data
- stochastic methods
- cost-sensitive
- evolutionary algorithm
- pairwise
- trust region
- search algorithm
- base classifiers
- model selection
- linear combination