Fully corrective boosting with arbitrary loss and regularization.
Chunhua Shen
Hanxi Li
Anton van den Hengel
Published in: Neural Networks (2013)
Keyphrases
early stopping
learning algorithm
kernel machines
gradient boosting
kernel ridge regression
regularization parameter
square loss
weak classifiers
general loss functions
feature selection
ensemble methods
weak learners
hinge loss
projection operator