An empirical evaluation of bagging and boosting for artificial neural networks.
David W. Opitz, Richard Maclin
Published in: ICNN (1997)
Keyphrases
- artificial neural networks
- negative correlation learning
- ensemble methods
- ensemble learning
- base classifiers
- ensemble classification
- gradient boosting
- randomized trees
- decision tree ensembles
- weak classifiers
- base learners
- random forests
- neural network
- majority voting
- decision stumps
- backpropagation
- decision trees
- weak learners
- ensemble classifier
- prediction accuracy
- error function
- computational intelligence
- multi-class
- genetic algorithm
- training set
- learning algorithm
- variance reduction
- machine learning
- training samples
- benchmark datasets
- generalization ability
- neural network model
- hidden layer
- boosting algorithms
- voting methods
- classification error
- rotation forest
- weighted voting
- multiple classifier systems
- logistic regression
- cross-validation
- machine learning methods
- meta-learning
- individual classifiers
- cost-sensitive
- class distribution
- learning machines
- learning scheme