Meta-Learning of Exploration/Exploitation Strategies: The Multi-Armed Bandit Case
Francis Maes
Damien Ernst
Louis Wehenkel
Published in: CoRR (2012)
Keyphrases
meta-learning
inductive learning
learning tasks
model selection
exploration/exploitation
machine learning algorithms
decision trees
multi-armed bandit
feature selection
active learning
reinforcement learning
base classifiers
bandit problems
image classification