Diversity-Driven Selection of Exploration Strategies in Multi-Armed Bandits
Fabien C. Y. Benureau
Pierre-Yves Oudeyer
Published in:
CoRR (2018)
Keyphrases
multi-armed bandits
selection strategies
search strategies
bandit problems
exploration strategy
multi-objective
state space
upper bound