Diversity-driven selection of exploration strategies in multi-armed bandits.

Fabien C. Y. Benureau, Pierre-Yves Oudeyer
Published in: ICDL-EPIROB (2015)
Keyphrases
  • multi-armed bandits
  • selection strategies
  • search strategies
  • bandit problems
  • exploration strategy
  • computational complexity
  • probability distribution
  • sufficient conditions