-Armed Bandit Problem.

Matthew J. Streeter, Stephen F. Smith
Published in: CP (2006)
Keyphrases
  • random sampling
  • bandit problems
  • multi armed bandit
  • computer vision
  • markov chain
  • sample size
  • decision making
  • three dimensional
  • high quality
  • search algorithm
  • regret bounds
  • upper confidence bound
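The keyphrases reference the upper confidence bound family of bandit algorithms. As a generic illustration only (this sketch is the standard UCB1 rule of Auer et al., not the method of this paper, and the arm setup is hypothetical):

```python
import math
import random

def ucb1(arms, total_pulls):
    """Run UCB1 on a list of zero-argument reward functions.

    Pulls each arm once, then repeatedly pulls the arm maximizing
    mean reward plus the exploration bonus sqrt(2 ln t / n_a).
    Returns (pull counts, reward sums) per arm.
    """
    counts = [0] * len(arms)   # pulls per arm
    sums = [0.0] * len(arms)   # cumulative reward per arm
    for t in range(total_pulls):
        if t < len(arms):
            i = t  # initialization: pull each arm once
        else:
            i = max(
                range(len(arms)),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        counts[i] += 1
        sums[i] += arms[i]()
    return counts, sums
```

Over many pulls, UCB1 concentrates its choices on the arm with the highest mean reward while still occasionally sampling the others.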