Perturbed-History Exploration in Stochastic Multi-Armed Bandits

Branislav Kveton, Csaba Szepesvári, Mohammad Ghavamzadeh, Craig Boutilier
Published in: IJCAI (2019)
Keyphrases
  • multi-armed bandits
  • bandit problems
  • original data
  • Monte Carlo
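The title names the paper's algorithm, perturbed-history exploration (PHE): instead of adding optimism or posterior sampling, the learner perturbs each arm's reward history with pseudo-rewards and then acts greedily on the perturbed history. Below is a minimal sketch for Bernoulli rewards, assuming an integer perturbation scale `a` and i.i.d. Bernoulli(1/2) pseudo-rewards; the function name and parameters are illustrative, not taken from the paper.

```python
import random

def phe_bernoulli(means, horizon, a=2, seed=0):
    """Sketch of perturbed-history exploration for Bernoulli bandits.

    Each round, an arm's history of s observed rewards is augmented
    with a*s i.i.d. Bernoulli(1/2) pseudo-rewards, and the arm with
    the highest mean over the perturbed history is pulled.
    """
    rng = random.Random(seed)
    k = len(means)
    pulls = [0] * k        # times each arm was pulled
    reward_sum = [0] * k   # sum of observed rewards per arm

    for t in range(horizon):
        if t < k:
            arm = t  # pull each arm once to initialize
        else:
            scores = []
            for i in range(k):
                s = pulls[i]
                # draw a*s pseudo-rewards, each Bernoulli(1/2)
                pseudo = sum(rng.random() < 0.5 for _ in range(a * s))
                # greedy value on the perturbed history of size s + a*s
                scores.append((reward_sum[i] + pseudo) / (s + a * s))
            arm = max(range(k), key=scores.__getitem__)
        reward = 1 if rng.random() < means[arm] else 0
        pulls[arm] += 1
        reward_sum[arm] += reward
    return pulls
```

Sampling the pseudo-rewards one by one costs O(a·s) per arm per round; a practical implementation would draw a single binomial sample instead. The perturbation plays the role that posterior sampling plays in Thompson sampling: enough noise to keep exploring, but shrinking relative to the real history as evidence accumulates.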