Perturbed-History Exploration in Stochastic Multi-Armed Bandits.
Branislav Kveton
Csaba Szepesvári
Mohammad Ghavamzadeh
Craig Boutilier
Published in: CoRR (2019)
Keyphrases
multi-armed bandits
multi-armed bandit
bandit problems
special case
NP-hard
probability distribution
Monte Carlo
original data