Perturbed-History Exploration in Stochastic Multi-Armed Bandits
Branislav Kveton
Csaba Szepesvári
Mohammad Ghavamzadeh
Craig Boutilier
Published in:
IJCAI (2019)
Keyphrases
multi-armed bandits
bandit problems
original data
Monte Carlo