An Asymptotically Optimal Strategy for Constrained Multi-armed Bandit Problems.
Hyeong Soo Chang
Published in: CoRR (2018)
Keyphrases
optimal strategy
multi-armed bandit problems
decision problems
expected cost
Monte Carlo
bandit problems
expected utility
cooperative game
mathematical models
cooperative
sample size
utility function
competitive ratio
constrained problems