An asymptotically optimal strategy for constrained multi-armed bandit problems.
Hyeong Soo Chang
Published in:
Math. Methods Oper. Res. (2020)
Keyphrases
optimal strategy
multi-armed bandit problems
decision problems
Monte Carlo
bandit problems
expected utility
expected cost
asymptotically optimal
case study
sample size
competitive ratio
cooperative game
database
central limit theorem