
Best Arm Identification in Restless Markov Multi-Armed Bandits.

P. N. Karthik, Kota Srinivas Reddy, Vincent Y. F. Tan
Published in: CoRR (2022)
Keyphrases
  • multi-armed bandits
  • semi-Markov
  • bandit problems
  • Markov chain
  • Markov model