Restless Multi-Armed Bandits under Exogenous Global Markov Process.
Tomer Gafni
Michal Yemini
Kobi Cohen
Published in:
CoRR (2022)
Keyphrases
markov process
multi armed bandits
markov chain
stochastic process
stationary distribution
bandit problems
transition probabilities
random walk
optimal control
multi armed bandit