Restless Multi-Armed Bandits under Exogenous Global Markov Process

Tomer Gafni, Michal Yemini, Kobi Cohen
Published in: ICASSP (2022)
Keyphrases
  • markov process
  • multi-armed bandits
  • markov chain
  • stochastic process
  • stationary distribution
  • optimal control
  • bandit problems
  • autoregressive
  • machine learning
  • special case
  • least squares
  • steady state
  • dynamical systems