Revisiting Simple Regret Minimization in Multi-Armed Bandits.

Yao Zhao, Connor Stephens, Csaba Szepesvári, Kwang-Sung Jun
Published in: CoRR (2022)
Keyphrases
  • multi armed bandits
  • regret minimization
  • decision makers
  • nash equilibrium
  • bayesian networks
  • markov decision processes