On Distributed Multi-Player Multiarmed Bandit Problems in Abruptly Changing Environment.

Lai Wei, Vaibhav Srivastava
Published in: CDC (2018)
Keyphrases
  • changing environment
  • multi player
  • multiarmed bandit
  • multi agent
  • mathematical programming
  • mobile robot
  • markov decision processes