Bandits with budgets.

Chong Jiang, R. Srikant
Published in: CDC (2013)
Keyphrases
  • multi-armed bandits
  • stochastic systems
  • decision making
  • multiresolution
  • budget constraints
  • data streams
  • active learning
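
The keyphrases point to the budgeted multi-armed bandit setting: each pull of an arm yields a stochastic reward but also consumes a stochastic cost from a fixed budget, and play stops once the budget is exhausted. The Python sketch below is a generic illustration of that setting only, not the algorithm proposed in this paper; the UCB1 index, the two-point cost distribution, and all parameter values are assumptions made for the example.

import math
import random

def budgeted_ucb(reward_means, cost_means, budget, seed=0):
    """UCB1 over rewards, played until the budget can no longer cover
    the worst-case cost of a single pull (1.0 in this toy model)."""
    rng = random.Random(seed)
    k = len(reward_means)
    counts = [0] * k          # number of pulls of each arm
    reward_sums = [0.0] * k   # cumulative observed reward of each arm
    total_reward = 0.0
    t = 0
    while budget >= 1.0:      # worst-case cost of one pull is 1.0
        t += 1
        if t <= k:
            arm = t - 1       # pull each arm once to initialize estimates
        else:
            arm = max(range(k), key=lambda i: reward_sums[i] / counts[i]
                      + math.sqrt(2.0 * math.log(t) / counts[i]))
        reward = 1.0 if rng.random() < reward_means[arm] else 0.0
        cost = 1.0 if rng.random() < cost_means[arm] else 0.5  # cost in {0.5, 1.0}
        budget -= cost
        counts[arm] += 1
        reward_sums[arm] += reward
        total_reward += reward
    return total_reward

# Toy run: three Bernoulli arms, budget of 100 cost units.
print(budgeted_ucb([0.3, 0.5, 0.7], [0.6, 0.5, 0.4], budget=100.0))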