Bandits with budgets.
Chong Jiang
R. Srikant
Published in: CDC (2013)
Keyphrases
multi-armed bandits
stochastic systems
decision making
multiresolution
budget constraints
data streams
active learning