Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms
Richard Combes, Chong Jiang, Rayadurgam Srikant. Published in: SIGMETRICS (2015)
Keyphrases
- lower bound
- regret bounds
- worst case
- online algorithms
- theoretical analysis
- multi-armed bandit
- running times
- optimal solution
- optimal cost
- lower and upper bounds
- upper bound
- optimization problems
- branch and bound algorithm
- VC dimension
- data structure
- online convex optimization
- online learning
- linear regression
- least squares
- NP-hard
- computational complexity
- bandit problems
- learning algorithm