Thompson Sampling for Bandits with Clustered Arms.
Emil Carlsson, Devdatt P. Dubhashi, Fredrik D. Johansson. Published in: IJCAI (2021)
Keyphrases
- multi armed bandits
- multi armed bandit problems
- random sampling
- monte carlo
- bandit problems
- active learning
- sampling strategies
- multiscale
- sparse sampling
- sample size
- stochastic systems
- sampling methods
- sampling rate
- real world
- machine learning
- information retrieval
- sampled data
- artificial intelligence
- sampling algorithm
- data sets
- mobile robot
- reinforcement learning
- knowledge base
- information systems
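For readers unfamiliar with the algorithm named in the title, the following is a minimal sketch of standard Beta-Bernoulli Thompson Sampling on a multi-armed bandit. It is illustrative only and does not reproduce the clustered-arm variant from the paper; the function name `thompson_sampling` and the parameter `true_probs` are assumed for this example.

```python
import random

def thompson_sampling(true_probs, horizon=2000, seed=0):
    """Beta-Bernoulli Thompson Sampling on a k-armed Bernoulli bandit.

    Illustrative sketch, not the clustered-arm algorithm from the paper.
    Each arm keeps a Beta posterior over its success probability.
    """
    rng = random.Random(seed)
    k = len(true_probs)
    successes = [1] * k  # Beta(1, 1) uniform prior for each arm
    failures = [1] * k
    total_reward = 0
    for _ in range(horizon):
        # Sample a plausible mean for each arm from its posterior,
        # then play the arm whose sample is largest.
        samples = [rng.betavariate(successes[i], failures[i]) for i in range(k)]
        arm = max(range(k), key=lambda i: samples[i])
        # Bernoulli reward drawn from the arm's true probability.
        reward = 1 if rng.random() < true_probs[arm] else 0
        total_reward += reward
        # Conjugate posterior update: increment the matching Beta parameter.
        if reward:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures, total_reward
```

Over time the posterior of the best arm concentrates, so it is sampled and played increasingly often, which is how Thompson Sampling balances exploration and exploitation.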