Thompson Sampling on Symmetric Alpha-Stable Bandits
Abhimanyu Dubey
Alex Pentland
Published in: IJCAI (2019)
Keyphrases
random sampling
multi-armed bandits
Monte Carlo
multiscale
sampling strategies
search algorithm
sampling algorithm
stochastic systems
active learning
online learning
sample size
sampling methods