Scalable Discrete Sampling as a Multi-Armed Bandit Problem.
Yutian Chen
Zoubin Ghahramani
Published in: ICML (2016)
Keyphrases
monte carlo
data sets
band limited
sampling strategy
sampling strategies
online learning
discrete geometry
highly scalable
multi armed bandit
real time
discrete version
discrete space
sampling algorithm
sample size
lower bound
information systems
data mining