Minimal Exploration in Structured Stochastic Bandits
Richard Combes
Stefan Magureanu
Alexandre Proutière
Published in:
NIPS (2017)
Keyphrases
stochastic systems
structured data
multi armed bandits
multiscale
stochastic model
monte carlo
learning automata
stochastic optimization
stochastic models
regret bounds
information retrieval
information systems
optimal solution
decision problems
multi armed bandit