Multi-armed bandits with episode context.
Christopher D. Rosin
Published in:
Ann. Math. Artif. Intell. (2011)
Keyphrases
multi-armed bandits
decision trees
NP-hard
Markov chain
bandit problems