Experience-efficient learning in associative bandit problems.
Alexander L. Strehl
Chris Mesterharm
Michael L. Littman
Haym Hirsh
Published in: ICML (2006)
Keyphrases
efficient learning
bandit problems
multi-armed bandits
decision problems
learning algorithm
Bayes net
expected utility
pattern languages
membership queries
utility function
structural information
multi-armed bandit problems