Thompson Sampling for Dynamic Multi-armed Bandits.

Neha Gupta, Ole-Christoffer Granmo, Ashok K. Agrawala
Published in: ICMLA (1) (2011)
Keyphrases
  • multi-armed bandits
  • monte carlo
  • bandit problems
  • feature selection
  • decision makers
  • maximum likelihood
  • density estimation
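
Since the entry's title names Thompson Sampling for multi-armed bandits, a minimal sketch of standard Bernoulli Thompson Sampling is included below for context. This is a generic illustration only, not the dynamic-bandit variant studied in the paper; the function name, arm probabilities, and horizon are all illustrative assumptions.

```python
import random


def thompson_sampling(true_probs, horizon=1000, seed=0):
    """Generic Bernoulli Thompson Sampling sketch (not the paper's dynamic variant).

    Each arm keeps a Beta(successes + 1, failures + 1) posterior; at every step
    one value is sampled per arm and the arm with the largest sample is pulled.
    """
    rng = random.Random(seed)
    n_arms = len(true_probs)
    successes = [0] * n_arms
    failures = [0] * n_arms
    total_reward = 0

    for _ in range(horizon):
        # Draw one posterior sample per arm; pick the arm with the highest draw.
        samples = [rng.betavariate(successes[a] + 1, failures[a] + 1)
                   for a in range(n_arms)]
        arm = max(range(n_arms), key=lambda a: samples[a])

        # Simulate a Bernoulli reward from the arm's (hidden) true probability.
        reward = 1 if rng.random() < true_probs[arm] else 0
        total_reward += reward

        # Update the chosen arm's Beta posterior counts.
        if reward:
            successes[arm] += 1
        else:
            failures[arm] += 1

    return total_reward, successes, failures


if __name__ == "__main__":
    # Hypothetical arm probabilities, purely for illustration.
    reward, s, f = thompson_sampling([0.2, 0.5, 0.7], horizon=2000)
    print("total reward:", reward)
```

Because exploration here comes from posterior sampling rather than explicit exploration schedules, arms with uncertain estimates are still pulled occasionally while clearly inferior arms are pulled less and less often.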