Multi-armed bandits in the wild: Pitfalls and strategies in online experiments.

David Issa Mattos, Jan Bosch, Helena Holmström Olsson
Published in: Inf. Softw. Technol. (2019)
Keyphrases
  • multi-armed bandits
  • online learning
  • bandit problems
  • maximum likelihood
  • multi-armed bandit