Ranking in Contextual Multi-Armed Bandits.

Amitis Shidani, George Deligiannidis, Arnaud Doucet
Published in: CoRR (2022)
Keyphrases
  • multi-armed bandits
  • bandit problems
  • ranking algorithm
  • contextual information
  • web search
  • ranking functions
  • Bayesian networks
  • evolutionary algorithm
  • special case