Hierarchical multi-armed bandits for discovering hidden populations.

Suhansanu Kumar, Heting Gao, Changyu Wang, Kevin Chen-Chuan Chang, Hari Sundaram
Published in: ASONAM (2019)
Keyphrases
  • multi-armed bandits
  • bandit problems
  • pairwise
  • decision-theoretic
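
The keyphrases above center on multi-armed bandits. As a minimal illustration of that underlying technique (a generic UCB1 learner, not the paper's hierarchical variant; the arm setup and reward probabilities are hypothetical), a sketch:

```python
import math
import random


def ucb1(reward_fns, horizon, seed=0):
    """Run the UCB1 bandit algorithm for `horizon` pulls.

    Each arm is a zero-argument callable returning a reward in [0, 1].
    Returns the number of times each arm was pulled.
    """
    n = len(reward_fns)
    counts = [0] * n     # pulls per arm
    sums = [0.0] * n     # cumulative reward per arm
    for t in range(horizon):
        if t < n:
            arm = t  # pull every arm once to initialize estimates
        else:
            # Pick the arm maximizing empirical mean + exploration bonus.
            arm = max(
                range(n),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t + 1) / counts[a]),
            )
        sums[arm] += reward_fns[arm]()
        counts[arm] += 1
    return counts


if __name__ == "__main__":
    rng = random.Random(1)
    # Two Bernoulli arms with success probabilities 0.2 and 0.8 (illustrative).
    arms = [
        lambda: float(rng.random() < 0.2),
        lambda: float(rng.random() < 0.8),
    ]
    pulls = ucb1(arms, 500)
    print(pulls)  # the higher-payoff arm receives most of the pulls
```

Over 500 rounds the exploration bonus shrinks as an arm accumulates pulls, so play concentrates on the arm with the higher empirical mean while the weaker arm is still sampled occasionally.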