
Hierarchical multi-armed bandits for discovering hidden populations.

Suhansanu Kumar, Heting Gao, Changyu Wang, Kevin Chen-Chuan Chang, Hari Sundaram
Published in: ASONAM (2019)
Keyphrases
  • multi-armed bandits
  • bandit problems
  • multi-armed bandit
  • pairwise
  • decision-theoretic
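The keyphrases above center on multi-armed bandits. As context for readers unfamiliar with the term, here is a minimal sketch of the basic multi-armed bandit setting using epsilon-greedy exploration. This is an illustration of the general technique only, not the hierarchical method of the paper; the two Bernoulli arms and their reward probabilities are hypothetical.

```python
import random

def epsilon_greedy_bandit(reward_fns, n_rounds=1000, epsilon=0.1, seed=0):
    """Epsilon-greedy multi-armed bandit: with probability epsilon pick a
    random arm (explore); otherwise pull the arm with the highest estimated
    mean reward so far (exploit)."""
    rng = random.Random(seed)
    n_arms = len(reward_fns)
    counts = [0] * n_arms        # pulls per arm
    means = [0.0] * n_arms       # running mean reward per arm
    total = 0.0
    for _ in range(n_rounds):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(n_arms)          # explore
        else:
            arm = max(range(n_arms), key=lambda a: means[a])  # exploit
        r = reward_fns[arm](rng)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]  # incremental mean update
        total += r
    return means, total

# Hypothetical example: two Bernoulli arms with success probabilities 0.2 and 0.8.
arms = [lambda rng: 1.0 if rng.random() < 0.2 else 0.0,
        lambda rng: 1.0 if rng.random() < 0.8 else 0.0]
means, total = epsilon_greedy_bandit(arms, n_rounds=2000, epsilon=0.1)
```

After enough rounds, the estimated mean for the better arm dominates, and most pulls go to it; the paper's contribution (per its title) is organizing such arms hierarchically to discover hidden populations.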