Hierarchical multi-armed bandits for discovering hidden populations.
Suhansanu Kumar
Heting Gao
Changyu Wang
Kevin Chen-Chuan Chang
Hari Sundaram
Published in: ASONAM (2019)
Keyphrases
multi-armed bandits
bandit problems
multi-armed bandit
pairwise
decision-theoretic
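As context for the keyphrases above, the sketch below shows a standard multi-armed bandit strategy (UCB1). It is illustrative only: the function name, parameters, and Bernoulli reward model are assumptions for this example, not the paper's hierarchical method.

```python
import math
import random

def ucb1(arm_means, horizon, seed=0):
    """Run the UCB1 bandit strategy on simulated Bernoulli arms.

    arm_means: true success probability of each arm (hypothetical inputs).
    Returns the total reward collected and the per-arm pull counts.
    """
    rng = random.Random(seed)
    k = len(arm_means)
    counts = [0] * k   # how many times each arm was pulled
    sums = [0.0] * k   # cumulative reward observed per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1  # pull each arm once to initialize estimates
        else:
            # choose the arm maximizing empirical mean + exploration bonus
            arm = max(
                range(k),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total, counts
```

Over time the exploration bonus shrinks for frequently pulled arms, so play concentrates on the arm with the highest observed mean; hierarchical variants, as in the paper's title, organize the arms into a tree rather than a flat set.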