Medoids in Almost-Linear Time via Multi-Armed Bandits
Vivek Kumar Bagaria
Govinda M. Kamath
Vasilis Ntranos
Martin J. Zhang
David Tse
Published in:
AISTATS (2018)
Keyphrases
multi-armed bandits
bandit problems
average distance
worst case
multi-objective
decision problems