Output-Weighted Sampling for Multi-Armed Bandits with Extreme Payoffs.
Yibo Yang
Antoine Blanchard
Themistoklis P. Sapsis
Paris Perdikaris
Published in:
CoRR (2021)
Keyphrases
multi-armed bandits
bandit problems
Monte Carlo
game theory