Forced-exploration free Strategies for Unimodal Bandits
Hassan Saber
Pierre Ménard
Odalric-Ambrym Maillard
Published in: CoRR (2020)
Keyphrases
search strategies
probability distribution
exploration strategy
artificial intelligence
metadata
optimal strategy
image processing
website
multiscale
multi-agent
evolutionary algorithm
online auctions
action selection
multi-armed bandit problems