Exploration vs Exploitation vs Safety: Risk-averse Multi-Armed Bandits.
Nicolas Galichet
Michèle Sebag
Olivier Teytaud
Published in: CoRR (2014)
Keyphrases
risk-averse
multi-armed bandits
risk-neutral
risk aversion
utility function
stochastic programming
decision makers
bandit problems
portfolio management
expected utility
Bayesian networks
multistage
decision making
dynamic programming
decision problems