Multi-Armed Bandits for Minesweeper: Profiting From Exploration-Exploitation Synergy
Igor Q. Lordeiro
Diego B. Haddad
Douglas de O. Cardoso
Published in:
IEEE Trans. Games (2022)
Keyphrases
bandit problems
exploration-exploitation
multi-armed bandits
decision problems
decision making
linear regression
data sets
pairwise
incremental learning