Multi-Armed Bandits for Minesweeper: Profiting From Exploration-Exploitation Synergy.

Igor Q. Lordeiro, Diego B. Haddad, Douglas de O. Cardoso
Published in: IEEE Trans. Games (2022)
Keyphrases
  • bandit problems
  • exploration-exploitation
  • multi-armed bandits
  • decision problems
  • decision making
  • linear regression
  • data sets
  • pairwise
  • incremental learning