Causally Abstracted Multi-armed Bandits

Fabio Massimo Zennaro, Nicholas Bishop, Joel Dyer, Yorgos Felekis, Anisoara Calinescu, Michael J. Wooldridge, Theodoros Damoulas
Published in: CoRR (2024)
Keyphrases
  • multi-armed bandits
  • bandit problems
  • causal relationships
  • causal models
  • reinforcement learning
  • maximum likelihood