Causally Abstracted Multi-armed Bandits.
Fabio Massimo Zennaro
Nicholas Bishop
Joel Dyer
Yorgos Felekis
Anisoara Calinescu
Michael J. Wooldridge
Theodoros Damoulas
Published in:
CoRR (2024)
Keyphrases
multi-armed bandits
bandit problems
causal relationships
causal models
reinforcement learning
maximum likelihood