Multi-armed Bandit Experimental Design: Online Decision-making and Adaptive Inference
David Simchi-Levi
Chonghuan Wang
Published in:
AISTATS (2023)
Keyphrases
experimental design
decision making
decentralized decision making
multi-armed bandit
active learning
empirical studies
experimental designs
online learning
sample size
Bayesian networks
multi-agent
feature selection
pairwise
data mining
text categorization
class imbalance
bandit problems