Two-Phase Multi-armed Bandit for Online Recommendation
Cairong Yan
Haixia Han
Zijian Wang
Yanting Zhang
Published in:
DSAA (2021)
Keyphrases
multi-armed bandit
online learning
recommender systems
machine learning
feature selection
pairwise
collaborative filtering
graphical models