Two-Phase Multi-armed Bandit for Online Recommendation.

Cairong Yan, Haixia Han, Zijian Wang, Yanting Zhang
Published in: DSAA (2021)
Keyphrases
  • multi-armed bandit
  • online learning
  • multi-armed bandits
  • recommender systems
  • machine learning
  • feature selection
  • pairwise
  • collaborative filtering
  • graphical models
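
The record carries no abstract, so the following is only a minimal, generic sketch of the multi-armed bandit setting the keyphrases point to: a standard ε-greedy learner balancing exploration and exploitation over simulated item click-through rates. It is an illustration of the general technique, not the paper's two-phase method; the class name, the ε value, and the reward rates are all hypothetical.

```python
import random

class EpsilonGreedyBandit:
    """Generic epsilon-greedy bandit (illustrative; not the paper's two-phase method)."""

    def __init__(self, n_arms: int, epsilon: float = 0.1) -> None:
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self) -> int:
        # Explore uniformly with probability epsilon; otherwise exploit the
        # arm with the highest estimated mean reward.
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm: int, reward: float) -> None:
        # Incremental mean update: v <- v + (r - v) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

if __name__ == "__main__":
    random.seed(42)
    true_rates = [0.2, 0.5, 0.7]  # hidden per-item click rates (hypothetical)
    bandit = EpsilonGreedyBandit(n_arms=3)
    for _ in range(10_000):
        arm = bandit.select_arm()
        reward = 1.0 if random.random() < true_rates[arm] else 0.0
        bandit.update(arm, reward)
    print("estimated rates:", [round(v, 3) for v in bandit.values])
```

With enough pulls, the estimated rates converge toward the hidden ones and the learner mostly recommends the best arm, which is the core exploration–exploitation trade-off that bandit-based online recommendation builds on.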