Dynamic clustering based contextual combinatorial multi-armed bandit for online recommendation.

Cairong Yan, Haixia Han, Yanting Zhang, Dandan Zhu, Yongquan Wan
Published in: Knowl. Based Syst. (2022)
Keyphrases
  • multi-armed bandit
  • learning algorithm
  • contextual information
  • reinforcement learning
  • recommender systems
  • collaborative filtering
  • online learning
  • similarity measure
  • active learning
  • multi-armed bandits