Online Interactive Collaborative Filtering Using Multi-Armed Bandit with Dependent Arms.

Qing Wang, Chunqiu Zeng, Wubai Zhou, Tao Li, S. S. Iyengar, Larisa Shwartz, Genady Ya. Grabarnik
Published in: IEEE Trans. Knowl. Data Eng. (2019)
Keyphrases
  • multi-armed bandits
  • collaborative filtering
  • online learning
  • recommender systems
  • matrix factorization
  • decision making
  • reinforcement learning