Graph-Enhanced Hybrid Sampling for Multi-Armed Bandit Recommendation.

Fen Wang, Taihao Li, Wuyue Zhang, Xue Zhang, Cheng Yang
Published in: ICASSP (2024)
Keyphrases
  • multi-armed bandit
  • multi-armed bandits
  • reinforcement learning
  • collaborative filtering
  • recommender systems
  • random sampling
  • regret bounds
  • e-learning
  • least squares
  • mutual information
  • sample size