Fast Distributed Bandits for Online Recommendation Systems
Kanak Mahadik, Qingyun Wu, Shuai Li, Amit Sabne. Published in: CoRR (2020)
Keyphrases
- recommendation systems
- collaborative filtering
- web personalization
- recommender systems
- user preferences
- online stores
- user modeling
- online learning
- web search
- distributed systems
- recommendation quality
- search engine
- peer to peer
- personalized recommendation
- social recommendation
- multi armed bandits
- collaborative filtering recommendation algorithm
- user behavior
- expert systems
- feature space
- learning algorithm
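Several keyphrases above (multi armed bandits, online learning, recommendation quality) point at the same core idea: treating item recommendation as a bandit problem, where each round the system picks an item and observes click feedback. As a minimal, hedged illustration of that idea (not the paper's distributed algorithm), the sketch below runs the standard UCB1 rule over a few simulated items with assumed click probabilities; all names and parameters are illustrative.

```python
import math
import random

def ucb1(n_arms, click_probs, horizon, seed=0):
    """UCB1: each round, play the arm maximizing
    empirical mean + sqrt(2 ln t / pulls_i).
    click_probs are assumed Bernoulli click rates (simulation only)."""
    rng = random.Random(seed)
    counts = [0] * n_arms   # pulls per arm
    sums = [0.0] * n_arms   # accumulated reward per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialize: play each arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda i: sums[i] / counts[i]
                + math.sqrt(2 * math.log(t) / counts[i]),
            )
        # Simulated user feedback: 1 = click, 0 = no click
        r = 1.0 if rng.random() < click_probs[arm] else 0.0
        counts[arm] += 1
        sums[arm] += r
        total += r
    return counts, total

# Three hypothetical items with click rates 0.2, 0.5, 0.8
counts, total = ucb1(3, [0.2, 0.5, 0.8], 2000)
```

Over 2000 rounds UCB1 concentrates pulls on the highest-click-rate item while still occasionally exploring the others; the paper's contribution (per its title) is making this kind of bandit learning fast and distributed, which this single-machine sketch does not attempt to show.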