Unreliable Multi-Armed Bandits: A Novel Approach to Recommendation Systems
Aditya Narayan Ravi, Pranav Poduval, Sharayu Moharir
Published in: CoRR (2019)
Keyphrases
- recommendation systems
- multi-armed bandits
- collaborative filtering
- bandit problems
- user preferences
- recommender systems
- web search
- social recommendation
- search engine
- user feedback
- recommendation quality
- personalized recommendation
- multi-armed bandit
- collaborative filtering recommendation algorithm
- machine learning
- decision making
- online stores
- information retrieval systems
- pairwise