Unreliable Multi-Armed Bandits: A Novel Approach to Recommendation Systems.
Aditya Narayan Ravi, Pranav Poduval, Sharayu Moharir. Published in: COMSNETS (2020)
Keyphrases
- recommendation systems
- multi-armed bandits
- collaborative filtering
- bandit problems
- recommender systems
- user preferences
- multi-armed bandit
- search engine
- collaborative filtering recommendation algorithm
- user feedback
- recommendation quality
- web search
- recommendation algorithms
- social recommendation
- objective function
- decision problems
- median filter
- personalized recommendation
- dynamic programming
- decision making
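To make the multi-armed-bandit framing of recommendation concrete, the sketch below shows a standard epsilon-greedy bandit choosing among a few items with unknown click-through rates. This is a generic illustration of the keyphrases above, not the paper's algorithm; the arm count, the `ctr` values, and the `epsilon_greedy` helper are all hypothetical.

```python
import random

def epsilon_greedy(n_arms, reward_fn, n_rounds, epsilon=0.1, seed=0):
    """With probability epsilon explore a random arm; otherwise pull the
    arm with the best empirical mean reward so far (untried arms first)."""
    rng = random.Random(seed)
    counts = [0] * n_arms        # pulls per arm
    sums = [0.0] * n_arms        # cumulative reward per arm
    total = 0.0
    for _ in range(n_rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)
        else:
            # Empirical mean; unpulled arms get +inf so they are tried once.
            means = [sums[i] / counts[i] if counts[i] else float("inf")
                     for i in range(n_arms)]
            arm = max(range(n_arms), key=lambda i: means[i])
        r = reward_fn(arm, rng)
        counts[arm] += 1
        sums[arm] += r
        total += r
    return counts, total

# Hypothetical Bernoulli click-through rates for three recommendable items.
ctr = [0.1, 0.5, 0.3]
counts, total = epsilon_greedy(
    3, lambda a, rng: 1.0 if rng.random() < ctr[a] else 0.0, 2000)
```

In a recommendation setting each "arm" is an item, a pull is a recommendation shown to a user, and the reward is the observed feedback (e.g. a click); the paper's "unreliable" variant presumably modifies how that feedback is trusted, which this generic sketch does not model.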