Multi-Armed Bandits in Recommendation Systems: A survey of the state-of-the-art and future directions.
Nícollas Silva, Heitor Werneck, Thiago Silva, Adriano C. M. Pereira, Leonardo Rocha
Published in: Expert Syst. Appl. (2022)
Keyphrases
- future directions
- recommendation systems
- multi-armed bandits
- collaborative filtering
- bandit problems
- user preferences
- lessons learned
- web search
- recommender systems
- current challenges
- current status
- multi-armed bandit
- user feedback
- recommendation quality
- collaborative filtering recommendation algorithm
- current trends
- search engine
- social recommendation
- reinforcement learning
- advanced technologies
- pairwise