A Contextual Multi-armed Bandit Approach Based on Implicit Feedback for Online Recommendation.
Yongquan Wan, Junli Xian, Cairong Yan. Published in: KMO (2021)
Keyphrases
- implicit feedback
- contextual information
- recommender systems
- collaborative filtering
- multi-armed bandit
- explicit feedback
- user behavior
- eye tracking
- user feedback
- personalized recommendation
- matrix factorization
- web search
- search result
- item recommendation
- online learning
- relevance feedback
- user preferences
- latent factors
- machine learning
- user interests
- recommendation systems
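
For orientation only: the keyphrases point to a contextual multi-armed bandit recommender driven by implicit feedback. Below is a minimal, generic LinUCB-style sketch of that idea, not the paper's actual method. The class name `LinUCBRecommender` and parameters such as `n_items`, `dim`, and `alpha` are illustrative assumptions; implicit feedback (e.g. click = 1, no click = 0) is treated as the reward signal.

```python
# Generic LinUCB-style contextual bandit sketch (illustrative, not the paper's
# exact algorithm): each item (arm) keeps a ridge-regression model of expected
# reward given a context vector; implicit feedback supplies the reward.
import numpy as np


class LinUCBRecommender:
    def __init__(self, n_items: int, dim: int, alpha: float = 1.0):
        self.alpha = alpha                                # exploration strength
        self.A = [np.eye(dim) for _ in range(n_items)]    # per-arm Gram matrices
        self.b = [np.zeros(dim) for _ in range(n_items)]  # per-arm reward vectors

    def recommend(self, context: np.ndarray) -> int:
        """Pick the item with the highest upper confidence bound."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                              # ridge-regression estimate
            mean = theta @ context
            bonus = self.alpha * np.sqrt(context @ A_inv @ context)
            scores.append(mean + bonus)
        return int(np.argmax(scores))

    def update(self, item: int, context: np.ndarray, reward: float) -> None:
        """Update the chosen arm with observed implicit feedback (e.g. a click)."""
        self.A[item] += np.outer(context, context)
        self.b[item] += reward * context


# Usage: simulate a stream of user contexts with binary implicit feedback.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bandit = LinUCBRecommender(n_items=5, dim=8, alpha=0.5)
    true_theta = rng.normal(size=(5, 8))                   # hidden item preferences
    for _ in range(1000):
        ctx = rng.normal(size=8)
        item = bandit.recommend(ctx)
        click = float(rng.random() < 1 / (1 + np.exp(-true_theta[item] @ ctx)))
        bandit.update(item, ctx, click)
```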