Contextual-Bandit Based Personalized Recommendation with Time-Varying User Interests
Xiao Xu, Fang Dong, Yanghua Li, Shaojian He, Xin Li. Published in: AAAI (2020)
Keyphrases
- user interests
- personalized recommendation
- contextual bandit
- upper confidence bound
- user behavior
- user profiles
- news recommendation
- recommender systems
- user profiling
- collaborative recommendation
- collaborative filtering recommendation
- collaborative filtering
- news items
- web content
- user model
- web personalization
- user preferences
- implicit feedback
- information retrieval
- user feedback
- data mining
- query expansion
- probabilistic model
- website
- feature selection
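Several of the keyphrases above (contextual bandit, upper confidence bound, news recommendation) refer to the LinUCB family of algorithms commonly used for personalized recommendation. As an illustration only, here is a minimal sketch of the disjoint LinUCB variant, which fits one ridge-regression reward model per arm (e.g. per news item) and selects the arm with the highest upper confidence bound on predicted reward. This is a generic sketch, not the time-varying algorithm proposed in the paper itself; all names and parameters are illustrative.

```python
import numpy as np

class LinUCB:
    """Disjoint LinUCB sketch: one linear reward model per arm.

    For each arm a, maintain A_a = I + sum(x x^T) and b_a = sum(r x)
    over the rounds in which a was played; the ridge estimate is
    theta_a = A_a^{-1} b_a, and the arm with the highest
    theta_a^T x + alpha * sqrt(x^T A_a^{-1} x) is selected.
    """

    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha                                # exploration weight
        self.A = [np.eye(dim) for _ in range(n_arms)]     # per-arm design matrices
        self.b = [np.zeros(dim) for _ in range(n_arms)]   # per-arm reward vectors

    def select(self, x):
        """Return the index of the arm with the highest UCB for context x."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                             # ridge-regression estimate
            ucb = theta @ x + self.alpha * np.sqrt(x @ A_inv @ x)
            scores.append(ucb)
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        """Fold the observed (context, reward) pair into the chosen arm's model."""
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x
```

In a recommendation setting, `x` would be a feature vector built from the user profile and candidate item, and `reward` an implicit-feedback signal such as a click. The paper's contribution concerns interests that drift over time, which a static model like this does not capture; approaches in that vein typically discount or window old observations.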