Contextual Bandits for adapting to changing User preferences over time.
Dattaraj Rao
Published in: CoRR (2020)
Keyphrases
- user preferences
- user behavior
- collaborative filtering
- user profiles
- recommender systems
- contextual information
- user feedback
- user interests
- preference models
- user-specific
- user behaviour
- recommendation systems
- multi-armed bandits
- stochastic systems
- personalized recommendation
- preference model
- skyline queries
- dynamic programming
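The listing itself does not describe the method, but as a rough illustration of the keyphrases above (contextual bandits, personalized recommendation, user feedback), the following is a minimal sketch and not the paper's implementation: an epsilon-greedy contextual bandit with per-arm online linear reward models, whose incremental updates let it track drifting user preferences. All class, parameter, and variable names here are illustrative assumptions.

```python
import numpy as np


class EpsilonGreedyContextualBandit:
    """Per-arm online linear reward models with epsilon-greedy exploration."""

    def __init__(self, n_arms, n_features, epsilon=0.1, lr=0.05, seed=0):
        self.epsilon = epsilon                  # exploration rate
        self.lr = lr                            # step size for online updates
        self.rng = np.random.default_rng(seed)
        # one linear reward model (weight vector) per arm / recommendation
        self.weights = np.zeros((n_arms, n_features))

    def select_arm(self, context):
        # explore with probability epsilon, otherwise exploit the arm
        # whose linear model predicts the highest reward for this context
        if self.rng.random() < self.epsilon:
            return int(self.rng.integers(len(self.weights)))
        return int(np.argmax(self.weights @ context))

    def update(self, arm, context, reward):
        # online (stochastic-gradient) update of the chosen arm's model,
        # so recent feedback gradually overrides stale preference signals
        error = reward - self.weights[arm] @ context
        self.weights[arm] += self.lr * error * context


# Toy usage: contexts stand in for user features, rewards for click feedback.
bandit = EpsilonGreedyContextualBandit(n_arms=3, n_features=4)
rng = np.random.default_rng(1)
for _ in range(1000):
    context = rng.normal(size=4)
    arm = bandit.select_arm(context)
    reward = float(rng.random() < 0.5)          # placeholder feedback signal
    bandit.update(arm, context, reward)
```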