Diverse User Preference Elicitation with Multi-Armed Bandits.
Javier Parapar, Filip Radlinski. Published in: WSDM (2021)
Keyphrases
- user preferences
- multi-armed bandits
- user behavior
- bandit problems
- user profiles
- user feedback
- recommender systems
- collaborative filtering
- preference model
- recommendation systems
- user behaviour
- utility function
- active learning
- social influence
- collaborative recommendation
- multi-armed bandit
- objective function
- recommendation algorithms
- pairwise