Spoiled for Choice? Personalized Recommendation for Healthcare Decisions: A Multi-Armed Bandit Approach.
Tongxin Zhou, Yingfei Wang, Lu Lucy Yan, Yong Tan
Published in: CoRR (2020)
Keyphrases
- personalized recommendation
- multi-armed bandit
- multi-armed bandits
- recommender systems
- user interests
- collaborative filtering
- decision making
- collaborative filtering recommendation
- user feedback
- reinforcement learning
- user preferences
- recommendation systems
- web users
- website
- nearest neighbor
- recommendation algorithms