A Hidden Markov Restless Multi-armed Bandit Model for Playout Recommendation Systems.
Rahul Meshram
Aditya Gopalan
D. Manjunath
Published in: COMSNETS (Revised Selected Papers and Invited Papers), 2017
Keyphrases
probabilistic model
recommendation systems
information retrieval
probability distribution
machine learning
closed form