Indexability of Finite State Restless Bandits.
Vishesh Mittal, Rahul Meshram, Deepak Dev, Surya Prakash
Published in: COMSNETS (2024)
Keyphrases
- finite state
- semi-Markov
- Markov chain
- Markov decision processes
- optimal control
- stochastic systems
- model checking
- optimal policy
- average cost
- context free
- partially observable Markov decision processes
- least squares
- vector quantizer
- action sets
- transition systems
- decision problems
- finite state transducers
- multi-armed bandit
- continuous-time Bayesian networks