Learn Quasi-stationary Distributions of Finite State Markov Chain.
Zhiqiang Cai, Ling Lin, Xiang Zhou. Published in: CoRR (2021)
Keyphrases
- markov chain
- finite state
- stationary distribution
- markov process
- steady state
- transition probabilities
- product form
- state dependent
- markov model
- random walk
- monte carlo
- partially observable markov decision processes
- state space
- monte carlo method
- markov models
- average cost
- queue length
- generative model
- expectation maximization
- transition matrix
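
Since the entry only names the topic, here is a minimal illustrative sketch of the object the title refers to: for an absorbing finite-state Markov chain, the quasi-stationary distribution is the normalized left Perron eigenvector of the substochastic transition matrix restricted to the transient states, which can be approximated by power iteration. The matrix `Q` below is a made-up example, and this generic textbook construction is not the learning method proposed in the paper.

```python
# Illustrative only (assumed example, not the paper's algorithm):
# compute the quasi-stationary distribution (QSD) of a finite-state
# absorbing Markov chain as the normalized left Perron eigenvector
# of the substochastic block Q of its transition matrix.
import numpy as np

# Hypothetical 3-state transient block; rows sum to less than 1
# because the remaining mass leads to an absorbing state.
Q = np.array([
    [0.5, 0.3, 0.1],
    [0.2, 0.5, 0.2],
    [0.1, 0.3, 0.5],
])

def quasi_stationary_distribution(Q, tol=1e-12, max_iter=10_000):
    """Power iteration for the left Perron eigenvector of a substochastic Q."""
    nu = np.full(Q.shape[0], 1.0 / Q.shape[0])  # uniform initial guess
    for _ in range(max_iter):
        nxt = nu @ Q            # one step of the chain, restricted to transient states
        nxt /= nxt.sum()        # renormalize: condition on non-absorption
        if np.linalg.norm(nxt - nu, 1) < tol:
            break
        nu = nxt
    return nu

nu = quasi_stationary_distribution(Q)
print("QSD:", nu)  # nonnegative, sums to 1, and satisfies nu @ Q ≈ lambda * nu
```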