Learn Quasi-Stationary Distributions of Finite State Markov Chain.
Zhiqiang Cai, Ling Lin, Xiang Zhou. Published in: Entropy (2022)
Keyphrases
- Markov chain
- finite state
- stationary distribution
- Markov process
- steady state
- transition probabilities
- random walk
- state space
- product form
- Monte Carlo
- Markov model
- Monte Carlo method
- state dependent
- transition matrix
- Markov chain Monte Carlo
- partially observable Markov decision processes
- special case
- learning algorithm
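For context on the topic named in the title: for a finite-state Markov chain with an absorbing state, the quasi-stationary distribution (QSD) is the distribution over transient states that is invariant under the dynamics conditioned on non-absorption. It can be computed directly as the left Perron eigenvector of the sub-stochastic transition matrix restricted to the transient states, normalized to sum to one. The matrix `Q` below is a made-up illustrative example, not taken from the paper; this is a minimal NumPy sketch of the standard eigenvector characterization, not the learning algorithm the paper proposes.

```python
import numpy as np

# Hypothetical 3 transient states plus an implicit absorbing state.
# Q holds transition probabilities among the transient states only,
# so each row sums to less than 1 (the deficit is absorption mass).
Q = np.array([
    [0.5, 0.3, 0.1],
    [0.2, 0.5, 0.2],
    [0.1, 0.3, 0.5],
])

# The QSD is the left eigenvector of Q for its largest (Perron)
# eigenvalue, normalized into a probability vector.
eigvals, eigvecs = np.linalg.eig(Q.T)   # eig of Q.T gives left eigenvectors of Q
k = np.argmax(eigvals.real)             # index of the Perron eigenvalue
lam = eigvals[k].real                   # survival rate per step
qsd = np.abs(eigvecs[:, k].real)
qsd /= qsd.sum()

# Invariance check: conditioning one step of the chain on survival
# maps the QSD to itself, i.e. qsd @ Q == lam * qsd.
print(qsd)
print(np.allclose(qsd @ Q, lam * qsd))
```

For large sparse chains, the same eigenvector is usually obtained iteratively (e.g. power iteration with renormalization) rather than by a dense eigendecomposition.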