Keyphrases
- Markov chain
- Gibbs sampling
- Markov chain Monte Carlo
- Gibbs sampler
- steady state
- transition probabilities
- Markov process
- finite state
- stationary distribution
- stochastic process
- random walk
- Monte Carlo method
- Monte Carlo
- state space
- Monte Carlo simulation
- Markov model
- transition matrix
- probabilistic inference
- Bayesian inference
- belief networks
- Bayesian networks
- maximum likelihood
- neural network