Simulating Stable Stochastic Systems, II: Markov Chains.
Michael A. Crane, Donald L. Iglehart
Published in: J. ACM (1974)
Keyphrases
- Markov chain
- stochastic systems
- sample path
- confidence intervals
- steady state
- finite state
- Monte Carlo
- stationary distribution
- transition probabilities
- stochastic models
- Markov process
- Markov processes
- random walk
- state space
- stochastic process
- Markov model
- probabilistic automata
- importance sampling
- maximum likelihood
- probability distribution
- machine learning
- graphical models
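To make the keyphrases above concrete (Markov chain, Monte Carlo, sample path, steady state, confidence intervals, finite state), here is a minimal sketch of simulating a finite-state Markov chain and forming a confidence interval for a steady-state mean via regenerative cycles. This is not the authors' code or data: the transition matrix, reward vector, regeneration state, and cycle-based ratio estimator below are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch (illustrative, not from the paper): simulate a small
# finite-state Markov chain and estimate a steady-state mean reward with a
# confidence interval built from regenerative cycles (returns to state 0).

rng = np.random.default_rng(0)

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])   # hypothetical transition probabilities
f = np.array([1.0, 2.0, 5.0])     # hypothetical reward per visit to each state

n_cycles = 10_000                 # number of regenerative cycles to simulate
cycle_rewards = np.zeros(n_cycles)
cycle_lengths = np.zeros(n_cycles)

state = 0                         # each return to state 0 starts a new cycle
for i in range(n_cycles):
    reward, length = 0.0, 0
    while True:                   # simulate one cycle of the sample path
        reward += f[state]
        length += 1
        state = rng.choice(3, p=P[state])
        if state == 0:            # chain regenerated: close the cycle
            break
    cycle_rewards[i] = reward
    cycle_lengths[i] = length

# Ratio estimator of the steady-state mean: E[cycle reward] / E[cycle length]
r_hat = cycle_rewards.mean() / cycle_lengths.mean()

# Approximate standard error of the ratio estimator (delta method)
z = cycle_rewards - r_hat * cycle_lengths
se = z.std(ddof=1) / (cycle_lengths.mean() * np.sqrt(n_cycles))

print(f"steady-state mean estimate: {r_hat:.4f} +/- {1.96 * se:.4f} (95% CI)")
```

Because cycles between returns to the regeneration state are independent and identically distributed, the central limit theorem applies to the per-cycle quantities, which is what justifies the normal-approximation confidence interval in this sketch.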