Joint Continuous and Impulsive Control of Markov Chains.
Alexander B. Miller, Boris M. Miller, Karen V. Stepanyan. Published in: MED (2018)
Keyphrases
- markov chain
- markov processes
- transition probabilities
- steady state
- finite state
- monte carlo
- stochastic process
- probabilistic automata
- markov process
- stationary distribution
- monte carlo method
- random walk
- control theory
- markov model
- state space
- sample path
- pattern matching
- transition matrix
- queueing theory
- assemble to order systems
- finite automata
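To ground a few of the keyphrases above (transition matrix, sample path, monte carlo, stationary distribution), here is a minimal illustrative sketch of simulating a finite-state Markov chain and estimating its stationary distribution by Monte Carlo. The 3-state transition matrix `P` and the helper functions `simulate_chain` and `stationary_exact` are hypothetical examples, not the control model from the paper itself.

```python
# Illustrative sketch only (not from the paper): simulate a finite-state
# Markov chain from a transition matrix and estimate its stationary
# distribution by Monte Carlo; the 3-state chain below is made up.
import numpy as np

# Hypothetical transition matrix of a 3-state chain (each row sums to 1).
P = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

def simulate_chain(P, x0, n_steps, rng):
    """Return one sample path of length n_steps + 1 started at state x0."""
    path = [x0]
    x = x0
    for _ in range(n_steps):
        x = rng.choice(len(P), p=P[x])  # draw next state from row x of P
        path.append(x)
    return np.array(path)

def stationary_exact(P):
    """Stationary distribution as the left eigenvector of P for eigenvalue 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    path = simulate_chain(P, x0=0, n_steps=100_000, rng=rng)
    # Monte Carlo estimate: long-run fraction of time spent in each state.
    mc_estimate = np.bincount(path, minlength=len(P)) / len(path)
    print("Monte Carlo estimate: ", np.round(mc_estimate, 3))
    print("Exact stationary dist:", np.round(stationary_exact(P), 3))
```

For an ergodic chain the empirical state frequencies along a long sample path converge to the stationary distribution, which is why the Monte Carlo estimate and the eigenvector computation agree up to sampling error.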