Switch-based Markov Chains for Sampling Hamiltonian Cycles in Dense Graphs.
Pieter Kleer, Viresh Patel, Fabian Stroh
Published in: CoRR (2020)
Keyphrases
- markov chain
- monte carlo
- markov chain monte carlo
- steady state
- importance sampling
- gibbs sampler
- markov model
- transition probabilities
- random walk
- stationary distribution
- state space
- finite state
- monte carlo simulation
- monte carlo method
- markov process
- markov processes
- probabilistic automata
- stochastic process
- transition matrix
- gibbs sampling
- search algorithm
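The keyphrases above describe the general MCMC vocabulary behind the paper (random walk, transition probabilities, stationary distribution). As a purely illustrative aside, and not the switch chain analyzed by Kleer, Patel and Stroh, the sketch below shows a generic Metropolis-style walk over Hamiltonian cycles using a 2-opt "switch" proposal; the function names, the acceptance rule, and the toy graph are assumptions made for this example only.

```python
import random

def two_opt_switch(cycle):
    """Propose a 'switch': reverse a randomly chosen segment of the cycle (2-opt)."""
    n = len(cycle)
    i, j = sorted(random.sample(range(n), 2))
    return cycle[:i] + cycle[i:j + 1][::-1] + cycle[j + 1:]

def is_hamiltonian_cycle(cycle, adj):
    """Check that consecutive vertices (cyclically) are joined by edges of the graph."""
    n = len(cycle)
    return all(cycle[(k + 1) % n] in adj[cycle[k]] for k in range(n))

def sample_hamiltonian_cycle(adj, start_cycle, steps=100_000, seed=0):
    """Metropolis-style random walk over Hamiltonian cycles of the graph `adj`.

    The 2-opt proposal is symmetric, and a proposed tour is accepted only if it
    is still a Hamiltonian cycle of the graph, so the stationary distribution is
    uniform over the cycles reachable from `start_cycle`.
    """
    random.seed(seed)
    cycle = list(start_cycle)
    for _ in range(steps):
        proposal = two_opt_switch(cycle)
        if is_hamiltonian_cycle(proposal, adj):
            cycle = proposal  # accept; otherwise keep the current cycle
    return cycle

if __name__ == "__main__":
    # Toy dense graph: complete graph on 6 vertices with the edge (0, 3) removed.
    n = 6
    adj = {v: {u for u in range(n) if u != v} for v in range(n)}
    adj[0].discard(3)
    adj[3].discard(0)
    print(sample_hamiltonian_cycle(adj, start_cycle=range(n), steps=20_000))
```

This sketch only illustrates the keyphrase concepts; questions such as irreducibility of the walk and its mixing time on dense graphs are exactly what the cited paper studies and are not addressed here.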