Beyond random walk and Metropolis-Hastings samplers: why you should not backtrack for unbiased graph sampling.
Chul-Ho Lee, Xin Xu, Do Young Eun
Published in: SIGMETRICS (2012)
Keyphrases
- random walk
- metropolis hastings
- markov chain monte carlo
- markov chain
- sampling algorithm
- random sampling
- posterior distribution
- directed graph
- particle filtering
- flow graph
- bayesian inference
- variational approximation
- graph laplacian
- generative model
- posterior probability
- monte carlo
- parameter estimation
- transition probability matrix
- search tree
- nodes of a graph
- particle filter
- state space
- machine learning
- spectral graph partitioning
- approximate inference
- probability distribution
- proposal distribution
- hypergraph
- expectation maximization
- higher order
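As the title and keyphrases suggest, the paper concerns unbiased graph sampling with non-backtracking random walks. The sketch below is only a rough illustration of that idea, not the paper's exact algorithm: it runs a non-backtracking random walk on a hypothetical toy undirected graph (a plain adjacency dict) and reweights visited nodes by 1/degree so the estimate targets the uniform node distribution. All function names and the example graph are illustrative assumptions.

```python
import random

def non_backtracking_walk(adj, start, steps, rng=random):
    """Walk on an undirected graph, at each step choosing a uniform neighbor
    other than the node just left (backtracking only when forced at degree-1 nodes)."""
    walk = [start]
    prev, cur = None, start
    for _ in range(steps):
        neighbors = list(adj[cur])
        if prev is not None and len(neighbors) > 1:
            neighbors = [v for v in neighbors if v != prev]
        nxt = rng.choice(neighbors)
        prev, cur = cur, nxt
        walk.append(cur)
    return walk

def reweighted_estimate(adj, walk, f):
    """Importance-reweight samples by 1/degree: the walk visits nodes roughly in
    proportion to degree, so this ratio estimator targets the uniform node law."""
    num = sum(f(v) / len(adj[v]) for v in walk)
    den = sum(1.0 / len(adj[v]) for v in walk)
    return num / den

if __name__ == "__main__":
    # Toy undirected graph (hypothetical example data).
    adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2, 4}, 4: {3}}
    walk = non_backtracking_walk(adj, start=0, steps=5000)
    # Estimate the average degree under the uniform node distribution.
    est = reweighted_estimate(adj, walk, f=lambda v: len(adj[v]))
    true = sum(len(nbrs) for nbrs in adj.values()) / len(adj)
    print(f"estimated avg degree: {est:.3f}, true: {true:.3f}")
```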