Speed up the inference of diffusion models via shortcut MCMC sampling.
Gang Chen
Published in: CoRR (2023)
Keyphrases
- Markov chain Monte Carlo
- diffusion models
- sampling algorithm
- posterior distribution
- Gibbs sampler
- Markov chain
- Bayesian inference
- diffusion model
- Gibbs sampling
- parameter estimation
- Monte Carlo
- generative model
- Metropolis-Hastings
- information diffusion
- particle filter
- posterior probability
- social networks
- approximate inference
- particle filtering
- influence maximization
- probability distribution
- random walk
- viral marketing
- greedy algorithm
- Metropolis-Hastings algorithm
- steady state
- probabilistic model
- optical flow
- image processing
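Several of the keyphrases above (Markov chain Monte Carlo, Metropolis-Hastings, random walk) refer to the sampling machinery underlying the paper. As a generic illustration only, and not the paper's "shortcut MCMC" method (whose details are not given here), the following is a minimal random-walk Metropolis-Hastings sampler; the function name and step size are illustrative choices, assuming a one-dimensional target density known up to a constant.

```python
import math
import random

def metropolis_hastings(log_prob, x0, n_steps, step_size=0.5):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized log-density."""
    x = x0
    lp = log_prob(x)
    samples = []
    for _ in range(n_steps):
        # Propose a Gaussian perturbation of the current state.
        proposal = x + random.gauss(0.0, step_size)
        lp_new = log_prob(proposal)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(random.random()) < lp_new - lp:
            x, lp = proposal, lp_new
        samples.append(x)
    return samples

# Example target: standard normal, log-density up to an additive constant.
random.seed(0)
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

After many steps the chain's empirical mean approaches the target mean (here 0); diffusion-model samplers aim to cut down exactly this kind of long iterative chain.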