A Multi-armed Bandit MCMC, with applications in sampling from doubly intractable posterior.
Guanyang Wang
Published in: CoRR (2019)
Keyphrases
- Markov chain Monte Carlo
- multi-armed bandit
- approximate inference
- posterior distribution
- parameter estimation
- multi-armed bandits
- posterior probability
- sampling algorithm
- Metropolis-Hastings
- Monte Carlo
- Markov chain
- generative model
- graphical models
- reinforcement learning
- Bayesian inference
- Metropolis-Hastings algorithm
- Gaussian process
- decentralized decision making
- probabilistic inference
- particle filter
- particle filtering
- Bayesian networks
- proposal distribution
- latent variables
- belief propagation
- least squares
- data association
- variational methods
- probability distribution
- Bayesian framework
- probabilistic model
- computational complexity
- Markov random field
- exponential family
- bandit problems
- support vector machine