MCMC: Sampling through Exploration Exploitation.
Evgeny Lagutin, Daniil Selikhanovych, Achille Thin, Sergey Samsonov, Alexey Naumov, Denis Belomestny, Maxim Panov, Eric Moulines. Published in: CoRR (2021)
Keyphrases
- exploration-exploitation
- active learning
- Markov chain Monte Carlo
- bandit problems
- Monte Carlo
- reinforcement learning
- relevance feedback
- Markov chain
- sampling algorithm
- random sampling
- Metropolis-Hastings
- parameter estimation
- Markov chain Monte Carlo sampling
- high-dimensional state space
- multiple features
- posterior probability
- posterior distribution
- importance sampling
- generative model
- semi-supervised
- image retrieval
- Metropolis-Hastings algorithm
- decision problems
- state space
- probability distribution
- special case
- feature space
- decision trees
- learning algorithm
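For context on the "Metropolis-Hastings" and "Markov chain Monte Carlo sampling" keyphrases above, a minimal sketch of standard random-walk Metropolis-Hastings follows. This is a generic textbook baseline, not the paper's own exploration-exploitation sampler; the function name and parameters are illustrative choices.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0):
    """Random-walk Metropolis-Hastings over a 1-D unnormalized log-density.

    log_target: log of the target density, up to an additive constant.
    x0: initial state; n_steps: chain length; step: proposal std-dev.
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        # Symmetric Gaussian proposal, so the Hastings correction cancels.
        proposal = x + random.gauss(0.0, step)
        log_alpha = log_target(proposal) - log_target(x)
        # Accept with probability min(1, exp(log_alpha)).
        if math.log(random.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Example target: standard normal, log-density -x^2/2 up to a constant.
random.seed(0)
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

With a symmetric proposal the acceptance ratio reduces to the target-density ratio; the resulting chain's empirical mean and variance should approach those of the standard normal (0 and 1) as `n_steps` grows.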