Marginal Posterior Sampling for Slate Bandits.
Maria Dimakopoulou, Nikos Vlassis, Tony Jebara. Published in: IJCAI (2019)
Keyphrases
- probability distribution
- markov chain monte carlo
- metropolis hastings
- sampling strategy
- random sampling
- posterior distribution
- multi armed bandit
- stochastic systems
- sampling rate
- posterior probability
- sample size
- sampling algorithm
- parameter space
- bayesian framework
- sampling methods
- importance sampling
- sampled data
- machine learning
- gaussian process
- image reconstruction
- monte carlo
- parameter estimation
- markov chain
- least squares
- probabilistic model
- multiscale
- image segmentation
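Several of the keyphrases above (multi armed bandit, posterior distribution, bayesian framework) center on posterior sampling for bandits. As a generic illustration only — not the paper's slate-specific algorithm — the following is a minimal sketch of Beta-Bernoulli Thompson sampling, where each arm keeps an independent marginal posterior and the arm with the highest posterior sample is pulled. All names and parameters here are hypothetical choices for the sketch.

```python
import random

def thompson_sampling(true_probs, rounds=5000, seed=0):
    """Generic Beta-Bernoulli Thompson sampling for a multi-armed bandit.

    Each arm keeps a Beta(successes + 1, failures + 1) posterior (uniform
    prior). Every round, one sample is drawn from each arm's marginal
    posterior and the arm with the largest sample is pulled.
    """
    rng = random.Random(seed)
    k = len(true_probs)
    wins = [0] * k
    losses = [0] * k
    for _ in range(rounds):
        # Draw one sample from each arm's Beta posterior.
        samples = [rng.betavariate(wins[i] + 1, losses[i] + 1) for i in range(k)]
        arm = max(range(k), key=lambda i: samples[i])
        # Simulate a Bernoulli reward from the (unknown-to-the-agent) true rate.
        if rng.random() < true_probs[arm]:
            wins[arm] += 1
        else:
            losses[arm] += 1
    # Return how often each arm was pulled; play concentrates on the best arm.
    return [wins[i] + losses[i] for i in range(k)]

pulls = thompson_sampling([0.2, 0.5, 0.8])
```

With three arms of true success rates 0.2, 0.5, and 0.8, the pull counts concentrate on the third arm as its posterior sharpens.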