Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence.
Ghassen Jerfel, Serena Wang, Clara Fannjiang, Katherine A. Heller, Yi-An Ma, Michael I. Jordan
Published in: CoRR (2021)
Keyphrases
- importance sampling
- Kullback-Leibler divergence
- Monte Carlo
- mutual information
- information theoretic
- information theory
- Kalman filter
- probability density function
- Markov chain
- particle filter
- distance measure
- image segmentation
- particle filtering
- marginal distributions
- optical flow
- machine learning
- approximate inference
- Markov chain Monte Carlo
- diffusion tensor
- computer vision
- posterior distribution
- probability distribution
- image processing
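The entry above is metadata only; the paper's text is not reproduced here. Still, the central technique named in the title and keyphrases, importance sampling, can be illustrated with a minimal self-contained sketch. This is not the authors' method: it simply estimates an expectation under a standard normal target by sampling from a wider Gaussian proposal and reweighting by the density ratio. All function names, the target N(0, 1), and the proposal N(0, 4) are illustrative assumptions.

```python
import math
import random

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of a Gaussian N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def importance_sampling_mean(f, n=100_000, seed=0):
    """Estimate E_p[f(x)] for target p = N(0, 1) by drawing from a
    wider proposal q = N(0, 2^2) and reweighting each sample by the
    importance weight w(x) = p(x) / q(x). (Illustrative sketch, not
    the variationally refined proposal from the paper.)"""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)                        # sample from proposal q
        w = normal_pdf(x) / normal_pdf(x, 0.0, 2.0)    # importance weight p/q
        total += w * f(x)
    return total / n

# E_p[x^2] = 1 for a standard normal target, so the estimate
# should be close to 1 for large n.
estimate = importance_sampling_mean(lambda x: x * x)
```

The quality of such an estimator depends heavily on how well the proposal q matches the target p; proposals that are too narrow produce high-variance weights, which is the mismatch that variational refinement of the proposal (the subject of the paper) aims to reduce.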