A New Lower Bound for Kullback-Leibler Divergence Based on Hammersley-Chapman-Robbins Bound
Tomohiro Nishiyama
Published in: CoRR (2019)
Keyphrases
- kullback leibler divergence
- lower bound
- upper bound
- mutual information
- information theoretic
- information theory
- distance measure
- kl divergence
- probability density function
- optimal solution
- objective function
- marginal distributions
- high dimensional
- diffusion tensor
- probabilistic model
- probability density
- training data
- computer vision
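As a minimal numerical sketch of the quantities the title and keyphrases refer to, the snippet below computes the Kullback-Leibler divergence, the total variation distance, and the chi-squared divergence (the quantity appearing in the denominator of the Hammersley-Chapman-Robbins variance bound) for two small discrete distributions, and checks two classical bounds: Pinsker's lower bound and the Jensen-based chi-squared upper bound. The distributions are hypothetical example values, and these are standard textbook bounds, not the new HCR-based lower bound derived in the paper itself.

```python
import math

# Two discrete distributions on the same 3-point support
# (hypothetical values, chosen only for illustration).
p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

# Kullback-Leibler divergence D(P||Q), in nats.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Total variation distance (sup-norm form).
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Chi-squared divergence chi^2(P||Q), which appears in the
# denominator of the Hammersley-Chapman-Robbins variance bound.
chi2 = sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Classical sandwich: Pinsker's inequality gives a lower bound,
# and Jensen's inequality gives the chi-squared upper bound.
assert 2 * tv ** 2 <= kl <= math.log(1 + chi2)
print(f"2*TV^2 = {2 * tv ** 2:.4f} <= KL = {kl:.4f} "
      f"<= log(1+chi2) = {math.log(1 + chi2):.4f}")
```

The sandwich illustrates why tighter lower bounds on KL divergence are of interest: Pinsker's bound can be loose, and the paper's contribution is a sharper lower bound obtained via the HCR machinery.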