A lower bound on the differential entropy for log-concave random variables with applications to rate-distortion theory.
Arnaud Marsiglietti, Victoria Kostina
Published in: ISIT (2017)
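The title refers to lower-bounding the differential entropy h(X) of a log-concave random variable in terms of its variance; this entry does not reproduce the paper's exact bound, so none is stated here. As a purely illustrative sketch (not the authors' result), the Python snippet below tabulates the standard closed-form differential entropies of a few log-concave distributions against the classical Gaussian max-entropy upper bound h(X) <= (1/2) ln(2*pi*e*Var(X)), the quantity a variance-based lower bound of this kind would complement.

```python
import math

# Differential entropies (in nats) and variances of some log-concave
# distributions, from standard closed-form expressions:
#   Gaussian(sigma^2):     h = 0.5 * ln(2*pi*e*sigma^2)
#   Exponential(rate lam): h = 1 - ln(lam),   Var = 1 / lam^2
#   Laplace(scale b):      h = 1 + ln(2*b),   Var = 2 * b^2
# Among all variables of a given variance, the Gaussian maximizes h,
# which gives the classical upper bound used for comparison below.

def gaussian_upper_bound(var):
    """Max-entropy (Gaussian) upper bound for a given variance, in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# (entropy in nats, variance) for each example distribution
examples = {
    "gaussian(sigma=1)":   (0.5 * math.log(2 * math.pi * math.e), 1.0),
    "exponential(rate=1)": (1.0, 1.0),                # h = 1 - ln(1)
    "laplace(b=1)":        (1.0 + math.log(2.0), 2.0) # h = 1 + ln(2)
}

for name, (h, var) in examples.items():
    bound = gaussian_upper_bound(var)
    # gap = how far each log-concave example sits below the Gaussian maximum
    print(f"{name:22s} h = {h:.4f} nats, "
          f"Gaussian bound = {bound:.4f}, gap = {bound - h:.4f}")
```

In these examples the gap to the Gaussian bound stays small and bounded, which is the qualitative behavior a lower bound on h(X) for log-concave variables captures.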
Keyphrases
- random variables
- rate distortion theory
- information theory
- lower bound
- information theoretic
- upper bound
- graphical models
- probability distribution
- rate distortion
- Kullback-Leibler divergence
- mutual information
- Bayesian networks
- latent variables
- joint distribution
- objective function
- worst case
- probabilistic inference
- probabilistic model
- spatially adaptive
- image processing