Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets.
Igal Sason, Sergio Verdú. Published in: ITW Fall (2015).
Keyphrases
- relative entropy
- total variation distance
- Rényi divergence
- Kullback–Leibler divergence
- upper bound
- lower bound
- mutual information
- information theory
- finite alphabets
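The paper's theme, bounding the relative entropy D(P‖Q) in terms of the total variation distance when the alphabet is finite, can be illustrated with an elementary bound (not necessarily the paper's tighter results): by Jensen's inequality D(P‖Q) ≤ log(1 + χ²(P‖Q)), and χ²(P‖Q) ≤ 4δ²/Q_min, where δ is the total variation distance and Q_min the smallest mass of Q. A minimal sketch, with hypothetical distributions, checking this chain numerically:

```python
import math

def tv_distance(p, q):
    """Total variation distance: half the L1 distance between two pmfs."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2_tv_bound(p, q):
    """Elementary upper bound D(P||Q) <= log(1 + 4*delta^2 / q_min),
    from D <= log(1 + chi^2) and chi^2 <= 4*delta^2 / q_min.
    This is an illustrative bound, not the (generally sharper) bounds
    derived in the paper itself."""
    delta = tv_distance(p, q)
    q_min = min(q)
    return math.log(1 + 4 * delta ** 2 / q_min)

# Hypothetical pmfs on a 3-letter alphabet, chosen for illustration only.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
assert kl_divergence(p, q) <= chi2_tv_bound(p, q)
```

The bound degrades as Q_min shrinks, which matches the finite-alphabet setting of the title: without a positive lower bound on Q, no bound on D(P‖Q) in terms of total variation alone is possible.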