About distances of discrete distributions satisfying the data processing theorem of information theory.
M. C. Pardo, Igor Vajda
Published in: IEEE Trans. Inf. Theory (1997)
Keyphrases
- information theory
- data processing
- Kullback-Leibler divergence
- information theoretic
- statistical mechanics
- efficient algorithms
- data management
- statistical learning
- Jensen-Shannon divergence
- conditional entropy
- data analysis
- probability distribution
- distance measure
- mutual information
- KL divergence
- relative entropy
- random variables
- distance function
- Gaussian distribution
- MDL principle
- statistical physics
- Shannon entropy
- joint distribution
- Euclidean distance
- information geometry
- rate-distortion theory
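The data processing theorem referenced in the title states that a divergence between two discrete distributions cannot increase when both are passed through the same stochastic channel. As a minimal illustration (not the paper's own construction; the distributions `p`, `q` and channel matrix `W` below are arbitrary examples), the Kullback-Leibler divergence can be checked numerically:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions (natural log)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p[i] = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two example distributions on a 3-point alphabet.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

# An example row-stochastic channel matrix W (each row sums to 1).
W = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])

# Process both distributions through the same channel.
pW, qW = p @ W, q @ W

# Data processing inequality: divergence does not increase under the channel.
assert kl(pW, qW) <= kl(p, q)
```

The same check can be repeated for other divergences (e.g. Jensen-Shannon) to probe which distance measures satisfy the theorem, which is the paper's topic.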