Concentration of Measure Inequalities in Information Theory, Communications and Coding
Maxim Raginsky, Igal Sason
Published in: CoRR (2012)
Keyphrases
- information theory
- information theoretic
- rate distortion theory
- Jensen-Shannon divergence
- statistical learning
- Shannon entropy
- Kullback-Leibler divergence
- relative entropy
- conditional entropy
- statistical mechanics
- statistical physics
- MDL principle
- coding scheme
- machine learning
- coding method
- video surveillance
- similarity measure