Concentration of Measure Inequalities and Their Communication and Information-Theoretic Applications.
Maxim Raginsky, Igal Sason
Published in: CoRR (2015)
Keyphrases
- information theoretic
- information theory
- information theoretic measures
- mutual information
- entropy measure
- Jensen-Shannon divergence
- relative entropy
- theoretic framework
- Kullback-Leibler divergence
- KL divergence
- Jensen-Shannon
- information bottleneck
- multi-modality
- similarity measure
- computational learning theory
- Shannon entropy
- minimum description length
- log likelihood
- sufficient conditions
- text categorization
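Several of the keyphrases above name concrete divergence measures (relative entropy / Kullback-Leibler divergence, Jensen-Shannon divergence). As a minimal illustrative sketch — not code from the monograph itself — the discrete versions of these quantities can be computed as follows:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence (relative entropy) D(p || q) in nats,
    for discrete distributions given as probability vectors.
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded (by ln 2)
    variant of KL, measured against the mixture m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive; KL is asymmetric in p and q
print(js_divergence(p, q))  # symmetric and at most ln 2
```

The distributions `p` and `q` here are arbitrary examples; the functions assume both vectors have the same support and that `q` is nonzero wherever `p` is.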