An Elementary Proof of a Classical Information-Theoretic Formula.
Xianming Liu, Ronit Bustin, Guangyue Han, Shlomo Shamai. Published in: CoRR (2018)
Keyphrases
- information-theoretic
- mutual information
- information theory
- theoretic framework
- information-theoretic measures
- information bottleneck
- log-likelihood
- Jensen-Shannon divergence
- Kullback-Leibler divergence
- multi-modality
- relative entropy
- entropy measure
- computational learning theory
- minimum description length
- Bregman divergences