A Practical & Unified Notation for Information-Theoretic Quantities in ML.
Andreas Kirsch, Yarin Gal
Published in: CoRR (2021)
Keyphrases
- information-theoretic
- information theory
- mutual information
- theoretic framework
- maximum likelihood
- information bottleneck
- Jensen-Shannon divergence
- multi-modality
- minimum description length
- log-likelihood
- entropy measure
- information-theoretic measures
- Kullback-Leibler divergence
- relative entropy
- data mining
- EM algorithm
- Bregman divergences
- Jensen-Shannon
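Several of the quantities named above (entropy, Kullback-Leibler divergence / relative entropy, Jensen-Shannon divergence, mutual information) have simple closed forms for discrete distributions. The sketch below, in plain Python with function names chosen for illustration (they are not from the paper), computes each in bits:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence (relative entropy) D_KL(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their mixture m."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def mutual_information(joint):
    """Mutual information I(X; Y) from a joint table p(x, y),
    computed as D_KL(p(x, y) || p(x) p(y))."""
    px = [sum(row) for row in joint]          # marginal p(x)
    py = [sum(col) for col in zip(*joint)]    # marginal p(y)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )
```

For example, `entropy([0.5, 0.5])` is 1 bit (a fair coin), the Jensen-Shannon divergence of two disjoint point masses is 1 bit, and the mutual information of an independent joint distribution is 0.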