Information-theoretic applications of the logarithmic probability comparison bound.
Rami Atar, Neri Merhav
Published in: CoRR (2014)
Keyphrases
- information-theoretic
- information theory
- mutual information
- worst case
- theoretic framework
- probability distribution
- entropy measure
- information-theoretic measures
- upper bound
- log-likelihood
- Jensen-Shannon divergence
- information bottleneck
- Kullback-Leibler divergence
- distributional clustering
- relative entropy
- computational learning theory
- multi-modality
- lower bound
- image registration
- cross entropy
- minimum description length
- maximum entropy
- multi-modal
- k-NN
- image segmentation
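Several of the keyphrases above (relative entropy, Kullback-Leibler divergence, cross entropy) refer to the same core quantity, the relative entropy D(p || q). As a minimal illustrative sketch, not drawn from the paper itself, it can be computed for finite discrete distributions as follows (the function name and inputs are hypothetical):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) in nats for discrete distributions
    given as equal-length lists of probabilities.

    Terms with p_i = 0 contribute 0 by the convention 0 * log(0/q) = 0.
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: D is zero iff the distributions coincide, and nonnegative otherwise.
p = [0.5, 0.5]
q = [0.9, 0.1]
d = kl_divergence(p, q)  # equals ln(5/3) for this pair
```

The quantity is asymmetric in its arguments, which is why bounds comparing probabilities of the same event under two measures (as in the paper's title) must keep track of direction.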