Information-theoretic applications of the logarithmic probability comparison bound.
Rami Atar, Neri Merhav
Published in: ISIT (2015)
Keyphrases
- information-theoretic
- mutual information
- information theory
- theoretical framework
- worst case
- information bottleneck
- log-likelihood
- information-theoretic measures
- upper bound
- multi-modality
- lower bound
- Jensen-Shannon divergence
- entropy measure
- probability distribution
- relative entropy
- cross-entropy
- Kullback-Leibler divergence
- feature selection
- minimum description length
- medical images