One-shot bounds for various information theoretic problems using smooth min and max Rényi divergences.
Naqueeb Ahmad Warsi
Published in: ITW (2013)
Keyphrases
- Information-theoretic
- Information theory
- Mutual information
- Computational learning theory
- Theoretic framework
- Bregman divergences
- Entropy measure
- Jensen–Shannon divergence
- Log-likelihood
- Kullback–Leibler divergence
- Upper bound
- Information bottleneck
- Information-theoretic measures
- Lower bound
- Multi-modality
- Relative entropy