Information-Theoretic Bounds for Integral Estimation.
Donald Q. Adams, Adarsh Barik, Jean Honorio
Published in: CoRR (2021)
Keyphrases
- information theoretic
- information theory
- mutual information
- theoretic framework
- information bottleneck
- upper bound
- Jensen-Shannon divergence
- log likelihood
- information theoretic measures
- Kullback-Leibler divergence
- entropy measure
- worst case
- relative entropy
- medical images
- closed form solutions
- theoretical guarantees
- parameter estimation
- computational learning theory
- minimum description length
- Bregman divergences
- objective function