An Upper Bound on the Error Induced by Saddlepoint Approximations - Applications to Information Theory.
Dadja Anade, Jean-Marie Gorce, Philippe Mary, Samir Perlaza. Published in: CoRR (2020)
Keyphrases
- information theory
- upper bound
- information theoretic
- error probability
- conditional entropy
- generalization error
- lower bound
- statistical learning
- statistical mechanics
- rate distortion theory
- partition function
- Jensen-Shannon divergence
- statistical physics
- free energy
- linear functions
- relative entropy
- Kullback-Leibler divergence
- worst case
- information geometry
- error rate
- error metrics
- mutual information
- MDL principle
- computer vision
- feature selection
- estimation error
- image quality
- Shannon entropy