An Upper Bound on the Error Induced by Saddlepoint Approximations - Applications to Information Theory.
Dadja Anade, Jean-Marie Gorce, Philippe Mary, Samir M. Perlaza
Published in: Entropy (2020)
Keyphrases
- information theory
- upper bound
- information theoretic
- error probability
- conditional entropy
- generalization error
- lower bound
- rate distortion theory
- statistical learning
- partition function
- statistical mechanics
- Jensen-Shannon divergence
- free energy
- worst case
- statistical physics
- relative entropy
- linear functions
- error rate
- Shannon entropy
- coding scheme
- Kullback-Leibler divergence
- MDL principle
- closed form
- error metrics
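The paper's subject, the approximation error of saddlepoint methods, can be illustrated with a minimal sketch (this is a standard textbook example, not the paper's own bound; the function names are chosen here for illustration). For a sum of n i.i.d. Exp(1) variables the exact density is Gamma(n, 1), so the saddlepoint approximation built from the cumulant generating function K(t) = -n log(1 - t) can be compared against the exact value:

```python
import math

def saddlepoint_gamma_density(x, n):
    """Saddlepoint approximation to the density of a sum of n i.i.d.
    Exp(1) random variables (whose exact density is Gamma(n, 1)).

    CGF of the sum: K(t) = -n * log(1 - t), for t < 1.
    Saddlepoint solves K'(t) = n / (1 - t) = x  =>  t_hat = 1 - n / x.
    Approximation: exp(K(t_hat) - t_hat * x) / sqrt(2 * pi * K''(t_hat)).
    """
    t_hat = 1.0 - n / x
    K = -n * math.log(1.0 - t_hat)     # K(t_hat) = n * log(x / n)
    K2 = n / (1.0 - t_hat) ** 2        # K''(t_hat) = x**2 / n
    return math.exp(K - t_hat * x) / math.sqrt(2.0 * math.pi * K2)

def exact_gamma_density(x, n):
    """Exact Gamma(n, 1) density: x**(n-1) * exp(-x) / (n-1)!."""
    return x ** (n - 1) * math.exp(-x) / math.factorial(n - 1)

n, x = 10, 10.0
approx = saddlepoint_gamma_density(x, n)
exact = exact_gamma_density(x, n)
print(abs(approx - exact) / exact)  # relative error: below 1% already at n = 10
```

In this Gamma case the saddlepoint density is exact up to Stirling's approximation of (n-1)!, which is why the relative error is already small at moderate n; quantifying such errors in general, with explicit upper bounds, is what the paper addresses.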