Tight bound on relative entropy by entropy difference
David Reeb, Michael M. Wolf · Published in: CoRR (2013)
Keyphrases
- relative entropy
- upper bound
- lower bound
- worst case
- information theoretic
- information theory
- log likelihood
- mutual information
- covariance matrix
- Kullback-Leibler divergence
- maximum entropy
- Mahalanobis distance
- squared Euclidean distance
- sample size
- Bregman divergences
- learning algorithm
- statistical models
- special case
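
For orientation on the keyphrases above, here is a minimal, self-contained sketch (not taken from the paper) of the two classical quantities named in the title: the Shannon entropy and the relative entropy (Kullback-Leibler divergence). The function names and the example distributions `p` and `q` are illustrative assumptions only; the paper itself works with quantum states, which this classical sketch does not cover.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log), with 0 log 0 := 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i).

    Returns infinity if the support of p is not contained in the support of q.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.any((p > 0) & (q == 0)):
        return np.inf
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

# Illustrative distributions (not from the paper): compare the relative entropy
# D(p || q) with the entropy difference H(q) - H(p).
p = np.array([0.7, 0.2, 0.1])
q = np.array([1/3, 1/3, 1/3])
print("D(p||q)     =", relative_entropy(p, q))
print("H(q) - H(p) =", shannon_entropy(q) - shannon_entropy(p))
```

In this particular example the two printed values coincide, since for a uniform `q` over d outcomes D(p‖q) = log d − H(p) = H(q) − H(p) holds exactly; for general `q` the quantities differ, and bounding one by the other is the kind of relationship the paper's title refers to.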