On the Upper Bound of the Kullback-Leibler Divergence and Cross Entropy.
Min Chen, Mateu Sbert. Published in: CoRR (2019)
Keyphrases
- cross entropy
- Kullback-Leibler divergence
- upper bound
- Kullback-Leibler
- information theoretic
- log likelihood
- mutual information
- information theory
- maximum likelihood
- probability density function
- KL divergence
- distance measure
- evaluation metrics
- language modeling
- marginal distributions
- scoring function
- ranking functions
- error function
- image registration
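The two central quantities in the keyphrase list, cross entropy and the Kullback-Leibler divergence, are linked by the standard identity H(p, q) = H(p) + D_KL(p || q). A minimal sketch of that identity for discrete distributions (the example distributions `p` and `q` are illustrative, not from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Cross entropy decomposes as entropy plus KL divergence.
ce = cross_entropy(p, q)
decomposed = entropy(p) + kl_divergence(p, q)
```

Because D_KL(p || q) is non-negative (Gibbs' inequality), the entropy H(p) is a lower bound on the cross entropy; the paper's subject is the complementary question of upper-bounding these quantities.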