Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Distributions.
Yi Hao, Alon Orlitsky
Published in: NeurIPS (2020)
Keyphrases
- cumulative residual entropy
- Kullback-Leibler divergence
- information theory
- Shannon entropy
- entropy measure
- information theoretic
- KL divergence
- probability distribution
- random variables
- relative entropy
- information entropy
- information content
- dissimilarity measure
- marginal distributions
- inductive inference
- neural network
- distance measure
- probabilistic model
- similarity measure
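Several of the keyphrases above name standard information-theoretic quantities. As an illustrative sketch only (not the paper's method), Shannon entropy and KL divergence for finite distributions can be computed as:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log2(p_i / q_i), in bits.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

uniform = [0.25] * 4          # uniform distribution over 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))        # 2.0 bits (maximum for 4 outcomes)
print(kl_divergence(skewed, uniform))  # positive: skewed differs from uniform
```

The uniform distribution attains the maximum entropy log2(4) = 2 bits, and KL divergence is always non-negative, equaling zero only when the two distributions coincide.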