Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions
Yi Hao, Alon Orlitsky
Published in: CoRR (2020)
Keyphrases
- cumulative residual entropy
- Kullback-Leibler divergence
- information theory
- Shannon entropy
- entropy measure
- similarity measure
- information content
- mutual information
- information theoretic
- KL divergence
- probability distribution
- relative entropy
- discrete space
- finite automata
- efficient algorithms to compute
- inductive inference
- uniform distribution
- Boolean functions
- correlation coefficient