Sampling Lower Bounds via Information Theory
Ziv Bar-Yossef
Published in: Electron. Colloquium Comput. Complex. (2003)
Keyphrases
- information theory
- lower bound
- information theoretic
- upper bound
- Jensen-Shannon divergence
- statistical learning
- statistical mechanics
- statistical physics
- objective function
- relative entropy
- Kullback-Leibler divergence
- sample size
- random sampling
- conditional entropy
- information geometry
- VC dimension
- worst case
- pattern recognition
- parameter space
- image compression
- mutual information
- MDL principle
- active learning
- image processing
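Several of the keyphrases above are closely related: relative entropy is another name for the Kullback-Leibler divergence, and the Jensen-Shannon divergence is its symmetrized, bounded variant. As a minimal illustrative sketch (not code from the paper itself, and the function names are our own), the two quantities for discrete distributions can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Assumes p and q are discrete distributions over the same support,
    with q[i] > 0 wherever p[i] > 0 (terms with p[i] == 0 contribute 0).
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL divergence of p and q
    to their midpoint mixture m; symmetric and bounded by 1 bit."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # asymmetric: generally != kl_divergence(q, p)
print(js_divergence(p, q))  # symmetric, lies in [0, 1] bits
```

Unlike the KL divergence, which can be infinite when the supports differ, the Jensen-Shannon divergence is always finite, which is one reason it appears in sampling lower-bound arguments.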