The Information Theory Bound Is Tight for Selection in a Heap
Greg N. Frederickson
Published in: STOC (1990)
Keyphrases
- information theory
- upper bound
- lower bound
- information theoretic
- worst case
- data structure
- conditional entropy
- mutual information
- relative entropy
- computational complexity
- selection algorithm
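The paper concerns selecting the k-th smallest element from a binary min-heap, showing that the information-theoretic lower bound of O(k) comparisons is achievable. As context, a minimal sketch of the classic simpler approach, a frontier search with an auxiliary heap, which runs in O(k log k) comparisons (this is not Frederickson's O(k) algorithm; the function name is illustrative):

```python
import heapq

def k_smallest_in_heap(heap, k):
    """Return the k smallest elements of a binary min-heap stored as an array.

    Classic O(k log k) frontier search: repeatedly pop the smallest candidate
    and push its two children. Frederickson's STOC 1990 result improves this
    to O(k), matching the information-theoretic lower bound.
    """
    if k <= 0 or not heap:
        return []
    out = []
    frontier = [(heap[0], 0)]  # (value, array index) pairs; start at the root
    while frontier and len(out) < k:
        val, i = heapq.heappop(frontier)
        out.append(val)
        # Children of node i live at 2i+1 and 2i+2 in the array encoding.
        for child in (2 * i + 1, 2 * i + 2):
            if child < len(heap):
                heapq.heappush(frontier, (heap[child], child))
    return out
```

The auxiliary heap holds at most k + 1 candidates, which is where the extra log k factor comes from; Frederickson's algorithm removes it with a more intricate grouping of heap nodes.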