A metric entropy bound is not sufficient for learnability.
R. M. Dudley, Sanjeev R. Kulkarni, T. J. Richardson, Ofer Zeitouni
Published in: IEEE Trans. Inf. Theory (1994)
Keyphrases
- uniform convergence
- lower bound
- upper bound
- mutual information
- real valued functions
- normalized mutual information
- information theoretic
- worst case
- metric space
- information theory
- finite automata
- learning algorithm
- error bounds
- boolean functions
- vc dimension
- information entropy
- sufficient conditions
- uniform distribution
- distance metric
- pattern languages
- vapnik chervonenkis dimension
- distance function
- dnf formulas
- similarity metric
- euclidean distance