Exactly Tight Information-Theoretic Generalization Error Bound for the Quadratic Gaussian Problem.
Ruida Zhou, Chao Tian, Tie Liu. Published in: CoRR (2023)
Keyphrases
- information-theoretic
- error bounds
- worst case
- mutual information
- information theory
- theoretic framework
- theoretical analysis
- Jensen-Shannon divergence
- information bottleneck
- upper bound
- information-theoretic measures
- relative entropy
- pairwise
- Kullback-Leibler divergence
- entropy measure
- lower bound
- minimum description length
- training error
- log-likelihood
- Parzen window
- machine learning
- NP-hard
- special case
- KL divergence
- computational complexity
- feature selection
- objective function