Rethinking Information-theoretic Generalization: Loss Entropy Induced PAC Bounds.
Yuxin Dong, Tieliang Gong, Hong Chen, Shujian Yu, Chen Li
Published in: ICLR (2024)
Keyphrases
- information-theoretic
- information theory
- VC dimension
- PAC-Bayesian
- upper bound
- generalization bounds
- mutual information
- worst-case bounds
- mistake bound
- theoretic framework
- learning machines
- lower bound
- information-theoretic measures
- relative entropy
- entropy measure
- sample complexity
- information bottleneck
- Jensen-Shannon divergence
- data-dependent
- log-likelihood
- multi-modality
- Kullback-Leibler divergence
- perceptron algorithm
- learning theory
- worst case
- computational learning theory
- distributional clustering
- minimum description length
- feature selection
- PAC learning
- model selection
- PAC learning model
- multi-class