On the tightness of information-theoretic bounds on generalization error of learning algorithms.
Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
Published in: CoRR (2023)
Keyphrases
- information theoretic
- generalization error
- upper bound
- lower bound
- learning algorithm
- algorithmic stability
- generalization error bounds
- training error
- mutual information
- information theory
- learning machines
- uniform convergence
- active learning
- sample complexity
- cross validation
- training data
- binary classification
- model selection
- linear classifiers
- target function
- training set
- information bottleneck
- information theoretic measures
- supervised learning
- machine learning algorithms
- worst case
- boosting algorithms
- objective function
- VC dimension
- sample size
- computational learning theory
- generalization bounds
- image registration
- labeled data
- reinforcement learning
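The keyphrases above revolve around mutual-information bounds on generalization error. As a minimal illustration of that idea (a sketch, not code from the paper itself; the function name and example values are hypothetical), the standard Xu–Raginsky bound states that for a σ-sub-Gaussian loss and n i.i.d. training samples, the expected generalization error is at most √(2σ²·I(W;S)/n), where I(W;S) is the mutual information between the learned hypothesis W and the training set S:

```python
import math

def mi_generalization_bound(mutual_info_nats, sigma, n):
    """Xu-Raginsky upper bound on expected generalization error:
    |E[gen(W, S)]| <= sqrt(2 * sigma^2 * I(W; S) / n),
    for a sigma-sub-Gaussian loss, n i.i.d. samples, and mutual
    information I(W; S) measured in nats."""
    if n <= 0:
        raise ValueError("sample size n must be positive")
    return math.sqrt(2.0 * sigma**2 * mutual_info_nats / n)

# Example: a loss bounded in [0, 1] is (1/2)-sub-Gaussian.
bound = mi_generalization_bound(mutual_info_nats=5.0, sigma=0.5, n=1000)
print(f"generalization error bound: {bound:.4f}")  # → 0.0500
```

The bound captures the intuition behind several keyphrases in the list: the less information the learning algorithm extracts from the training data (smaller I(W;S)), the tighter the guarantee, and the bound vanishes as the sample size n grows.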