Error Bounds of Supervised Classification from Information-Theoretic Perspective
Binchuan Qi, Wei Gong, Li Li. Published in: CoRR (2024)
Keyphrases
- information-theoretic
- supervised classification
- error bounds
- information theory
- mutual information
- supervised learning
- theoretical analysis
- theoretic framework
- unsupervised clustering
- unsupervised learning
- information bottleneck
- worst case
- supervised classifiers
- log likelihood
- relative entropy
- Jensen-Shannon divergence
- minimum description length
- information theoretic measures
- statistical learning theory
- Kullback-Leibler divergence
- KL divergence
- active learning
- image registration
- entropy measure
- pattern recognition
- neural network
- distributional clustering
- feature selection
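Several of the divergence measures listed above (relative entropy / Kullback-Leibler divergence and the Jensen-Shannon divergence) have simple closed forms for discrete distributions. The sketch below is illustrative only and is not taken from the paper; the function names and example distributions are my own.

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) for discrete distributions p, q.

    Terms with p_i = 0 contribute 0 by the usual 0*log(0) = 0 convention.
    Assumes q_i > 0 wherever p_i > 0, so the divergence is finite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded variant of KL,
    defined via the mixture m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: a fair coin vs. a heavily biased coin.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(f"KL(p||q) = {kl_divergence(p, q):.4f}")
print(f"JS(p, q) = {js_divergence(p, q):.4f}")
```

Unlike KL divergence, the Jensen-Shannon divergence is symmetric in its arguments and always finite, which is why both appear as distinct keyphrases above.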