Speeding up Feature Selection by Using an Information Theoretic Bound.
Patrick Emmanuel Meyer, Olivier Caelen, Gianluca Bontempi. Published in: BNAIC (2005)
Keyphrases
- information theoretic
- mutual information
- feature selection
- information theory
- Jensen-Shannon divergence
- information bottleneck
- theoretic framework
- upper bound
- text categorization
- feature selection algorithms
- information theoretic measures
- feature set
- relative entropy
- log likelihood
- distributional clustering
- multi-modality
- text classification
- machine learning
- KL divergence
- minimum description length
- entropy measure
- worst case
- image registration
- feature space
- Kullback-Leibler divergence
- model selection
- multi-class
- feature extraction
- Bayesian networks
- Jensen-Shannon
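The paper's abstract is not reproduced here, but as a generic illustration of the central keyphrases above (mutual information, feature selection), the following is a minimal sketch of ranking discrete features by their empirical mutual information with a class label. All function names are illustrative; this is the standard MI-ranking filter, not the authors' bound-based speed-up.

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts substituted
        mi += p_xy * math.log2(p_xy * n * n / (px[x] * py[y]))
    return mi

def rank_features(features, label):
    """Rank feature columns by mutual information with the label, descending."""
    scores = {name: mutual_information(col, label) for name, col in features.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy data: f1 perfectly predicts the label, f2 is independent noise.
label = [0, 0, 1, 1, 0, 1, 0, 1]
features = {
    "f1": [0, 0, 1, 1, 0, 1, 0, 1],
    "f2": [1, 0, 1, 0, 0, 1, 1, 0],
}
ranking = rank_features(features, label)
print(ranking[0][0])  # f1 ranks first (I = 1 bit vs 0 bits for f2)
```

A bound-based speed-up, as the title suggests, would skip the exact MI computation for features whose upper bound already falls below the current best score; the exact bound used by the authors is not given in this excerpt.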