A fast information-theoretic approximation of joint mutual information feature selection.
Heng Liu, Gregory Ditzler
Published in: IJCNN (2017)
Keyphrases
- mutual information
- information theoretic
- feature selection
- information theory
- conditional mutual information
- theoretic framework
- information gain
- information theoretic measures
- image registration
- information bottleneck
- Kullback-Leibler divergence
- similarity measure
- Jensen-Shannon divergence
- multi modality
- competitive learning
- minimum description length
- log likelihood
- text categorization
- relative entropy
- KL divergence
- registration accuracy
- information geometry
- deformation field
- selection criterion
- machine learning
- feature selection algorithms
- closed form
- feature space
- text classification
- dimensionality reduction
- support vector
- multi modal
- Bregman divergences
- entropy measure
- feature set
- k-NN
- distributional clustering
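To make the paper's topic concrete: joint mutual information (JMI) feature selection greedily adds the feature that, paired with each already-selected feature, shares the most information with the class label. The sketch below is a minimal, plug-in (histogram) estimate on discrete data; it is not the authors' fast approximation, and the names `entropy`, `mutual_info`, and `jmi_select` are illustrative, not from the paper.

```python
import numpy as np
from collections import Counter


def entropy(labels):
    # Shannon entropy (in nats) of a discrete sequence,
    # estimated from empirical frequencies.
    n = len(labels)
    probs = np.array([c / n for c in Counter(labels).values()])
    return float(-np.sum(probs * np.log(probs)))


def mutual_info(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete variables.
    joint = list(zip(x, y))
    return entropy(x) + entropy(y) - entropy(joint)


def jmi_select(X, y, k):
    """Greedy JMI feature selection on discrete data.

    Seeds with the single most relevant feature, then at each step
    adds the candidate f maximizing sum_{s in selected} I((X_f, X_s); Y).
    """
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, f], y) for f in range(n_features)]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_f, best_score = None, -np.inf
        for f in range(n_features):
            if f in selected:
                continue
            # Score f by its joint information with each selected feature.
            score = sum(
                mutual_info(list(zip(X[:, f], X[:, s])), y)
                for s in selected
            )
            if score > best_score:
                best_f, best_score = f, score
        selected.append(best_f)
    return selected
```

This exhaustive greedy loop costs O(k * n_features) pairwise joint-MI estimates, which is exactly the expense that motivates faster information-theoretic approximations of the JMI criterion.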