Revisiting Probability Distribution Assumptions for Information Theoretic Feature Selection.
Yuan Sun, Wei Wang, Michael Kirley, Xiaodong Li, Jeffrey Chan. Published in: AAAI (2020)
Keyphrases
- information theoretic
- probability distribution
- mutual information
- feature selection
- information theory
- theoretic framework
- Jensen-Shannon divergence
- information bottleneck
- random variables
- distributional clustering
- information theoretic measures
- text categorization
- Bayesian networks
- Kullback-Leibler divergence
- feature set
- image registration
- minimum description length
- log likelihood
- machine learning
- multi modality
- posterior distribution
- feature space
- similarity measure
- model selection
- entropy measure
- classification accuracy
- support vector
- feature subset
- posterior probability
- conditional independence
- relative entropy
- multi class
- computational learning theory
- dimensionality reduction
- feature extraction
- support vector machine
- feature ranking
- Bayesian framework
- marginal distributions
- selection criterion
- probability density
- feature selection algorithms
- probability density function