A global evaluation criterion for feature selection in text categorization using Kullback-Leibler divergence.
Zhilong Zhen, Xiaoqin Zeng, Haijuan Wang, Lixin Han. Published in: SoCPaR (2011)
Keyphrases
- text categorization
- Kullback-Leibler divergence
- evaluation criteria
- feature selection
- mutual information
- text classification
- automated text categorization
- information theoretic
- information gain
- information theory
- probability density function
- distance measure
- k-NN
- k-nearest neighbor
- naive Bayes
- machine learning
- feature selection and classifier
- semi-supervised learning
- data mining
- classification accuracy
- transfer learning
- support vector machine
- k-means
- support vector
- Bayesian networks
- linear SVM
- document frequency
- feature extraction
- decision trees