A comparison of ℓ1-regularization, PCA, KPCA and ICA for dimensionality reduction in logistic regression.
Abdallah Bashir Musa
Published in: Int. J. Mach. Learn. Cybern. (2014)
Keyphrases
- logistic regression
- dimensionality reduction
- principal component analysis
- kernel pca
- independent component analysis
- principal components analysis
- linear discriminant analysis
- face recognition
- kernel principal component analysis
- principal components
- feature extraction
- feature space
- decision trees
- principal component analysis (variant spelling)
- support vector
- logistic regression models
- credit scoring
- low dimensional
- discriminant analysis
- involving high dimensional data
- naive bayes
- chi square
- odds ratio
- kernel methods
- subspace methods
- kernel function
- data points
- dimension reduction
- linear support vector machines
- high dimensional data
- lower dimensional
- face images
- factor analysis
- pattern recognition
- input space
- loss function
- covariance matrix
- high dimensional
- kernel matrix
- chi square test
- linear svm
- logistic model
- high dimensionality
- feature vectors
- high dimensional feature space
- machine learning
- learning algorithm
- training set
- pairwise
- supervised learning
- spectral clustering
- text classification
- data sets
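The title and keyphrases outline a comparison of four dimensionality-reduction front ends for logistic regression. A minimal sketch of such a comparison, assuming scikit-learn (this is illustrative code, not the paper's implementation; the dataset and the choice of 10 components are assumptions):

```python
# Hypothetical comparison sketch: l1-regularized logistic regression vs.
# PCA / kernel PCA / ICA pipelines feeding a plain logistic regression,
# in the spirit of the paper's title. Not the authors' actual setup.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset (assumption)

models = {
    # Embedded feature selection via the l1 penalty (no separate projection).
    "l1-logistic": make_pipeline(
        StandardScaler(),
        LogisticRegression(penalty="l1", solver="liblinear", C=1.0)),
    # Linear projection onto the top principal components.
    "pca+logistic": make_pipeline(
        StandardScaler(), PCA(n_components=10),
        LogisticRegression(max_iter=1000)),
    # Nonlinear projection via an RBF kernel matrix.
    "kpca+logistic": make_pipeline(
        StandardScaler(), KernelPCA(n_components=10, kernel="rbf"),
        LogisticRegression(max_iter=1000)),
    # Statistically independent components instead of uncorrelated ones.
    "ica+logistic": make_pipeline(
        StandardScaler(), FastICA(n_components=10, random_state=0,
                                  max_iter=1000),
        LogisticRegression(max_iter=1000)),
}

# 5-fold cross-validated accuracy for each pipeline.
results = {name: cross_val_score(m, X, y, cv=5).mean()
           for name, m in models.items()}
for name, acc in results.items():
    print(f"{name}: mean accuracy {acc:.3f}")
```

Each pipeline standardizes the inputs first, since PCA, kernel PCA, and ICA are all scale-sensitive; the ℓ1 model needs no explicit projection because the penalty itself drives irrelevant coefficients to zero.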