A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine.
Lijuan Cao, Kok Seng Chua, W. K. Chong, H. P. Lee, Q. M. Gu
Published in: Neurocomputing (2003)
Keyphrases
- dimensionality reduction
- principal component analysis
- kernel principal component analysis
- kernel pca
- principal components analysis
- independent component analysis
- support vector machine
- principal components
- feature extraction
- face recognition
- linear discriminant analysis
- feature space
- kernel methods
- low dimensional
- neighborhood preserving
- dimensionality reduction methods
- independent components
- feature selection
- high dimensional data
- feature vectors
- dimension reduction methods
- discriminant analysis
- high dimensional
- high dimensionality
- kernel function
- dimension reduction
- subspace learning
- covariance matrix
- preprocessing
- subspace methods
- data points
- lower dimensional
- kernel matrix
- random projections
- input space
- manifold learning
- kernel fisher discriminant
- data representation
- multi class
- high dimensional feature space
- pattern recognition
- pattern recognition and machine learning
- linear dimensionality reduction
- face images
- image classification
- support vector
- support vector machine (SVM)
- discriminant projection
- decision trees
- structure preserving
- training procedure
- locally linear embedding
- signal processing
- sparse representation
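
To make the comparison named in the title concrete, the following is a minimal illustrative sketch, not the authors' experimental setup: it uses scikit-learn's PCA, KernelPCA, and FastICA as dimensionality-reduction steps in front of an RBF-kernel SVM on the toy digits dataset. The dataset, the number of retained components (20), and the kernel parameters are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: PCA vs. KPCA vs. ICA as dimensionality-reduction
# preprocessing for an SVM classifier on a toy dataset.
# This is NOT the experimental setup of Cao et al. (2003); the dataset,
# kernel parameters, and number of components are assumptions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
n_components = 20  # assumed value; in practice this is tuned per method

reducers = {
    "PCA": PCA(n_components=n_components),
    "KPCA": KernelPCA(n_components=n_components, kernel="rbf", gamma=1e-3),
    "ICA": FastICA(n_components=n_components, max_iter=1000, random_state=0),
}

for name, reducer in reducers.items():
    # Standardize, reduce dimensionality, then classify with an RBF-kernel SVM.
    pipe = make_pipeline(StandardScaler(), reducer, SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean 5-fold accuracy = {scores.mean():.3f}")
```

Comparing the cross-validated accuracy of the three pipelines mirrors, in miniature, the kind of PCA/KPCA/ICA comparison the paper's title describes.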