Generalized Spectral Dimensionality Reduction Based on Kernel Representations and Principal Component Analysis.
MacArthur Ortega-Bustamante, Waldo Hasperué, Diego Hernán Peluffo-Ordóñez, Juan González-Vergara, Josué Marín-Gaviño, Martín Vélez-Falconí
Published in: ICCSA (4) (2021)
Keyphrases
- dimensionality reduction
- principal component analysis
- kernel pca
- feature space
- class separability
- input space
- kernel discriminant analysis
- kernel trick
- principal components
- low dimensional
- discriminant analysis
- high dimensional
- linear discriminant analysis
- random projections
- independent component analysis
- dimension reduction
- lower dimensional
- dimensionality reduction methods
- manifold learning
- high dimensional data
- kernel learning
- feature extraction
- subspace learning
- kernel function
- kernel principal component analysis
- singular value decomposition
- pattern recognition and machine learning
- graph embedding
- high dimensionality
- kernel methods
- high dimensional feature space
- kernel matrix
- linear dimensionality reduction
- metric learning
- covariance matrix
- nonlinear dimensionality reduction
- data representation
- locally linear embedding
- spectral analysis
- feature selection
- pattern recognition
- hyperspectral imagery
- data points
- laplacian matrix
- face recognition
- reproducing kernel hilbert space
- normalized cut
- non-negative matrix factorization
- feature vectors
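
The central technique recurring in these keyphrases is kernel principal component analysis (kernel PCA): building a kernel matrix, centering it in the induced feature space, and taking its leading eigenvectors as the low-dimensional representation. The following is a minimal NumPy sketch of that generic procedure, not the paper's specific generalized formulation; the RBF kernel, the `gamma` parameter, and the function names are assumptions chosen for illustration.

```python
# Minimal kernel PCA sketch (illustrative only; kernel choice and parameter
# values are assumptions, not taken from the paper above).
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """RBF (Gaussian) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * sq_dists)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Embed data onto the leading principal components of the kernel-induced feature space."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Double-center the kernel matrix (implicit centering in feature space).
    one_n = np.full((n, n), 1.0 / n)
    K_centered = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose the centered kernel matrix; eigh returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(K_centered)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Embedding of the training points: eigenvector scaled by sqrt(eigenvalue).
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    Y = kernel_pca(X, n_components=2, gamma=0.5)
    print(Y.shape)  # (100, 2)
```

With a linear kernel this reduces to standard PCA on the Gram matrix; swapping in other kernel functions yields the nonlinear embeddings referenced by keyphrases such as "kernel trick" and "nonlinear dimensionality reduction".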