Dimensionality Reduction has Quantifiable Imperfections: Two Geometric Bounds.
Kry Yik Chau Lui, Gavin Weiguang Ding, Ruitong Huang, Robert J. McCann
Published in: CoRR (2018)
Keyphrases
- dimensionality reduction
- upper bound
- feature extraction
- low dimensional
- high dimensional
- lower bound
- principal component analysis
- data representation
- linear dimensionality reduction
- high dimensional data
- upper and lower bounds
- geometric constraints
- manifold learning
- principal components
- linear discriminant analysis
- tight bounds
- structure preserving
- worst case
- high dimensionality
- error bounds
- input space
- pattern recognition
- pattern recognition and machine learning
- geometric structure
- random projections
- contingency tables
- dimensionality reduction methods
- linear projection
- kernel learning
- confidence bounds
- lower and upper bounds
- multi class
- data points
- feature selection
- kernel pca
- dimension reduction
- high order
- nonlinear dimensionality reduction
- graph embedding
- feature set
- regret bounds
- np hard
- data sets